Capturing business requirements for
scorecards, dashboards and reports

JULIAN RAINS White Paper
September 2010
Synopsis.
This paper helps Management Information and Business Intelligence related projects
build a solid foundation for their reporting business requirements gathering. It defines the
scope of the information needed to design and build dashboards, scorecards and other
types of report. The paper considers the requirements of multiple stakeholders and
describes the business analysis activities following the initial requirements draft.
Although geared towards project-level requirements, the same types of information
should be captured at the programme-level (albeit at a broader and less-detailed level)
when assessing the entire reporting landscape to agree scope and priorities with
sponsors. All content and opinions are solely drawn from the author’s personal
experience and no liability will be accepted.
Requirements are important.
High quality requirements are fundamental to the success of IT-related initiatives. They
are the foundation upon which projects build or adjust systems, processes and teams.
Producing them is one of a project’s first tasks, and they are used at every stage of its
lifecycle. They must be captured quickly and blend the right levels of breadth, depth and
accuracy.
Capturing what the report is supposed to do.
With a structured approach, establishing the functional (what the solution must “do”)
requirements for reporting need not be difficult. Some examples are given in this section
using a fictitious company called Alpha Group Ltd.
Report Purpose. Will the report deliver operational, tactical or strategic information?
Operational reports deliver frequent data to alert controllers to operational exceptions.
Tactical reports help to manage particular processes or departments. Strategic
information shows the enterprise-level progress against its strategic goals. Further, it’s
tempting to assess only the reports needed by a Manager or Executive but don’t forget
the data needs of Testers or Analysts tasked with understanding or verifying the
information presented in management’s reports.
Measures (aka Metrics and Key Performance Indicators). What information is
presented in the report and how is each measure calculated (from its source data, see
Data Definitions below)? For example “Actual Average Number of Staff” might be
calculated, for any given year, by counting the actual staff numbers at the end of each
month and dividing by the number of months counted.
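As a sketch, the calculation above could be expressed in a few lines of Python (the function name and the month-end figures are illustrative assumptions, not from the requirements):

```python
def actual_average_staff(month_end_headcounts):
    """Average headcount for a period: sum the actual staff numbers
    recorded at each month end and divide by the number of months."""
    if not month_end_headcounts:
        raise ValueError("at least one month-end figure is required")
    return sum(month_end_headcounts) / len(month_end_headcounts)

# Illustrative figures for a three-month period
print(actual_average_staff([100, 104, 102]))  # 102.0
```

Writing the measure down this precisely during requirements capture forces the question of exactly which month-end figures count, which is often where stakeholders disagree.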
Dimensions. How will each measure be qualified? For example users might want to
view the actual, planned and forecasted “Actual Average Number of Staff” by month for
the current and prior years, by company, by department and by staff type. The
“members” or “attributes” of each dimension must also be identified, e.g. Staff Type may
have three members: Permanent, Temporary and Contractor.
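Each dimension and its members can be recorded in a simple structure during requirements capture. A minimal sketch in Python using the Alpha Group names from this paper (the mapping shape itself is an illustrative choice):

```python
# One entry per dimension, listing its members as captured from users.
dimensions = {
    "Staff Type": ["Permanent", "Temporary", "Contractor"],
    "Department": ["Sales", "Marketing", "Production",
                   "Finance", "HR", "Distribution"],
    "Company": ["Beta", "Gamma", "Delta"],  # Alpha Group subsidiaries
}

print(len(dimensions["Staff Type"]))  # 3
```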
Page 2 of 6 Capturing business requirements for scorecards, dashboards and reports
3. JULIAN RAINS White Paper
September 2010
Hierarchies. For each dimension, what is the granular (lowest) level of data required for
the report and how should it be aggregated? For example:
• Department – each staff member is assigned to one department only i.e. Sales,
Marketing, Production, Finance, HR or Distribution. Aggregated, these make the total
staff numbers for the Group.
• Company – each staff member is assigned to only one company subsidiary.
Aggregated, these make the total staff numbers for the Group.
• Staff Type – each staff member is assigned to one staff category i.e. Permanent,
Temporary or Contractor. Aggregated, these make the total staff numbers for the
Group.
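The roll-up described above, where each staff member belongs to exactly one department and the Group total is the sum across departments, can be sketched as follows (the counts are illustrative):

```python
# Granular level: headcount per department (each person counted once).
department_counts = {
    "Sales": 40, "Marketing": 12, "Production": 55,
    "Finance": 10, "HR": 6, "Distribution": 22,
}

# Aggregation: the Group total is simply the sum, because the
# hierarchy assigns each staff member to one department only.
group_total = sum(department_counts.values())
print(group_total)  # 145
```

If a member could belong to two departments, the simple sum would double-count, which is why the "one department only" rule matters in the requirement.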
Time. What relative time periods are needed for comparison? For example, should the
report show the month’s performance for both the prior-year and current-year?
Consider using a simple matrix like the one shown in Figure 1 to record the
relationship between measures, dimensions and granularity.
Figure 1

                              Unit    Dimension                          Time
Measure (Name)                        Staff Type  Department  Company    Current Year  Prior Year
Actual Average No of Staff    Count
Planned Avg No of Staff       Count
Forecast Avg No of Staff      Count

(Granularity options recorded in the matrix: Weekly, Monthly.)
Frequency (aka Latency). How often are the reports needed? For example information
might be needed each month and be available by the 2nd day of the following month.
Changing Dimensions. The “members” or “attributes” of dimensions are not always
fixed. For example a “Month” dimension will always contain the same 12 members
(January, February etc). However, the Staff Type dimension’s members might reduce
from “Permanent, Temporary or Contractor” to “Permanent or Non-Permanent”. For those
that will change, it’s important to know how users want to report the measures up to the
point that a dimension’s member changed. Users might want to view all historic “Actual
Average Number of Staff” information using the new members i.e. as either “Permanent
or Non-Permanent”; alternatively they may want to preserve the historic data which
shows whether a staff member was “Permanent, Temporary or Contractor”; finally they
might ask for a hybrid solution giving the option to report the historic data using either
the old or new attributes.
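The hybrid option can be sketched by keeping the original member on each historic record and deriving the new member through a mapping, so reports can use either the old or the new attributes (the record structure and mapping name are illustrative assumptions):

```python
# Map each original Staff Type member to its replacement member.
NEW_MEMBER = {
    "Permanent": "Permanent",
    "Temporary": "Non-Permanent",
    "Contractor": "Non-Permanent",
}

# A historic record keeps its original member; the new member is derived.
record = {"staff_id": 1, "staff_type": "Temporary"}  # illustrative
print(record["staff_type"])                # Temporary (old attribute)
print(NEW_MEMBER[record["staff_type"]])    # Non-Permanent (new attribute)
```

Storing the original member and deriving the new one keeps both reporting views available; the reverse (storing only the new member) would lose the historic detail.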
Data Sources. What is the location of the source (fact, reference) data needed to
produce the reports? In a perfect world all data would be available from the organisation’s
data warehouse or from an appropriate data mart, although it is not unusual to have to
bring data together from multiple locations, including manual sources and those outside
the organisation.
Data Definitions. What is the definition of the source (fact, reference) data needed to
produce the reports? Clear definitions ensure all parties understand what’s required,
reduce the likelihood of misunderstanding and contribute significantly to the metadata
dictionary.
Drill Through. Do users need to view information at different levels? For example they
may wish to view the “Actual Average Number of Staff” up and down the Department,
Company and Staff Type hierarchies.
Navigation. What links between reports are necessary? There may be a logical
progression from one report to another, for which a useful link would improve usability.
Presentation. How does the user want to view the information e.g. as a cross-tab
and/or graphically with graphs, pie charts and traffic lights? What should each unique
page of the report look like i.e. its dimensions and measures, titles and headings, units
of measure, decimal places, graph ranges, traffic lights, colours, fonts etc? What are the
rules for any dynamic graphics such as traffic lights? Does the report need to include a
particular brand or style to reinforce its corporate identity or data origin?
Commentary. Do users need to add free text commentary to the reports? Describe
where and how much commentary will be added, by which users and when during the
reporting cycle.
Report Authoring. Do users need to custom-build their own reports, using the
measures and dimensions available in the system?
Output. How and where do users need to access their reports? For example online via
an intranet / internet browser and/or in a document such as PDF or MS PowerPoint. Is
there a corporate information portal into which reports should be embedded?
User Groups. How does access to the information need to be controlled? For example:
• There could be a user group for each of the Beta, Gamma and Delta companies, with
users assigned to one or more. Only users with access to all subsidiaries are able
to also view the consolidated Alpha Group Ltd report.
• There could be user groups relating to the way in which the report is used e.g.
Report Viewer and Report Commentator, where users are assigned to one or both,
depending on their role.
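The consolidated-report rule above might be expressed as a simple check, assuming each user carries a set of assigned subsidiary groups (the function and data shapes are illustrative):

```python
# The three Alpha Group subsidiary user groups.
SUBSIDIARIES = {"Beta", "Gamma", "Delta"}

def can_view_group_report(user_groups):
    """A user may view the consolidated Alpha Group Ltd report
    only when assigned to every subsidiary group."""
    return SUBSIDIARIES.issubset(user_groups)

print(can_view_group_report({"Beta", "Gamma", "Delta"}))  # True
print(can_view_group_report({"Beta", "Gamma"}))           # False
```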
Access. Will the reports be used internally i.e. only by the organisation’s staff or
externally e.g. by suppliers, customers, shareholders or the general public?
Reconciliation. Are controls needed to ensure that the data included in the report has
been checked against an independent data source of assured high-quality?
Validation. Is functionality required to assure the quality of data using business rules?
For example checking that postcodes always follow a defined format or that a number is
always in a particular range. What are these business rules?
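Such business rules might be sketched as small predicate functions. Note the postcode pattern below is a deliberately simplified illustration of a UK-style format, not the full official specification:

```python
import re

# Simplified UK-style postcode pattern (an illustrative assumption,
# not the complete official format).
POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$")

def valid_postcode(value):
    """Rule: postcodes must follow the defined format."""
    return bool(POSTCODE.match(value))

def in_range(value, low, high):
    """Rule: a number must fall within a particular range."""
    return low <= value <= high

print(valid_postcode("SW1A 1AA"))  # True
print(valid_postcode("12345"))     # False
print(in_range(102, 0, 500))       # True
```

Capturing each rule as an explicit predicate makes the requirement testable, and the same rules can later be reused in "business as usual" data processing.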
Capturing how the solution is supposed to do it.
This section aims to establish the non-functional (what the solution must “be”)
requirements by asking questions related to the known constraints e.g. number of users
and to the expected quality-levels e.g. response time.
Volume and Capacity. What is the likely volume of data to be stored? State how many
users will access the reports and when. Estimate the maximum number of concurrent
users.
Growth. Will the data and/or user volumes grow? State all growth assumptions.
Performance. How quickly does the solution’s functionality need to respond to its users’
requests?
Availability and Accessibility. When and where does the report need to be available
and what are the users’ critical days and/or hours for reporting and dealing with any
issues?
Backup / Business Continuity. How quickly, in the event of a failure, does the system
need to be restored?
Compliance. What is required to adhere to internal policies or externally-imposed
regulations e.g. Data Protection, Sarbanes Oxley or Solvency II?
Audit. Is an audit trail of adjustments made to (fact, reference or meta) data required?
Should the system capture information on who’s using it?
Security. What are the requirements related to granting, updating and removing access
to the reports? Note that many organisations will have their own Information Security
policies.
Usability. Consider how easy the solution must be to use, given the users’ expertise
and familiarity with reporting technology, their roles and the time they have
available.
What’s next.
After the Requirements Definition draft, more analysis work is usually needed to add
sufficient detail for those responsible for detailed design, build, test and implementation.
Data Availability. Checks that the data needed actually exist at the required granularity,
completeness and frequency should be made as early as possible. Systematic
identification of each item of source data, expressed in terms of databases, tables and
fields, will achieve this goal.
Functional Availability. Requirements must be prioritised, for example, using a recognised
classification such as MoSCoW. It’s important to check that the current or proposed
solution can actually deliver at least the users’ most highly-valued requirements.
Data Definitions. Further detail is needed to define how the solution must calculate each
Measure. This information will contribute significantly to the metadata dictionary.
Validation Rules. Confirm the business rules needed to assure the quality of any
particular items of data. Note that these rules are often defined as part of initial data
cleansing initiatives but can also be built into the “business as usual” data processing in
order to sustain high quality data.
Process and Organisational Design. Confirm the supporting processes needed for the
reporting to operate normally. Processes impacted by new reporting initiatives can be
wide-ranging but examples include:
• Manage report access i.e. granting, updating and removing user access permissions
• Manage report production and distribution i.e. handling data validation and/or control
exceptions, producing the report and distributing it to users.
• Manage queries i.e. how users can raise support issues and how responses are handled.
Prioritise.
Don’t expect to get all the requirements stable in one go – they’ll further evolve during
the detailed design, development and even testing phases of the project. But getting
early agreement on both the scope of the requirements to capture and their relative
priority will provide a solid foundation.
About JULIAN RAINS.
Julian Rains is a freelance consultant and contractor working in the UK. He is an
experienced Project Manager and Business Analyst and has run Management
Information and Business Intelligence projects at major international companies in the
Financial Services and Energy industries.