"Without… user-centred design approaches, information requirements from users may not be well understood. Thus, there is limited evidence about the dashboard acceptance, frequency of use or whether dashboards satisfactorily met the needs of intended users."
- Murphy DR, Savoy A, Satterly T, et al. Dashboards for visual display of patient safety data: a systematic review. BMJ Health & Care Informatics, 2021.
Working with an NHS region and a provider, SCW carried out user research with leadership teams (Executive and Board level) to understand how people use performance data to inform strategic and tactical decision-making, and whether their needs are met.
In line with NHS and Government Digital Service (GDS) approaches, we first brought together multi-disciplinary teams of user researchers, data analysts, data visualisation specialists, designers and developers.
Discovery phase
Each team started by framing the problems it needed to solve, and who it was solving them for. This involved a review of current dashboards and performance reports, supported by contextual research and stakeholder interviews to understand the different types of users and roles.
This identified user needs and areas each team wanted to explore, such as data freshness, data quality and predictive modelling. Teams then created quantitative surveys and discussion guides for semi-structured interviews designed to explore these areas in more depth.
Towards the end of this phase, the teams analysed and synthesised their research into key findings. These included:
- a lack of trust in the quality of performance data and analysis
- lengthy, complex reports which provide so much data it can be hard to focus on key priorities
- data which isn’t ‘actionable’ because causality and correlation aren’t immediately obvious
Some user needs were common across regions, systems and providers:
- Leaders need to feel they are ‘on the same page’ with a single, trusted source of truth which ‘tells a story’, regardless of its users’ data literacy.
- Leaders need an appropriate level of detail; they are generally happy to omit less relevant data from zoomed-out views to keep presentation simple, clear and consistent.
- Leaders need the freshest available data for it to be ‘actionable’; most respondents agreed that validated historical data is important for reporting purposes, but said it typically shows what has happened rather than why.
As well as common user needs, the research identified specific needs for different types of users. The teams created personas (composite ‘pen portraits’ of these differentiated users and their needs) to support future solution design.
For example, some leaders said they need a ‘breadth’ view across a whole region or organisation to understand cause and effect. Others said they need a ‘depth’ view – focusing in on a specific domain or area (such as workforce) and comparing different areas’ performance.
This goes some way towards explaining why system and regional leadership teams often struggle to articulate their needs for performance data. For BI teams this can be incredibly frustrating: a huge amount of effort goes into creating dashboards and reports that are either not used or never ‘quite right’.
Alpha phase
Both teams felt that a balanced scorecard could provide a single source of truth, allowing leaders to get a holistic view across regions, systems and localities and then drill down into key metric areas to understand the causes of improving or concerning performance. To meet user needs, the scorecard would need to be easy to use, trusted and actionable.
The teams took a co-design approach, using ‘disposable’ rapid prototypes to test a range of approaches with users and discard any that didn’t meet users’ needs.
The co-design process started with conceptual models, leading to simple then detailed wireframes which were tested and iterated through user research.
The teams hadn’t originally expected to develop high-fidelity prototypes, but it quickly became clear that visual presentation really mattered to users, helping to build trust and ‘open up’ performance data.
Consequently, both teams pivoted to developing clickable HTML prototypes using a combination of real and realistic data.
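To give a sense of how lightweight these can be, here is a minimal sketch of a clickable scorecard tile in plain HTML, CSS and JavaScript. It is illustrative only, not the teams' actual prototype: the metric, localities and figures are invented placeholder data, and a real scorecard would cover far more metrics and views.

```html
<!-- Hypothetical sketch of a clickable scorecard prototype.
     All metric names and figures are invented placeholders. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Scorecard prototype sketch</title>
  <style>
    .tile { border: 1px solid #ccc; border-radius: 4px; padding: 1rem;
            width: 16rem; font-family: sans-serif; cursor: pointer; }
    .tile h2 { margin: 0 0 0.25rem; font-size: 1rem; }
    .headline { font-size: 1.75rem; font-weight: bold; }
    .detail { display: none; margin-top: 0.75rem; font-size: 0.85rem; }
    .tile.open .detail { display: block; }
  </style>
</head>
<body>
  <!-- The zoomed-out view shows one headline number per domain;
       clicking drills down to a breakdown by locality, mirroring
       the 'breadth then depth' need identified in the research. -->
  <div class="tile" onclick="this.classList.toggle('open')">
    <h2>Workforce: vacancy rate</h2>
    <div class="headline">8.2%</div>
    <div class="detail">
      <table>
        <tr><th>Locality</th><th>Rate</th></tr>
        <tr><td>North</td><td>6.9%</td></tr>
        <tr><td>Central</td><td>8.8%</td></tr>
        <tr><td>South</td><td>9.1%</td></tr>
      </table>
    </div>
  </div>
</body>
</html>
```

Pages like this take hours rather than weeks to produce, which is what makes it practical to treat them as disposable and discard the versions that don't test well.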
BI teams worked with interaction designers to identify the simplest possible metrics and presentations to meet user needs; these were then developed into consistent design principles, components and patterns.
Outcome
The alpha phase delivered functional requirements and user interface design specifications for the BI teams that will build, own and operate the scorecards. The user-centred approach helped to:
- make a clear, evidence-based case for change
- gain buy-in from users by involving them from the outset
- create a roadmap for future versions, for example including more near-real-time data and features such as predictive modelling
- reduce overall development costs, because designs were tested and agreed before any build work started
For more information on our user-centred design approaches, please contact the team.