Clusters should be able to access appropriate and timely expert planning advice and support, reference current planning guidance, and draw on suitable training/ resources to develop in-house planning capabilities.
National planning guidance
National planning guidance is available to inform cluster annual plans in the following forms:
National training and learning resources
Clusters may wish to seek opportunities to develop in-house planning capability and capacity. National resources include the following:
Local planning advice and support
Advice and support to clusters may be available from your local primary care team, including via cluster development support officers (CDSOs).
Local public health advice and support
Public health advice can be wide-ranging and will typically advocate a planning focus on improving population health outcomes, prioritising prevention/ early detection of avoidable harms, and reducing inequalities in access and health outcomes. Advice and support to clusters may be available from your local public health team:
Developing plans broadly entails understanding local population health needs and patterns of service utilisation (this section), identification of evidence-based options for improvement action (2c) and agreement of priorities for implementation (2d). Traditional health needs assessment looks at populations through the "lens" of aggregate data focussed on specific diseases (in contrast to population health management; see 2b below). It often depends largely upon routinely-collected national data derived partly from non-clinical sources, but will also incorporate local clinical data where this is available.
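By way of a purely illustrative sketch (not a prescribed method), the fragment below shows how routinely-collected, practice-level counts might be aggregated into a simple cluster-level prevalence indicator for a "desk-based" assessment; the dataset, field names and figures are all hypothetical.

```python
# Illustrative only: aggregate hypothetical practice-level counts into a
# cluster-level prevalence indicator. Field names and figures are invented.
import pandas as pd

# Hypothetical extract: one row per practice and condition.
extract = pd.DataFrame({
    "cluster":              ["A", "A", "B", "B"],
    "practice":             ["P1", "P2", "P3", "P4"],
    "condition":            ["diabetes"] * 4,
    "registered_patients":  [8000, 6500, 7200, 5400],
    "recorded_cases":       [560, 430, 610, 390],
})

# Crude recorded prevalence per 1,000 registered patients, by cluster.
by_cluster = extract.groupby(["cluster", "condition"]).sum(numeric_only=True)
by_cluster["prevalence_per_1000"] = (
    1000 * by_cluster["recorded_cases"] / by_cluster["registered_patients"]
)
print(by_cluster[["prevalence_per_1000"]])
```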
Assessment and planning footprints in Wales
Multiple directives require pairing of population health needs assessments (HNAs) and planning outputs across various geographic and/ or structural footprints in Wales. An overview of these is provided on the following page:
Healthcare assessment and planning footprints in Wales: A summary of assessment and planning requirements
Data-centric approaches
Traditional needs assessment involves collating and presenting relevant health intelligence data available from national and health board sources. Where this is the only approach utilised, it may be termed “desk-based needs assessment”. These data can take several forms, but typically address topics/ conditions in isolation; take poor account of co-morbidities; and lack insights from linkage between datasets. Indicators relating to process/ performance or outcomes of interest can be considered as follows:
Health intelligence data sources
Health intelligence in Wales is provided by several organisations, each typically producing a number of distinct products, which may each present information in different ways. Accessing such a diverse array of specialist resources can be intimidating, and interpreting them can be challenging, particularly for those needing to do so infrequently.
Key health intelligence data sources: An index of major national data sources that may inform local needs assessments
See also Population health information by topic, which incorporates topic-specific signposting to data analyses relating to local needs.
Asset-informed approaches
The Welsh Government definition of prevention is broad: “working in partnership to co-produce the best outcomes possible, utilising the strengths and assets people and places have to contribute”. The Covid pandemic spawned “pop-up” health and well-being services within community assets, considered “vital provision within geographically isolated communities where deprivation and traditional barriers to inclusion require easy to access services, close to where people live” (BCT, 2020). Consider any relevant local assets or potential partner organisations that might facilitate co-production. The following local asset indexes may help identify these:
Participatory approaches
Incorporating the views of stakeholders on questions of service redesign is sometimes termed a “corporate” approach. Participation and engagement may reference any rung on the “participation ladder”, typically ranging from informing (“doing for”) to co-producing (“doing with”). Co-production could position stakeholders as integral partners involved in the design, conduct and interpretation of local needs assessments. Methods for gathering views can include surveys, feedback analyses, focus groups, key informant interviews, etc. Broadly, two distinct groups may contribute views:
See also ACD Toolkit, which covers communication and engagement.
Vulnerable and marginalised groups
Some groups have less opportunity to participate or have their voice heard during needs assessments or in conversations around service redesign. While there is overlap between a point-of-care focus on vulnerable and marginalised groups and the much broader social inequalities agenda, it can be helpful to clarify the approach taken as follows:
For vulnerable or marginalised groups, the “gentle” slope of inequality (e.g. depicted as a plot of outcome vs. deprivation quintile) can seem more akin to a cliff edge. Primary care should aim to reconfigure services so they deliver improvements for all, while recognising the additional effort needed for groups with greater unmet needs (the concept of proportionate universalism).
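The invented figures below simply illustrate the kind of plot referred to above (outcome vs. deprivation quintile); a gradient that appears gentle across the whole population can conceal a much steeper fall at the most deprived end.

```python
# Illustrative only: hypothetical outcome by deprivation quintile, showing a
# gradient that steepens sharply at the most deprived end. Figures are invented.
import matplotlib.pyplot as plt

quintiles = [1, 2, 3, 4, 5]            # 1 = least deprived, 5 = most deprived
uptake_percent = [78, 75, 72, 68, 52]  # invented screening-uptake figures

plt.plot(quintiles, uptake_percent, marker="o")
plt.xticks(quintiles)
plt.xlabel("Deprivation quintile (1 = least deprived)")
plt.ylabel("Screening uptake (%)")
plt.title("Hypothetical outcome by deprivation quintile")
plt.savefig("deprivation_gradient.png")
```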
For signposting to professional collaborative actions, refer to the following:
Vulnerable and marginalised groups: Identifying needs of vulnerable and marginalised groups
Approaches to data synthesis
Data gleaned from multiple sources will need collating, analysing and formatting for presentation. For example, you might triangulate findings from the international literature, local profiling and community/ professional voices into an evidence-informed narrative. You might supplement a technical tome with more accessible infographics. There are many possible ways to approach data synthesis; a helpful structure to support well-informed commissioning decisions is as follows:
Approaches to data synthesis: Four key questions to help structure a population needs narrative
Understanding variation
Quantitative data and other information may indicate variation in access to or outcomes from care, which may or may not correspond to the unmet needs of known vulnerable groups. Needs assessments should actively seek and discuss such variation:
Support resources
General guidance, training and advice may support clusters with needs assessment requirements. This could include the following:
Population health management is an alternative “lens” to traditional needs assessment. It involves looking at the same population using patient-level data arranged into needs-based segments or clusters. It explores resource utilisation based on commonality of risk to describe care needs, facilitating optimisation of care provision and resource use in line with prudent healthcare principles.
Population health management as an approach in Wales has been piloted in Cwm Taf Morgannwg (CTM) University Health Board and comprises the closely-linked components of segmentation, risk stratification and case-mix adjusted variation analysis. Importantly, traditional needs assessment and population health management should be seen as complementary as they address distinct planning and healthcare delivery needs; they both share the challenge of converting data into actionable intelligence.
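Purely as an illustrative sketch (not a description of the CTM implementation or of any national tool), the fragment below shows the kind of patient-level operations involved: assigning each record to a needs-based segment and then stratifying it by a simple risk score. All field names, thresholds and scores are hypothetical.

```python
# Illustrative only: toy segmentation and risk stratification of patient-level
# records. Field names, thresholds and scores are invented and do not reflect
# any specific Welsh tool or dataset.
import pandas as pd

patients = pd.DataFrame({
    "patient_id":               [1, 2, 3, 4, 5],
    "age":                      [34, 71, 58, 82, 9],
    "long_term_conditions":     [0, 3, 1, 5, 0],
    "emergency_admissions_12m": [0, 2, 0, 4, 0],
})

def segment(row):
    """Assign a crude needs-based segment."""
    if row["long_term_conditions"] >= 3:
        return "complex multimorbidity"
    if row["long_term_conditions"] >= 1:
        return "one or more long-term conditions"
    return "generally well"

# Segmentation: group the population by commonality of need.
patients["segment"] = patients.apply(segment, axis=1)

# Risk stratification: a toy score used to rank likelihood of future
# unplanned care use within each segment.
patients["risk_score"] = (
    patients["long_term_conditions"] * 2 + patients["emergency_admissions_12m"]
)
patients["risk_band"] = pd.cut(
    patients["risk_score"], bins=[-1, 1, 5, 100], labels=["low", "moderate", "high"]
)
print(patients.sort_values("risk_score", ascending=False))
```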
Segmentation
Segmentation has the following features:
Risk stratification
Risk stratification has the following features:
Case-mix adjusted variation
Case-mix adjusted variation analyses build on the segmentation and risk stratification data and have the following features:
Local implementations
There is no all-Wales programme approach to population health management. A proposal sponsored by Directors of Public Health to develop the approach with a small number of clusters in all health boards was made via the National Primary Care Board; this secured support in principle from Welsh Government but is yet to be funded, partly due to the Covid pandemic. For an update on current activity and to determine whether these tools are available to support needs assessment, clusters should contact their primary care and/or local public health teams.
The benefits of evidence-informed decision making are well established. Not using evidence to inform decision making (where evidence is available) risks doing the wrong thing: sub-optimal or absent intended outcomes, the opportunity costs of wasted money and effort, and perhaps avoidable harms. Relying on common sense, expert opinion or experience can be problematic (the classic example being Dr Spock’s fatal advice to place infants to sleep on their fronts). Evidence is not the only influence on decision making (see section 2d).
Defining the question
The importance of articulating a clear question as a prelude to finding appropriate evidence cannot be overstated. A good question helps ensure the evidence fits the question, rather than the question morphing to fit the evidence. It:
The PICO mnemonic is a helpful device (there are other variations) for teasing out your question’s bespoke anatomy and designing your search strategy:
Finding evidence
Evidence may be sought on what works for keeping people well or for supporting responses to (illness-related) needs. Evidence refers to findings from research and information from other sources that have value in helping to reach decisions. Evidence:
Aspire to be transparent about how evidence is sought, selected and appraised and acknowledge the limitations of your chosen approach.
Sources of evidence
Evidence may be obtained from one or more of the following source categories:
Sources of pre-digested summaries (secondary evidence) for improvement actions include:
See also Population health information by topic, which incorporates topic-specific improvement action options.
Selecting and appraising evidence
Critical appraisal is "the process of carefully and systematically examining research evidence to judge its trustworthiness, its value and relevance in a particular context" (Burls, 2009). As peer review is no guarantee of quality, some level of critical appraisal should be performed for all evidence you have selected as relevant, however it was published. Academics/ professional appraisers typically apply study design-specific checklists to journal articles (e.g. CASP); you can ask the following critical questions to help sift the wheat from the chaff (see also the BMJ’s How to read a paper collection):
Shared learning and expertise
The benefits of sharing knowledge and experience (including both successes and failures) should speak for themselves. Collaborative working is one way to facilitate this; however, we are generally poor at sharing knowledge (e.g. evaluation reports) and implementing prior learning, both within and across healthcare organisations in Wales. The same may be said of sharing with and from communities/ service users. Options to strengthen planning use of local experiential evidence may include:
A simple template to support wider sharing of learning from elsewhere in Wales could be framed around the following questions:
Supporting resources
Development opportunities for cluster teams/ supporting staff may include:
Priorities agreed for implementation should reflect the primacy of prevention (to reduce the avoidable burden of disease in Wales) and take account of various strategic directions to facilitate planning alignment, such that joint action produces measurable improvements in population health.
Key strategic direction
Alignment of cluster activities to key strategy documents will help ensure the priorities of other local agencies and partnerships (e.g. Public Service Boards) and national bodies (e.g. Public Health Wales) influence planned cluster actions—in addition to priorities identified locally. Key strategic direction can be found within the following:
See also Population health information by topic, which incorporates topic-specific strategic context.
Types of decision making
The Stacey matrix is a means to understand different types of decision making; this is described further in Resources to help you develop your cluster (part of Cluster working in Wales):
Political decisions can be influenced by factors including:
Developing a business case
The combination of data describing unmet needs, evidence for remedial action, alignment to existing strategic direction, and proposal for prioritisation may be captured within an outline business case to support formal decision making. The business case will ordinarily be revisited in preparation for implementation (see section 3) and evolve as needed in response to influences on it—which could even invalidate it.
Other prioritisation resources
The following additional resources may support cluster decision making:
Once a course of action has been decided (perhaps informed by an outline business case), the case for change may need refining/ updating to reflect more detailed operational planning considerations as these emerge. Project plans will need clearly-articulated ambitions, ongoing active management, identification of funding and workforce requirements, and sense-checking against population health thinking.
SMART outcomes and logic models
Outcomes describe what we are trying to achieve (for service users or others) by doing the activity; they are not the same as outputs—the things we produce (such as a report) in the process of carrying out the activity. It may help to think of outcomes as aims, and of outputs as project objectives or products:
Project management
Project management can be defined as “the discipline of applying specific processes and principles to initiate, plan, execute and manage the way that new initiatives or changes are implemented within an organization” (Axelos). A project is a “temporary venture that exists to produce a defined outcome” (Axelos) that requires initiation, planning, execution/ delivery, monitoring (see also section 4), and closing down. Various methodologies and templates exist to support each of these stages:
See also ACD Toolkit, which provides project management templates.
Funding sources
Funding available to support cluster-based initiatives may come from a variety of sources, encompassing one or more of the following:
Financial stipulations and support
Clusters will need to be mindful of the need to comply with legal requirements and demonstrate due diligence with regard to procurement processes:
Workforce planning and support
Consult the following guidance, templates, and workforce sustainability tools:
Capital and estates planning
Any queries regarding estates and accommodation should in the first instance be directed to the Estates Team within the appropriate health board or local authority.
Welsh Government funding for health and social care premises is accessible via the Health and Social Care Integration and Rebalancing Capital Fund (IRCF):
Population health perspectives
The PACE checklist reflects the values that public health specialists aspire to bring to conversations about how services could be reconfigured to best effect. It doesn’t have to be applied systematically, but could serve as a mental prompt to help clusters sense-check emerging plans against population health thinking:
PACE: A population health perspective checklist for cluster planning
Monitoring refers to setting targets and milestones to measure progress and achievement, and to check whether the inputs are producing the planned outputs; that is, it determines whether implementation is proving consistent with design intent, implying that the approach can be tweaked during the monitoring period. Evaluation is not just about demonstrating eventual success; it also provides insights into why things don’t work (as learning from mistakes has equal value). Monitoring and evaluation are not about finding out about everything (which is intimidating), but are focused on the things that matter.
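As a minimal, purely illustrative sketch, the fragment below checks a set of invented project milestones against their target dates; real monitoring arrangements should reflect the measures and milestones agreed in the project plan.

```python
# Illustrative only: checking hypothetical project milestones against actual
# delivery dates as part of routine monitoring. All content is invented.
from datetime import date

# (milestone, target date, actual date or None if not yet delivered)
milestones = [
    ("Recruit project lead",      date(2023, 4, 30),  date(2023, 5, 10)),
    ("First clinic running",      date(2023, 7, 31),  date(2023, 7, 20)),
    ("Interim report to cluster", date(2023, 10, 31), None),
]

for name, target, actual in milestones:
    if actual is None:
        status = "pending"
    elif actual <= target:
        status = "on track"
    else:
        status = f"late by {(actual - target).days} days"
    print(f"{name}: {status}")
```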
Project monitoring
In a generic project management context, monitoring involves "oversee[ing] the progress of project work and updat[ing] the project plans to reflect actual performance" (Axelos). In the Welsh healthcare context, the Quality and Safety Framework (WG, 2021) describes a universal duty of “quality management” to ensure that care meets the six domains of quality (care that is safe, effective, patient-centred, timely, efficient and equitable). It describes a system that continuously connects quality assurance, planning and improvement activity. Periodic measuring and monitoring permits:
Project evaluation
Evaluation refers to the structured process of assessing the success of a project or programme in meeting its aims and reflecting on the lessons learned. The key difference between monitoring and evaluation is that evaluation places a value judgement on the information gathered during a project (Research Councils UK, 2011), including the monitoring data. The assessment of a project’s success (its evaluation) can differ depending upon whose value judgement is used. Evaluation permits:
Service evaluations may evolve into research proposals (perhaps aiming to resolve unanswered questions) or lead to review of the existing business case (see section 3), resulting in a decision to scale up a successful, innovative project (see section 5), continue as-is or with improvements, or to stop the project. Evaluation is:
Logic models
Logic models can help sense-check the elements that must come together to successfully plan, deliver and evaluate a project. They can be integrated into project plans from the outset, or inform a bespoke monitoring and evaluation plan by teasing out the following:
A logic model tries to establish sequential links between the above elements, in multi-row table or diagram form. Sometimes they are easier to populate right-to-left, instead of left-to-right (starting with inputs). For background information on logic models, refer to the following resources:
For a simple logic model template, see “Additional support resources” (below).
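As a purely illustrative sketch, the fragment below sets out a single hypothetical logic model for an invented project, using one common formulation of the elements (inputs, activities, outputs, outcomes); the content is invented, and any local template should take precedence.

```python
# Illustrative only: one hypothetical logic model, expressed as a simple data
# structure to show the sequential links between its elements.
logic_model = {
    "inputs":     ["0.5 WTE link worker", "cluster funding", "GP referral time"],
    "activities": ["receive referrals", "assess needs", "connect to community assets"],
    "outputs":    ["number of referrals seen", "number of onward connections made"],
    "outcomes":   ["improved self-reported well-being", "fewer repeat GP attendances"],
}

for element, examples in logic_model.items():
    print(f"{element}: {', '.join(examples)}")
```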
Evaluation plans
There is no magic formula for developing a universal evaluation plan. If evaluation was something of an afterthought (it happens!), a reflective and inclusive post-project review can recover some value by asking “What went well? What went less well? How would we do it differently next time?” A simple prospective evaluation plan might ask the following:
For a simple evaluation plan template, see “Additional support resources” (below); make sure to address each evaluation question on its own row. The Cluster Governance: A Guide to Good Practice offers examples of real-world cluster evaluations to learn from: audiology advanced practitioners; treatment unit; Mind in the Vale of Glamorgan evaluation/ therapies; and care home ANP innovation/ evaluation.
PCMW/ ACD monitoring and evaluation plan
The Primary Care Model for Wales (PCMW) and Accelerated Cluster Development (ACD) Programme implementation monitoring and evaluation plan sets out how progress against these transformation ambitions will be assured, learning shared, and the joining up of local and regional plans supported. It describes the step-wise introduction of several supporting tools and products:
See also ACD Toolkit, which details the PCMW/ACD monitoring and evaluation plan.
Additional support resources
The following resources provide further background on defining monitoring and evaluation requirements:
Clusters can play a key role in developing novel approaches to address local challenges, identifying successful projects for upscaling/ mainstreaming, and adapting or implementing prior learning from across Wales.
Developing innovative ideas
Innovation involves development of “new or improved health policies, systems, products and technologies, and services and delivery methods that improve people's health, with a special focus on the needs of vulnerable populations” (WHO, 2016). The following resources offer insights into getting started and pitching cluster ideas in supportive environments:
Driver diagrams to support cluster innovation
Driver diagrams offer a tool to assist planning of improvement projects. They can:
This structured approach assists the allocation of tasks to individuals or groups and helps estimate the skills and capacity needed to deliver the agreed actions. It also encourages the prioritisation of objectives where there are multiple competing expectations.
Primary and community services are complex and it can be challenging to deliver innovation in these settings. Driver diagrams can be used to gain clinical engagement (by communicating the project in logical sequence and with defined tasks) and to clarify what can reasonably be expected within the objectives of a small cluster team. Health boards may also develop innovation teams with skills and capacity to enhance local teams for identified priorities.
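Purely as an illustrative sketch (the aim, drivers and change ideas are invented), the fragment below shows the typical structure of a driver diagram: an aim linked to primary drivers, each broken down into secondary drivers and candidate change ideas that can be allocated as tasks.

```python
# Illustrative only: the typical aim -> primary driver -> secondary driver ->
# change idea structure of a driver diagram, with invented content.
driver_diagram = {
    "aim": "Increase uptake of annual diabetes reviews across the cluster",
    "primary_drivers": [
        {
            "driver": "Reliable recall of eligible patients",
            "secondary_drivers": ["accurate disease registers", "systematic recall process"],
            "change_ideas": ["monthly register validation", "text-message reminders"],
        },
        {
            "driver": "Accessible review appointments",
            "secondary_drivers": ["capacity in nursing team", "flexible appointment times"],
            "change_ideas": ["evening clinics", "group consultations"],
        },
    ],
}

for primary in driver_diagram["primary_drivers"]:
    print(primary["driver"], "->", "; ".join(primary["change_ideas"]))
```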
Tasks that cannot be accommodated should be added to local risk registers to provide a clear analysis of the unaddressed potential for improvement.
Professional collaboratives should be encouraged to generate improvement proposals as independent groups and across system boundaries. Pan-cluster planning groups (PCPGs) should establish systems to receive and consider these submissions, ensuring that improvement efforts address the agreed local priorities. An evaluation should be integral to all proposals, and learning should be shared. A schedule of current projects should be maintained to monitor progress and ensure that cycles of change are completed.
Example driver diagrams include:
Upscaling from pilot projects
Pilot projects serve to distinguish what works from what does not. Taking things that do work on a small scale to a larger scale (e.g. a health board or all-Wales footprint) can be challenging; advice is contained within the following resources:
Learning from the Pacesetter Programme
Learning captured by the National Primary Care Pacesetter Programme critical appraisal (University of Birmingham, 2018) identified six transformation enablers. These enablers are recognised as key to successful transformation of health systems, both in the UK and internationally. Note that the Pacesetter Programme is superseded by the Strategic Programme for Primary Care Fund from April 2022: