2c – Developing plans: evidencing what works

The benefits of evidence-informed decision making are well established. Not using evidence to inform decision making (where evidence is available) risks doing the wrong thing: sub-optimal or no intended outcomes, opportunity costs of wasted money and effort, and perhaps avoidable harms. Relying on common sense, expert opinion or experience alone can be problematic (the classic example being Dr Spock's fatal advice to place infants to sleep on their fronts). Evidence is not the only influence on decision making (see section 2d).

Defining the question
The importance of articulating a clear question as a prelude to finding appropriate evidence cannot be overstated. A good question helps ensure the evidence fits the question, rather than the question morphing to fit the evidence. It:

  • Guides your search strategy (approach to selecting sources and syntax).
  • Guides whether the evidence you find is applicable to the issue/ problem at hand (i.e. selection and appraisal).
  • Guides how you might approach evidence synthesis (i.e. make coherent sense of what you deem applicable).
  • Strikes a balance across the dimensions of clarity, focus and complexity.

The PICO mnemonic is a helpful device (there are other variations) for teasing out your question’s bespoke anatomy and designing your search strategy:

  • Patient, Population or Problem e.g. Age 65+ in Wales or Type 2 diabetes
  • Intervention e.g. exercise referral or lifestyle advice
  • Comparator(s) e.g. usual care or prescribed Wonderstatin
  • Outcome(s) e.g. 10% weight loss, CHD risk score
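
The PICO elements above can also structure the search syntax itself: synonyms within an element are combined with OR, and the elements are combined with AND. The sketch below is purely illustrative (the function name and example terms are hypothetical, not a recommended search strategy), but it shows the shape of a boolean query of the kind accepted by portals such as Trip or bibliographic databases.

```python
# Illustrative sketch only: assembling a boolean search string from
# PICO components. Terms and function name are hypothetical examples.

def pico_search_string(population, intervention, comparator, outcome):
    """Combine synonym lists for each PICO element into one query.

    Synonyms within an element are OR'd; elements are AND'd together.
    """
    clauses = []
    for synonyms in (population, intervention, comparator, outcome):
        if synonyms:  # an element may be empty (e.g. no named comparator)
            clauses.append("(" + " OR ".join(f'"{s}"' for s in synonyms) + ")")
    return " AND ".join(clauses)

query = pico_search_string(
    population=["type 2 diabetes"],
    intervention=["exercise referral", "exercise prescription"],
    comparator=[],  # usual care is often implicit rather than searched
    outcome=["weight loss"],
)
print(query)
# ("type 2 diabetes") AND ("exercise referral" OR "exercise prescription") AND ("weight loss")
```

In practice each portal has its own syntax (field tags, truncation, controlled vocabulary such as MeSH), so a string like this is a starting point to adapt, not a finished strategy.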

Finding evidence
Evidence may be sought on what works for keeping people well or to support responses to (illness-related) needs. Evidence refers to findings from research and information from other sources that have value in helping to reach decisions. Evidence:

  • Requires bespoke judgement as to whether the type/ source is proportionate to the decision it will inform (no formula, sorry). What constitutes relevant evidence will also depend upon the question being asked (see above).
  • May be lacking, in which case there is an option to innovate (section 5) and evaluate (section 4). This differs from having evidence of no effect; in that case, don’t do it!
  • Should ideally describe what works, for whom, and in what context. This requires elucidation of factors that might influence whether an intervention that is predicted to work under ideal conditions (efficacy) will actually do so in practice (effectiveness)—including within the NHS Wales context.
  • May be characterised as a hierarchy by study design or, with a population health lens, from (generally) best to least reliable: evidence-based guidelines (e.g. NICE); systematic reviews of research (i.e. secondary sources); primary research (single studies of different designs serving different purposes); and “not research”. However, robust service evaluations can be highly valuable.

Aspire to be transparent about how evidence is sought, selected and appraised and acknowledge the limitations of your chosen approach.

Sources of evidence
Evidence may be obtained from one or more of the following source categories:

  • Summaries that integrate evidence from lower in the hierarchy and across multiple sources (see below).
  • Bibliographic databases, such as MEDLINE, which can be accessed via the NHS Wales e-library.
  • Websites of organisations that publish relevant policy or professional care standards e.g. RCGP.
  • Subject-specific journals, which can be searched for relevant articles online or by hand.
  • Search portals e.g. Trip or NHS Knowledge and Library Hub (both permit PICO term entry).
  • Search engines e.g. Google or Google Scholar, which can be used skilfully to improve the specificity of results; application of the CRAP test is advised (Currency; Reliability/ Relevance; Authority/ Audience; and Purpose/ Point of view).

Sources of pre-digested summaries (secondary evidence) for improvement actions include:

  • NICE Guidance: Evidence-based recommendations developed by independent committees, including professionals and lay members, and consulted on by stakeholders.
  • NICE Quality standards: Setting out priority areas for quality improvement; highlighting areas with identified variation in current practice.
  • NICE Clinical knowledge summaries: Providing primary care practitioners with a readily accessible summary of the current evidence base and practical guidance on best practice.
  • NICE Update for primary care newsletter: Subscribe to receive monthly news and guidance for GPs and primary care staff.
  • Reviews within the Cochrane Library: A collection of databases that contain different types of high-quality, independent evidence to inform healthcare decision-making.
  • BMJ Best Practice: An evidence-based generalist point of care tool, uniquely structured around the patient consultation with advice on symptom evaluation, test ordering and treatment approach.
  • PHW Observatory Evidence Service (OES): Systematic reviews, evidence maps and rapid summaries; collections cover Covid-19, health behaviours, and wider determinants of health. See also the glossary and list of sources for robust secondary evidence, as used by OES to create evidence maps.

See also Population health information by topic, which incorporates topic-specific improvement action options.

Selecting and appraising evidence
Critical appraisal is “the process of carefully and systematically examining research evidence to judge its trustworthiness, its value and relevance in a particular context” (Burls 2009). As peer review is not a guarantee of quality, some level of critical appraisal should be performed for all evidence you have selected as relevant, however published. Academics/ professional appraisers typically apply study design-specific checklists to journal articles (e.g. CASP); you can ask the following critical questions to help sift the wheat from the chaff (see also BMJ’s How to read a paper collection):

  • Is it of interest? Scan the title/ abstract.
  • Why was it done? Scan the introduction.
  • How was it done? Scan the methods section.
  • What was found? Scan the results section.
  • What are the implications? Scan the abstract/ discussion and think about contextualisation (Can this be adapted to fit the local/ Welsh context?).
  • Who funded it? Does the funding source/ declaration of interests suggest potential for bias?

Shared learning and expertise
The benefits of sharing knowledge and experience (which includes both successes and failures) should speak for themselves. Collaborative working is one way to facilitate this; however, we are generally poor at sharing knowledge (e.g. evaluation reports) and implementing prior learning, both within and across healthcare organisations in Wales. The same may be said of sharing with, and learning from, communities/ service users. Options to strengthen planning use of local experiential evidence may include:

  • Cluster yearbooks, which can be a source for discovery of what has worked elsewhere in Wales (2019; TBC)
  • Presentations/ workshops at the National Primary Care Conference
  • Ad hoc learning events
  • Networking via participation in various primary care fora (e.g. Cluster Leads Network)
  • Undertaking to evaluate and publish on innovative approaches (see section 5)
  • Actively seeking partnership opportunities (see asset-informed approaches, section 2a)
  • Service user engagement/ co-production (see participatory approaches, section 2a)
  • Implementing learning from the Pacesetter Programme (see section 5).

A simple template to support wider sharing of learning from elsewhere in Wales could be framed around the following questions:

  • What problem was being addressed?
  • What was done to address it?
  • How does this evidence good practice?
  • What key learning can be shared?
  • Who did it or who can be contacted in the event of queries?

Supporting resources
Development opportunities for cluster teams/ supporting staff may include:

  • Becoming familiar with the “evidence hierarchy” and considering the fitness-for-purpose of evidence sources/ synthesis methods.
  • Being able to use Google (or other search tools) more effectively.
  • Learning about what an OpenAthens account from the NHS Wales e-Library offers (authenticated access to electronic resources).
  • Local public health teams (while still part of PHW) have access to a suite of evidence guides (intranet only), for use by PHW staff only. These cover asking the question; finding the evidence; evidence reviewing; critical appraisal; and evidence into action. However, “Under no circumstances should they be replicated or shared with external organisations unless this has first been discussed with the Evidence Service.”