Selecting and implementing overview methods: implications from five exemplar overviews

Abstract

Background

Overviews of systematic reviews are an increasingly popular method of evidence synthesis; however, there is a lack of clear guidance for completing overviews, and authors face a number of methodological challenges. At the UK Cochrane Symposium 2016, the methodological challenges of five overviews were explored. Using data from these five overviews, practical implications to support the methodological decision making of authors writing protocols for future overviews are proposed.

Methods

Methods, and their justification, from the five exemplar overviews were tabulated and compared with areas of debate identified within current literature. Key methodological challenges and implications for development of overview protocols were generated and synthesised into a list, discussed and refined until there was consensus.

Results

Methodological features of three Cochrane overviews, one overview of diagnostic test accuracy and one mixed methods overview have been summarised. Methods of selection of reviews and data extraction were similar. Either the AMSTAR or ROBIS tool was used to assess quality of included reviews. The GRADE approach was most commonly used to assess quality of evidence within the reviews.

Eight key methodological challenges were identified from the exemplar overviews. There was good agreement between our findings and emerging areas of debate within a recent published synthesis. Implications for development of protocols for future overviews were identified.

Conclusions

Overviews are a relatively new methodological innovation, and there are currently substantial variations in the methodological approaches used within different overviews. There are considerable methodological challenges for which optimal solutions are not necessarily yet known. Lessons learnt from five exemplar overviews highlight a number of methodological decisions which may be beneficial to consider during the development of an overview protocol.

Background

Overviews of systematic reviews are an increasingly popular method of evidence synthesis [1, 2], and there are a growing number of resources, including guidelines, recommendations, descriptions and systematic reviews, relating to overview methods [2,3,4,5,6]. While there are some areas of agreement in relation to optimal overview methods, particularly in the early stages of overview completion [5], there remains considerable uncertainty around some key areas of methodology [3, 5, 7, 8] and a need for clearer standards and reporting guidance, supported by research evidence, to enhance methodological quality of overviews [1,2,3, 5, 6].

At the UK Cochrane Symposium in 2016, a workshop focusing on the methods and challenges associated with overviews included presentations relating to five selected ongoing or recently completed overviews [9]. These were selected in order to highlight practical variations in the methods adopted within different overviews and to provide tangible examples of the decisions and challenges associated with the completion of an overview. During and following this workshop, we explored practical issues associated with planning and preparing these exemplar overviews, discussed the impact of methodological decisions and reached consensus on key methodological challenges and potential implications for future overview authors.

Subsequent to our author group reaching consensus, Ballard [3] published the results of a scoping review of methodological guidance for overviews. This synthesis identified areas where there was consensus relating to overview methods and highlighted five key areas where there was debate or uncertainty including “(i) overlapping systematic reviews, (ii) the scope of systematic reviews, (iii) evaluating the quality and reporting of included research, (iv) updating included systematic reviews, and (v) synthesizing and reporting the results of included systematic reviews” [3]. Since the search period of this comprehensive synthesis, a number of further papers relevant to overview methods have been published adding to the discussion and debate around key areas of uncertainty. These include summaries of guidance relating to overview methods [5], descriptions and debate relating to the use of GRADE [10,11,12] and AMSTAR [13], the development of a new tool to assess risk of bias (ROBIS [14]), and protocols for ongoing work in this field [15, 16]. The specific debate around methodological challenges presented by Ballard [3], and these subsequent related publications, afforded us a timely opportunity to compare the results of our independent consensus arising from our five exemplar overviews and to explore the practical implications of the current uncertainties around overview methods for authors planning protocols for new overviews.

Our aim is therefore to use five exemplar overviews:

  1. To provide practical examples of methodological approaches to the planning and preparation of overviews and to discuss the impact of methodological decisions

  2. To explore methodological challenges reported by overview authors and compare these with areas of debate identified within the current literature

  3. To discuss practical implications which may support methodological decision making during the development of protocols for future overviews

Methods

Methodological features of exemplar overviews

Methods of the five exemplar overviews, relating to all key stages in the process of completing an overview, were tabulated. Justification for the selection and use of the methods at each stage was provided by the overview authors, based on the presentations given at the Cochrane workshop and supplemented by discussion (authors representing all five exemplar overviews are included as authors on this paper). Points of agreement and dissonance were highlighted and discussed. The key methodological challenges identified by each overview author were discussed at the workshop and synthesised into a list, with similar challenges merged. A description of each challenge, and the solutions (if any) implemented within individual overviews, was developed through discussion. A list of potential implications for future overview authors, arising from the practical experiences within these exemplar overviews, was developed iteratively. This initially comprised a list of key methodological decisions made by the overview authors, arising from the tabulated descriptions of methods and methodological challenges; it was circulated amongst the overview authors, who added to and refined it until consensus was reached on the final list.

Key challenges within exemplar overviews and complementarity with published literature on overview methods

The key methodological challenges identified within each of the exemplar overviews were systematically compared and contrasted with the recognised areas of debate [3], which were published after consensus had been reached on the methodological challenges identified within our exemplar overviews. Each of the methodological challenges from our exemplar overviews was tabulated, and overview authors considered the level of complementarity [17, 18] between the recognised areas of debate and the methodological challenges identified from our exemplar overviews, applying categories of “agreement” (the methodological challenge identified within the exemplar overviews is in agreement with the findings of Ballard [3]), “dissonance” (the methodological challenge identified within the exemplar overviews differs from, or conflicts with, the findings of Ballard [3]) or “silence” (the methodological challenge identified within the exemplar overviews was not addressed within the findings of Ballard [3]) [17, 18]. One author applied the initial categorisation, which was then appraised by the other authors, with any areas of disagreement highlighted. Disagreements were discussed between all authors until consensus was reached; the findings of other recent publications were considered, and any changes from the original categorisation were noted.

Implications for development of protocols for future overviews

Finally, based on the presentations on the methods of the exemplar overviews from the Cochrane Symposium, each of which provided a chronological description of the overview process, and on the synthesis of methodological features, common features and differences between our exemplar overviews were identified and agreed. Considering these features and differences alongside the perceived methodological challenges and the complementarity with published literature, the overview authors then debated and agreed key implications for the development of protocols for future overviews.

Results

Methodological features of exemplar overviews

Table 1 summarises the key methodological features of the five exemplar overviews. Three of the overviews are Cochrane overviews [19,20,21], one is an overview of reviews of diagnostic test accuracy (DTA) [22], and one a mixed method overview carried out by the EPPI centre [23]. Two of the Cochrane overviews are produced by the same research team (AP is an author on [19] and [21]), while the remaining three are each produced by distinct author groups. Detailed protocols describing the planned methods were agreed and made available a priori for all five of the overviews; two of the overviews are now complete and published [19, 23], while three are in the final stages of completion and write-up [20,21,22]. Justification for the selection and use of the methods as described in the protocol and used at each stage of the overview process are provided in Table 1. Characteristics of the included reviews and any data relating to the results of meta-analyses which were extracted and reported within the systematic reviews are summarised in Table 2.

Table 1 Methodological features of exemplar overviews
Table 2 Characteristics of included reviews which are reported in tables in overviews

Key challenges within exemplar overviews and complementarity with published literature on overview methods

Discussion between overview authors led to agreement that a total of eight key methodological challenges had been encountered across the exemplar overviews. These were the following:

Table 3 Summary of the perceived key methodological challenges associated with each of the exemplar overviews, a description of what the challenge was, and examples of how this challenge was dealt with within individual overviews
  • Overlap between reviews (studies appearing in more than one review)

  • Reviews are out of date

  • Definition of “systematic review”

  • Assessment of methodological quality of reviews

  • Quality of reporting within reviews

  • Applying GRADE

  • Potential for publication bias

  • Summarising key findings in brief accessible format, suitable for informing decision making

The judgement of complementarity between these identified methodological challenges and the areas of debate identified within current literature are summarised in Table 3 and below.

Complementarity with published literature

  • Agreement: seven of our identified key methodological challenges were in agreement with the issues raised by Ballard [3], although some additional specific points (categorised as areas of dissonance) were raised by our exemplar overviews in relation to three of the issues (see Table 3 for details)

  • Silence:

    • Methodological challenge highlighted by our overviews but not raised by Ballard [3]: there was one area of silence, with our exemplar overviews identifying challenges relating to gaining agreement with the ROBIS tool. (Note: the ROBIS tool was published after the search period of Ballard [3])

    • Methodological issue identified by Ballard [3] but not by our exemplar overviews: there were two areas of silence. These included challenges relating to the “scope of systematic reviews”, where there is a “mismatch” between the scope of the systematic review and the remit or focus of the overview, and challenges associated with the assessment of risk of bias of primary trials, where appropriate quality assessment was not used within the systematic review

  • Dissonance: there were no areas of dissonance between the methodological issues raised by our overviews and Ballard [3]. However, there was disagreement noted in relation to conclusions drawn by Ballard [3] relating to the function of overviews and that overviews cannot identify evidence gaps: three of our exemplar overviews [19, 21, 23] clearly concluded that the overview had successfully identified gaps in the evidence.

Challenges common to systematic reviews

Also common to all exemplar overviews were a number of challenges which were considered to be routine amongst most systematic reviews. These included challenges such as the management of large volumes of review information and data extraction, the available time and resources, and reaching consensual decisions amongst overview authors. Strategies implemented to address these challenges included methods of automating data extraction (for example downloading data files for Cochrane reviews) and audio-recording discussions between overview authors.

Implications for development of protocols for future overviews

The identified common methodological features, implications for future overviews and implications for development of protocols for future overviews are summarised in Table 4.

Table 4 Summary of common features and differences between the exemplar overviews, and implications for development of protocols for future overviews

Discussion

Using five recently completed or ongoing overviews of reviews, we have explored the methodological features of overviews and the key methodological challenges reported by the overview authors, and we have systematically compared these findings with synthesised evidence relating to current guidance for overview methods (Ballard [3]). Our five overviews provide examples of a range of different types of overviews, including Cochrane overviews, mixed method overviews and overviews of reviews of diagnostic test accuracy. There are some methodological differences between our exemplar overviews which cannot be attributed to the type or aim of the overviews; these arguably arise from the lack of information and guidance on the optimal methods for overviews, and they reinforce recent calls for methodological research and improved guidance for overviews [2, 3, 5, 24, 25].

Eight key methodological challenges were encountered during our five exemplar overviews; these challenges were clearly aligned with the areas of debate identified within current literature [3], with no areas of dissonance. Areas of silence where a methodological issue was identified by our exemplar overviews and not by the synthesis by Ballard, and vice versa, can arguably be explained by differing levels of reporting within these syntheses. Where there were methods debated within our exemplar overviews which were not discussed in the synthesis by Ballard [3], these were always related to a larger topic or issue for which there was no consensus on an optimal approach, for example dealing with overlap between reviews.

Our exemplar overviews supported the conclusion that overviews can successfully identify gaps in the evidence; this conflicts with Ballard [3] who concludes that overviews cannot fulfil the function of identifying evidence gaps. However, Ballard [3] qualifies this, stating that “overviews that fail to find a systematic review for every relevant comparison will not, by default, detect evidence gaps”. Our experiences suggest that there are situations where evidence gaps will be identified, specifically when there is documented knowledge of current clinical practice or existing interventions. For example, during protocol development McClurg [21] consulted with a stakeholder group comprising expert clinicians and patients, creating a comprehensive list and taxonomy of interventions delivered within clinical practice relevant to the scope of the overview. While Ballard implies that overviews cannot fulfil the function of identifying evidence gaps, we argue that this function can be fulfilled, where there is knowledge of existing interventions or current clinical practice. This supports the viewpoint that the involvement of key stakeholders within the overview process should be an essential component of overview methodology, serving to increase relevance, quality and rigor and reduce research waste, as has been proposed for systematic reviews [26,27,28,29,30].

Our experiences as authors completing overviews, despite an absence of guidance for many methodological features, and the evidence of the increasing number of new overviews published each year [1, 2] highlight that there is a need for recommendations and support for authors wishing to complete overviews during this time of methodological uncertainty. Clearly, the ultimate goal must be to address the uncertainties and establish evidence-based guidance for overview methods, and there is ongoing work which aims to inform and help prioritise future research in this field [15]. However, while there remains a lack of guidance and recommendations for optimal methods, there is currently decision-making required by the authors of future overviews. Until sufficient guidance is available, we recommend that overview authors make and transparently report decisions relating to the inevitable methodological choices. Based on our experiences as overview authors, we have proposed a list of implications to be considered during the development of protocols for future overviews (Table 4). Whilst we anticipate that these implications will be superseded by more robust and evidence-based recommendations as methodological research in this field is completed, we believe that these offer practical advice to those embarking on overviews whilst there remains methodological uncertainty. As this is an active area of research and the methods for overviews continue to develop, overview authors should ensure that they remain up-to-date with any new guidance or information on best practice and should take opportunities to build on the methods of completed overviews. Furthermore, as the methodological features of overviews are broadly derived from, and build upon, methods for systematic reviews of primary research [1, 4, 5], overview authors should utilise guidance and recommendations relating to the conduct of systematic reviews [1]. 
These include the Quality of Reporting of Meta-analyses (QUOROM) statement [31], the Methodological Expectations of Cochrane Intervention Reviews (MECIR) guidance [32] and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, as well as the growing body of guidance specific to overviews [1, 4, 33, 34]. As well as ensuring that the overview is carried out to the highest methodological standards, it is essential to ensure that the overview is relevant and useful, and meaningful involvement of key stakeholders should be central to all overviews (Hunt H, Pollock A, Campbell P, Estcourt L, Brunton G: An introduction to overviews of reviews: planning a relevant research question and objective for an overview. Systematic Reviews, submitted); resources currently being developed to support systematic review authors in achieving meaningful stakeholder involvement [35, 36] ought to be relevant to authors of overviews.

Strengths and limitations

The aim of this paper was to provide illustrated examples of the methods and challenges associated with a range of overviews. The objective was not to provide specific recommendations about how to do an overview, or to propose optimal methods for overviews. The five exemplar overviews were initially selected using contacts for overview authors who attended a meeting on overviews at the 2015 Cochrane Colloquium in Vienna. As the exemplar overviews were initially selected for presentation at the 2016 UK Cochrane meeting, all the selected overviews were led by UK-based authors. Two of the exemplar overviews include some of the same authors [19, 21]. Thus, the overviews provided as examples in this paper are selected from a limited population of authors and cannot therefore be assumed to be representative or comprehensive of the range of different overviews which have been published internationally. None of the five exemplar overviews addressed a research question for which there was high-quality trial and review evidence, and none completed a network meta-analysis; these are clear gaps within these example overviews. However, whilst not purporting to be comprehensive, the examples reflect the shared challenges experienced by the authors of these five selected overviews, and those which were perceived to be the greatest or most difficult to deal with. The comparison of the methodological challenges independently identified from these exemplar overviews with the issues emerging from a comprehensive synthesis of current evidence adds considerable strength to this paper, confirming that the experiences within these exemplar overviews are aligned with current evidence.

Conclusions

Overviews are a relatively new methodological innovation, and there are currently substantial variations in the methodological approaches used within different overviews. Furthermore, there are considerable methodological challenges for which optimal solutions are not necessarily yet known. This paper has explored the variations in methodological approaches used within five selected overviews, and the challenges reported by the overview authors. Lessons learnt from these overviews have highlighted a number of methodological decisions which may need to be considered during the development of an overview protocol and led to the development of a list of implications to support the development of protocols for future overviews (Table 4).

While there remains a lack of empirical evidence to support selection of specific methodological approaches [1,2,3, 5, 6], authors planning protocols for Cochrane overviews are encouraged to consider and transparently report their decisions in response to a number of questions, including the following:

  • Is the overview to be limited to Cochrane reviews or will other systematic reviews also be included?

  • Is the search strategy to be limited to databases of reviews or will wider electronic databases be searched?

  • What action will be taken if there are overlapping reviews (reviews containing the same trials)?

  • What action will be taken if included reviews are out of date? (How will ‘out of date’ be defined?)

  • What tool will be used to assess the quality of the included reviews? (AMSTAR/ROBIS)

  • How will the quality of evidence within reviews be assessed? Will this be done using GRADE (and if so, how will GRADE be applied to the available evidence)?

  • Is any new statistical analysis to be carried out, using the data extracted from the reviews?

  • How will the evidence be brought together into an accessible summary which is useful to the potential audience/readership of the overview? What information should be included within this (in order to address the overview objective)?

As this is a new and developing methodological field, it is important that overview authors keep up-to-date with new developments and methods research, in order that their decisions relating to the methodological approach for a new overview are informed by current evidence.

References

  1. 1.

    Hartling L, Chisholm A, Thomson D, Dryden DM. A descriptive analysis of overviews of reviews published between 2000 and 2011. PLoS ONE. 2012;7, e49667.

  2. 2.

    Pieper D, Buechter R, Jerinic P, Eikermann M. Overviews of reviews often have limited rigor: a systematic review. J Clin Epidemiol. 2012;65:1267–73.

  3. 3.

    Ballard M, Montgomery P. Risk of bias in overviews of reviews: a scoping review of methodological guidance and four-item checklist. Research Synthesis Methods. 2017;8:92–108.

  4. 4.

    Aromataris E, Fernandez R, Godfrey CM, Holly C, Khalil H, Tungpunkom P. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. Int J Evid Based Healthc. 2015;13:132–40.

  5. 5.

    Pollock M, Fernandes R, Becker L, Featherstone R, Hartling L. What guidance is available for researchers conducting overviews of reviews of healthcare interventions? A scoping review and qualitative metasummary. Systematic Reviews. 2016;5:190.

  6. 6.

    Thomson D, Foisy M, Oleszczuk M, Wingert A, Chisholm A, Hartling L. Overview of reviews in child health: evidence synthesis and the knowledge base for a specific population. Evidence-Based Child Health: A Cochrane Review Journal. 2013;8:3–10.

  7. 7.

    Caird J, Sutcliffe K, Kwan I, Dickson K, Thomas J. Mediating policy-relevant evidence at speed: Are systematic reviews of systematic reviews a useful approach? Evidence and Policy. 2015;11:81-97.

  8. 8.

    Smith V, Devane D, Begley CM, Clarke M. Methodology in conducting a systematic review of systematic reviews of healthcare interventions. BMC Med Res Methodol. 2011;11:15.

  9. 9.

    Pollock A, Hunt H, Campbell P, Estcourt L, Brunton G. Cochrane overviews of reviews: exploring the methods and challenges. Birmingham: UK and Ireland Cochrane Symposium; 2016.

  10. 10.

    Pollock A, Farmer SE, Brady MC, Langhorne P, Mead GE, Mehrholz J, Van Wijck F, Wiffen PJ. An algorithm was developed to assign GRADE levels of evidence to comparisons within systematic reviews. J Clin Epidemiol. 2016;70:106–10.

  11. 11.

    Murad MH, Mustafa R, Morgan R, Sultan S, Falck-Ytter Y, Dahm P. Rating the quality of evidence is by necessity a matter of judgment. J Clin Epidemiol. 2016;74:237–8.

  12. 12.

    Gionfriddo MR. Subjectivity is a strength: a comment on “an algorithm was developed to assign GRADE levels of evidence to comparisons within systematic reviews”. J Clin Epidemiol. 2016;74:237.

  13. 13.

    Pollock M, Fernandes RM, Hartling L. Evaluation of AMSTAR to assess the methodological quality of systematic reviews in overviews of reviews of healthcare interventions. BMC Med Res Methodol. 2017;17:48.

  14. 14.

    Whiting P, Savovic J, Higgins JP, Caldwell DM, Reeves BC, Shea B, Davies P, Kleijnen J, Churchill R, group R. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34.

  15. 15.

    Lunny C, Brennan SE, McDonald S, McKenzie JE. Evidence map of studies evaluating methods for conducting, interpreting and reporting overviews of systematic reviews of interventions: rationale and design. Systematic Reviews. 2016;5:4.

  16. 16.

    Pollock M, Hartling L: Preferred reporting items for overviews of reviews (PRIOR). EQUATOR Network; 2016. http://www.equator-network.org/library/reporting-guidelines-under-development/#72.

  17. 17.

    Erzberger C, Prein G. Triangulation: validity and empirically-based hypothesis construction. Qual Quant. 1997;31:141–54.

  18. 18.

    O’Cathain A, Murphy E, Nicholl J. Three techniques for integrating data in mixed methods studies. BMJ. 2010;341.

  19. 19.

    Pollock A, Farmer SE, Brady MC, Langhorne P, Mead GE, Mehrholz J, Van Wijck F. Interventions for improving upper limb function after stroke. Cochrane Database Syst Rev. 2014;11:CD010820.

  20. 20.

    Estcourt LJ, Fortin PM, Hopewell S, Trivella M: Red blood cell transfusion to treat or prevent complications in sickle cell disease: an overview of Cochrane reviews. Cochrane Database Syst Rev 2016, 2016.

  21. 21.

    McClurg D, Pollock A, Campbell P, Hazelton C, Elders A, Hagen S, Hill DC, McClurg D: Conservative interventions for urinary incontinence in women: an Overview of Cochrane systematic reviews. Cochrane Database of Systematic Reviews 2016.

  22. 22.

    Hunt H, Kuzma E, C H: A review of existing systematic reviews summarising the accuracy of brief cognitive assessments for identifying dementia, particularly for use in primary care. Protocol..In PROSPERO PROSPERO online; 2016.

  23. 23.

    Brunton G, Dickson K, Khatwa M, Caird J, Oliver S, Hinds K, Thomas J. Developing evidence-informed, employer-led workplace health. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London; 2016.

  24. 24.

    Li L, Tian J, Tian H, Sun R, Liu Y, Yang K. Quality and transparency of overviews of systematic reviews. Journal of Evidence-Based Medicine. 2012;5:166–73.

  25. 25.

    Pieper D, Antoine S-L, Morfeld J-C, Mathes T, Eikermann M. Methodological approaches in conducting overviews: current state in HTA agencies. Research Synthesis Methods. 2014;5:187–99.

  26. 26.

    Boote J, Wong R, Booth A. ‘Talking the talk or walking the walk?’ A bibliometric review of the literature on public involvement in health research published between 1995 and 2009. Health Expect. 2015;18:44–57.

  27. 27.

    INVOLVE. Public involvement in systematic reviews: Supplement to the briefing notes for researchers. Eastleigh: INVOLVE; 2012.

  28. 28.

    Kreis J, Puhan MA, Schunemann HJ, Dickersin K. Consumer involvement in systematic reviews of comparative effectiveness research. Health Expect. 2013;16:323–37.

  29. 29.

    Morley R, Norman G, Golder S, Griffith P. A systematic scoping review of the evidence for consumer involvement in organisations undertaking systematic reviews: focus on Cochrane. Research Involvement and Engagement. 2016;2:36.

  30. 30.

    Serrano-Aguilar P, Trujillo-Martin MM, Ramos-Goni JM, Mahtani-Chugani V, Perestelo-Perez L, Posada-dela Paz M. Patient involvement in health research: a contribution to a systematic review on the effectiveness of treatments for degenerative ataxias. Soc Sci Med. 2009;69:920–5.

  31. 31.

    Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of Reporting of Meta-analyses. Lancet. 1999;354:1896–900.

  32. 32.

    MECIR. The Methodological Expectations of Cochrane Intervention Reviews (MECIR).Standards for Cochrane new reviews of interventions and their updates. 2017.

  33. 33.

    Becker L, Oxman A: Chapter 22: Overviews of reviews. In Cochrane Handbook for Systematic Reviews of Interventions Version 510 (JPT H, S G eds.): The Cochrane Collaboration; 2011. Available from www.handbook.cochrane.org.

  34. 34.

    Conn VS, Sells TGC. WJNR Welcomes Umbrella Reviews. West J Nurs Res. 2014;36:147–51.

  35. 35.

    Pollock A, Campbell P, Struthers C, Synnot A, Nunn J, Hill S, Goodare H, Watts C, Morley R. Stakeholder involvement in systematic reviews: a protocol for a systematic review of methods, outcomes and effects. Research Involvement and Engagement. 2017;3:9.

  36. 36.

    ACTIVE: Authors and consumers together impacting on evidence [http://training.cochrane.org/ACTIVE]

  37. 37.

    Pollock A, Brady MC, Farmer SE, Langhorne P, Mead GE, Mehrholz J, Wiffen PJ, Van Wijck F. The purpose of rating quality of evidence differs in an overview, as compared to guidelines or recommendations. J Clin Epidemiol. 2016;74:238–40.

  38. 38.

    Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, Moher D, Rennie D, De Vet HC, Lijmer JG, Standards for Reporting of Diagnostic A. The STARD statement for reporting studies of diagnostic accuracy: explanation and elaboration. Ann Intern Med. 2003;138:W1–12.

  39. 39.

    Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, Leeflang MM, Sterne JA, Bossuyt PM, Group Q. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155:529–36.

  40. 40.

    PSTF. U.S. Preventive Services Task Force. Procedure Manual. In: AHRQ publication no 08-05118-EF. Rockville: U.S. Preventive Services Task Force; 2008.

  41. NICE. National Institute for Health and Clinical Excellence. The guidelines manual. London: National Institute for Health and Clinical Excellence; 2006.

  42. Shea BJ, Grimshaw JM, Wells GA, Boers M, Andersson N, Hamel C, Porter AC, Tugwell P, Moher D, Bouter LM. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7:10.

  43. NOS. Newcastle-Ottawa Scale (NOS) for assessing the quality of nonrandomised studies in meta-analyses. Ottawa: Ottawa Hospital Research Institute; 2013.

  44. Estcourt LJ, Fortin PM, Hopewell S, Trivella M, Hambleton IR, Cho G. Regular long-term red blood cell transfusions for managing chronic chest complications in sickle cell disease. Cochrane Database Syst Rev. 2016;20:CD008360.

  45. Estcourt LJ, Fortin PM, Trivella M, Hopewell S. Preoperative blood transfusions for sickle cell disease. Cochrane Database Syst Rev. 2016;4:CD003149.

  46. Estcourt LJ, Fortin PM, Hopewell S, Trivella M, Wang WC. Blood transfusion for preventing primary and secondary stroke in people with sickle cell disease. Cochrane Database Syst Rev. 2017;1:CD003146.

  47. Roy NB, Fortin PM, Bull KR, Doree C, Trivella M, Hopewell S, Estcourt LJ. Interventions for chronic kidney disease in people with sickle cell disease. Cochrane Database of Systematic Reviews. John Wiley & Sons, Ltd; 2016.

  48. Estcourt LJ, Fortin PM, Hopewell S, Trivella M, Doree C, Abboud MR. Interventions for preventing silent cerebral infarcts in people with sickle cell disease. Cochrane Database of Systematic Reviews. John Wiley & Sons, Ltd; 2016.

  49. Okusanya BO, Oladapo OT. Prophylactic versus selective blood transfusion for sickle cell disease in pregnancy. Cochrane Database Syst Rev. 2013;3:CD010378.

  50. Dastgiri S, Dolatkhah R. Blood transfusions for treating acute chest syndrome in people with sickle cell disease. Cochrane Database of Systematic Reviews. John Wiley & Sons, Ltd; 2016.

  51. Martí-Carvajal AJ, Knight-Madden JM, Martinez-Zapata MJ. Interventions for treating leg ulcers in people with sickle cell disease. Cochrane Database of Systematic Reviews. John Wiley & Sons, Ltd; 2014.

  52. Gough D, Oliver S, Thomas J. Learning from research: systematic reviews for informing policy decisions (a quick guide). EPPI-Centre, Social Science Research Unit, Institute of Education, University of London; 2013. Available from: http://www.alliance4usefulevidence.org/assets/Alliance-FUE-reviews-booklet-3.pdf.

  53. Cochrane Consumers Network (CCN). What is a systematic review? 2017. http://consumers.cochrane.org/what-systematic-review.

  54. Pieper D, Antoine SL, Mathes T, Neugebauer EA, Eikermann M. Systematic review finds overlapping reviews were not mentioned in every other overview. J Clin Epidemiol. 2014;67:368–75.

  55. Ioannidis JP. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. Milbank Q. 2016;94:485–514.

  56. Pieper D, Antoine SL, Neugebauer EA, Eikermann M. Up-to-dateness of reviews is often neglected in overviews: a systematic review. J Clin Epidemiol. 2014;67:1302–8.

  57. Epistemonikos: database of the best evidence-based health care. 2017. http://www.epistemonikos.org/.

  58. PDQ-Evidence for informed health policymaking. 2017. http://www.pdq-evidence.org.

  59. Burda BU, Holmer HK, Norris SL. Limitations of A Measurement Tool to Assess Systematic Reviews (AMSTAR) and suggestions for improvement. Syst Rev. 2016;5:58.

  60. Faggion CM. Critical appraisal of AMSTAR: challenges, limitations, and potential solutions from the perspective of an assessor. BMC Med Res Methodol. 2015;15:63.

  61. Wegewitz U, Weikert B, Fishta A, Jacobs A, Pieper D. Resuming the discussion of AMSTAR: what can (should) be made better? BMC Med Res Methodol. 2016;16:111.

  62. EPOC. Effective Practice and Organisation of Care (EPOC): reporting the effects of an intervention in EPOC reviews. Section 24: how to report the effects of an intervention. In: EPOC resources for review authors. Oslo: Norwegian Knowledge Centre for the Health Services; 2016.

  63. Glenton C, Santesso N, Rosenbaum S, Nilsen ES, Rader T, Ciapponi A, Dilkes H. Presenting the results of Cochrane Systematic Reviews to a consumer audience: a qualitative study. Med Decis Making. 2010;30:566–77.


Acknowledgements

The overview conducted by Pollock [19] was supported by a project grant from the Chief Scientist Office of the Scottish Government. The overview conducted by McClurg [21] was supported by a project grant by the Physiotherapy Research Foundation. The overview by Hunt [22] was supported as part of doctoral programme funding by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula (PenCLAHRC). The overview conducted by Estcourt [20] was supported by an NIHR Cochrane Programme Grant for the Safe and Appropriate Use of Blood Components. The overview conducted by Brunton [23] was commissioned by the Department of Health as part of an ongoing programme of work on health policy research synthesis.

Alex Pollock is employed by the Nursing, Midwifery and Allied Health Professions (NMAHP) Research Unit, which is supported by the Chief Scientist Office of the Scottish Government. Pauline Campbell is supported by the Chief Nurses Office of the Scottish Government.

The views and opinions expressed herein are those of the authors and do not necessarily reflect those of the funding bodies.

Author information

AP initiated the submission for a workshop at the UK Cochrane Symposium 2016. All authors contributed to the submitted abstract, prepared and delivered original presentations, and participated in group discussions at the UK Cochrane Symposium 2016. AP, GB, HH and LE provided additional methodological details relating to exemplar overviews. AP drafted the initial list of methodological decisions and challenges arising from discussions at the UK Cochrane Symposium and applied the initial categorisation of complementarity with published literature, and all authors commented and contributed to consensus discussions. AP drafted the original manuscript and revised it following detailed comments from all authors. All authors read and approved the final manuscript.

Correspondence to Alex Pollock.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Summary of findings as presented by Pollock [8]. (DOCX 777 kb)

Additional file 2:

Summary of findings as presented by Brunton [12]. (DOCX 20 kb)

Additional file 3:

Summary of results tables proposed by McClurg [10] and Estcourt [9]. (DOCX 22 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Pollock, A., Campbell, P., Brunton, G. et al. Selecting and implementing overview methods: implications from five exemplar overviews. Syst Rev 6, 145 (2017) doi:10.1186/s13643-017-0534-3


Keywords

  • Challenges
  • Methods
  • Overviews
  • Quality assessment
  • Synthesis
