Current issues of ACP Journal Club are published in Annals of Internal Medicine


Transferring evidence from research into practice: 2. Getting the evidence straight

ACP J Club. 1997 Jan-Feb;126:A14. doi:10.7326/ACPJC-1997-126-1-A14

In Part 1 of this series (1), we considered a simple model for evidence-based clinical decision making that included 3 components: clinical expertise, patient preferences, and evidence from research. The series focuses on applying evidence from clinically relevant research. Three steps are crucial to the timely introduction of evidence into clinical decisions: getting the evidence straight, developing clinical policy from evidence, and applying the policy at the right place and time. In this essay, we consider getting the evidence straight so that the current best evidence becomes readily available for use in making clinical decisions.

We begin by acknowledging that many of nature's secrets are not easily revealed and that most of our current medical knowledge has come by the 2-steps-forward-1-step-back shuffle. Our initial research ideas and “insights” are usually found wanting when put to the test. Recognizing this, industrialized countries have invested heavily during the past half century in the discovery and testing processes required to sort the few good ideas from the many bad ones. This process of separating good from bad, noise from signal, is expensive and messy. Usually, thousands of studies must be done to find and verify one clinically useful truth. These thousands of studies are published in biomedical and clinical journals and serve well as communication among scientists, but clinical practitioners are not so well served (2). The number of definitive studies is very small, and these studies are mixed in with preliminary studies (which may eventually lead to clinically important advances), studies that represent false leads, and those that are the end of the line for an idea that was once promising. These studies are often cloaked in highly stylized language that proclaims the importance of each study and disguises the distinctions among them that would allow clinicians to select the most appropriate ones. For example, in the most prestigious internal medicine clinical journals, the number of original studies and systematic review articles that provide reasonably strong signals and are ready for application in clinical practice is less than 1 in every 2 issues (or about 10% of original research and review articles) (3). Small wonder that clinicians feel inadequate to the tasks of finding and contending with research evidence that is published in this way (4).

But this problem has been recognized for some time, and a number of solutions are now available. Such solutions include clinical users' guides to the medical literature, online search strategies for finding clinically relevant reports in MEDLINE, and high-fidelity clinical evidence processing and synthesis services, all of which now make it possible, easy, and often fun for clinicians to get the evidence straight in “real time.” Space constraints do not allow us to provide every detail on these innovations, but we will list many of them and our perceptions of what their roles should be.

Users' guides to the medical literature

Almost all advances in medical knowledge are first presented in the medical literature in a form that permits detailed examination by practitioners. Moreover, although many information services exist that claim to keep practitioners up to date with these studies, only a few of them incorporate explicit criteria for selecting scientifically strong and clinically sound studies and reviews. For these and other reasons, practitioners must be able to recognize for themselves the research that is ready for clinical application. During the past 2 decades, various guides to critical appraisal of research evidence and users' guides to the medical literature have been published. We have cited the most recent journal series that appeared in JAMA (5-16), and a further revision is available on pocket cards that accompany a new text on evidence-based medicine (17). Some of the bare essentials related to the validity of published reports are summarized in the Table below.

Table. Bare-Bones Users' Guides for Appraisal of the Validity of Medical Studies

Therapy: concealed random allocation of patients to comparison groups; outcome measure of known or probable clinical importance; few lost to follow-up compared with the number of bad outcomes.

Diagnosis: patients to whom you would want to apply the test in practice; objective or reproducible diagnostic standard, applied to all participants; blinded assessment of test and diagnostic standard.

Prognosis: inception cohort (patients enrolled early in the course of the disorder and initially free of the outcome of interest); objective or reproducible assessment of clinically important outcomes; few lost to follow-up compared with the number of bad outcomes.

Etiology: clearly identified comparison group for those at risk for, or having, the outcome of interest; blinding of observers of outcome to exposure; blinding of observers of exposure to outcome.

Reviews: explicit criteria for selecting articles and rating validity; comprehensive search for all relevant articles.
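One way to read the Table is as a per-question checklist. The sketch below encodes it that way; it is purely illustrative (the dictionary, function, and wording are our own shorthand, not a validated appraisal instrument):

```python
# Illustrative sketch only: the Table's bare-bones validity guides
# expressed as a checklist keyed by study purpose. Not a scoring
# instrument; names and structure are invented for this example.

VALIDITY_GUIDES = {
    "therapy": [
        "concealed random allocation to comparison groups",
        "outcome measure of known or probable clinical importance",
        "few lost to follow-up compared with number of bad outcomes",
    ],
    "diagnosis": [
        "patients to whom you would apply the test in practice",
        "objective or reproducible diagnostic standard applied to all",
        "blinded assessment of test and diagnostic standard",
    ],
    "prognosis": [
        "inception cohort, enrolled early and free of the outcome",
        "objective or reproducible assessment of important outcomes",
        "few lost to follow-up compared with number of bad outcomes",
    ],
    "etiology": [
        "clearly identified comparison group",
        "outcome observers blinded to exposure, and vice versa",
    ],
    "review": [
        "explicit criteria for selecting articles and rating validity",
        "comprehensive search for all relevant articles",
    ],
}

def unmet_guides(purpose: str, satisfied: set) -> list:
    """Return the guides for this study purpose not yet checked off."""
    return [g for g in VALIDITY_GUIDES[purpose] if g not in satisfied]

# A review with explicit selection criteria but no comprehensive
# search still has one guide outstanding.
remaining = unmet_guides(
    "review",
    {"explicit criteria for selecting articles and rating validity"},
)
print(remaining)  # -> ['comprehensive search for all relevant articles']
```

A study that leaves any guide unmet is not necessarily worthless, but, as the text argues, it is not yet ready for direct clinical application.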

Clinical evidence sources

Until this decade, it was necessary for physicians in most clinical disciplines to fend for themselves when dealing with the medical literature. We have now witnessed the emergence of a new breed of journal in which explicit scientific principles of validity and applicability of evidence (beginning with those in the Table) are used to select articles ready for clinical application from many journals. Structured abstracts are prepared for the studies that meet these criteria and have the most clinical relevance. These “more informative abstracts” (18) provide key details so that clinical readers can determine for themselves whether the findings ought to be applied in their own clinical practice. ACP Journal Club was the first of these publications, with content selected for internal medicine and its subspecialties. Evidence-Based Medicine is built on the same approach but has a broader clinical coverage that includes internal medicine, family medicine, pediatrics, psychiatry, surgery, obstetrics, and gynecology. Similar journals are being prepared for other disciplines, and some existing subspecialty journals have added new departments that contain critical appraisals (e.g., The Journal of Pediatrics and The Clinical Journal of Sport Medicine). Readers must discern the true evidence-based services among a burgeoning host of pretenders. If each issue of a publication does not provide explicit rules for critical appraisal and article selection from a list of specified journals, clinicians who value their reading time would do better to invest their time (and money) elsewhere.

It is important to make a distinction between “current awareness” and “look-up” resources. Current awareness is important, and these new critical appraisal journals will alert us to innovations that have been properly tested for clinical application. But it is hit or miss whether the news of these advances will arrive when we are managing a patient with an active problem who might benefit from the new knowledge. Most of the clinical questions that arise several times a day in our practices (19, 20) can only be answered in compendia of current best evidence—and only an electronic compendium is likely to be up-to-date and easily searched.

The Cochrane Library (21) has become the premier compendium of systematic summaries of evidence about health care interventions. It contains reviews prepared by Cochrane review groups (22), other systematic reviews that have been published in the medical literature (23), and a huge database of clinical trials. A companion Web site is another evidence-based compendium and includes the cumulative contents of ACP Journal Club and Evidence-Based Medicine, with periodic updating of the oldest content.

For many clinical questions, these new compendia do a much better job of meeting information needs than MEDLINE and EMBASE, the bibliographic databases that have provided electronic access to the medical literature in the past. These databases contain virtually all of the signals included in the newer compendia and much more—too much more, in fact. Vast bibliographic databases have intractable indexing problems that undermine the success of searches for relevant studies (low sensitivity) while retrieving many studies that are not relevant (low specificity). MEDLINE and EMBASE are invaluable resources, however, and have a broader scope and depth than any alternative to date. Ways have been developed to trick them into providing a higher yield of clinically relevant articles. For example, the single best methods terms to include in searches of MEDLINE to find high-quality studies for clinical practice are “CLINICAL TRIAL (PT)” for treatments; “SENSITIVITY (TW)” for diagnostic tests; “RISK (TW)” for etiology; “EXPLODE COHORT STUDIES” for prognosis; and “META-ANALYSIS (PT) OR [REVIEW (PT) AND MEDLINE (TW)]” for systematic review articles. More complex strategies have been described in previous issues of ACP Journal Club (24-28) and Evidence-Based Medicine (inside the back cover of selected issues). The Internet appears poised to become the preferred route for accessing information, and medical information, including MEDLINE, is no exception. A recent editorial described some of the relevant Web sites (29), and the American College of Physicians' Web site has activated all of the links from that editorial. Users can also access them via the Centre for Evidence-Based Medicine at Oxford.
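The idea of pairing a content term with one of these methods filters can be sketched as follows. The helper and dictionary below are hypothetical (no real MEDLINE client is assumed), and actual query syntax differs between search interfaces such as Ovid or PubMed:

```python
# Hypothetical sketch: combining a clinical topic with the single best
# methods filter named in the text for each type of question. This is
# an invented helper, not any real MEDLINE client; real interfaces
# (e.g., Ovid, PubMed) each have their own field-tag syntax.

METHODS_FILTERS = {
    "therapy": "CLINICAL TRIAL (PT)",
    "diagnosis": "SENSITIVITY (TW)",
    "etiology": "RISK (TW)",
    "prognosis": "EXPLODE COHORT STUDIES",
    "review": "META-ANALYSIS (PT) OR [REVIEW (PT) AND MEDLINE (TW)]",
}

def build_query(topic: str, question_type: str) -> str:
    """Join a content term with the methods filter for the question type."""
    return f"({topic}) AND ({METHODS_FILTERS[question_type]})"

print(build_query("atrial fibrillation", "therapy"))
# -> (atrial fibrillation) AND (CLINICAL TRIAL (PT))
```

The point of the filter is to raise the proportion of retrieved citations that are methodologically ready for clinical use; the content term alone would retrieve far more noise than signal.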

The complementary resources just described, combined with modern information technology, provide practitioners with unprecedented access to current best evidence. Physicians aspiring to provide evidence-based health care can thrive on knowing the basic principles of critical appraisal, subscribing to an evidence-based current awareness service appropriate for their discipline, accessing the new compendia, and using methods terms to search MEDLINE when the more tailored services fall short of meeting evidence needs.

In the next article in this series, we will move from evidence to clinical policy. In the meantime, happy hunting (for evidence, that is).

R. Brian Haynes
David L. Sackett
J. A. Muir Gray
Deborah J. Cook
Gordon H. Guyatt


1. Haynes RB, Sackett DL, Gray JM, Guyatt GH, Cook DJ. Transferring evidence from research into practice: 1. The role of clinical care research evidence in clinical decisions. ACP J Club. 1996 Nov-Dec;125:A14-6.

2. Haynes RB. Ann Intern Med 1990;113:724-8.

3. Haynes RB. Where's the meat in clinical journals? ACP J Club. 1993 Nov-Dec;119:A22-3.

4. Williamson JW, German PS, Weiss R, Skinner EA, Bowes F 3d. Ann Intern Med. 1989;110:151-60.

5. Guyatt GH, Rennie D. JAMA. 1993;270:2096-7.

6. Oxman AD, Sackett DL, Guyatt GH. JAMA. 1993;270:2093-5.

7. Guyatt GH, Sackett DL, Cook DJ. JAMA. 1993;270:2598-601.

8. Guyatt GH, Sackett DL, Cook DJ. JAMA. 1994;271:59-63.

9. Jaeschke R, Guyatt GH, Sackett DL. JAMA. 1994;271:389-91.

10. Jaeschke R, Guyatt GH, Sackett DL. JAMA. 1994;271:703-7.

11. Levine M, Walter S, Lee H, et al. JAMA. 1994;271:1615-9.

12. Oxman AD, Cook DJ, Guyatt GH. JAMA. 1994;272:1367-71.

13. Richardson WS, Detsky AS. JAMA. 1995;273:1292-5.

14. Hayward RS, Wilson MC, Tunis SR, Bass EB, Guyatt GH. JAMA. 1995;274:570-4.

15. Wilson MC, Hayward RS, Tunis SR, Bass EB, Guyatt GH. JAMA. 1995;274:570-4.

16. Guyatt GH, Sackett DL, Sinclair JC, et al. JAMA. 1995;274:1800-4.

17. Sackett DL, Richardson WS, Rosenberg W, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. London: Churchill Livingstone; 1997.

18. Haynes RB, Mulrow CD, Huth EJ, Altman DG, Gardner MJ. Ann Intern Med. 1990;113:69-76.

19. Covell DG, Uman GC, Manning PR. Ann Intern Med. 1985;103:596-9.

20. Osheroff JA, Forsythe DE, Buchanan BG, et al. Ann Intern Med. 1991;114:576-81.

21. Cochrane Library. London: BMJ Publishing Group. Quarterly publication available on disk and CD-ROM from the BMJ Publishing Group, American College of Physicians, Australian Medical Association, Canadian Medical Association Publications, and Medical Association of South Africa.

22. Fullerton-Smith I. How members of the Cochrane Collaboration prepare and maintain systematic reviews of the effects of health care. Evidence-Based Medicine. 1995 Nov-Dec;1:7-8.

23. Sheldon TA. Research intelligence for policy and practice: the role of the National Health Service Centre for Reviews and Dissemination. Evidence-Based Medicine. 1996 Sep-Oct;1:167-8.

24. McKibbon KA, Walker-Dilks CJ. Beyond ACP Journal Club: how to harness MEDLINE to solve clinical problems. ACP J Club. 1994 Mar-Apr;120:A10-2.

25. McKibbon KA, Walker-Dilks CJ. Beyond ACP Journal Club: how to harness MEDLINE for therapy problems. ACP J Club. 1994 Jul-Aug;121:A10-2.

26. McKibbon KA, Walker-Dilks CJ. Beyond ACP Journal Club: how to harness MEDLINE for diagnostic problems. ACP J Club. 1994 Sep-Oct;121:A10-2.

27. Walker-Dilks CJ, McKibbon KA, Haynes RB. Beyond ACP Journal Club: how to harness MEDLINE for etiology problems. ACP J Club. 1994 Nov-Dec;121:A10-1.

28. McKibbon KA, Walker-Dilks CJ, Wilczynski NL, Haynes RB. Beyond ACP Journal Club: how to harness MEDLINE for review articles. ACP J Club. 1996 May-Jun;124:A12-3.

29. Hersh W. Evidence-based medicine and the Internet. ACP J Club. 1996 Jul-Aug;125:A14-6.