

Editorial

What makes evidence-based journal clubs succeed?


ACP J Club. 2004 May-June;140:A11. doi:10.7326/ACPJC-2004-140-3-A11



Do you catch up on valuable rest time once a week at your local journal club? Or doze while somebody presents an article that has been allocated to them, without reference to “question,” “search strategy,” or “assessing performance”? While the rest may bring health benefits, it is unlikely to advance the quality of care. Evidence-based journal clubs, however, have documented benefits (1).

Having made most of the possible “errors,” we’d like to share some tricks and traps that we think make evidence-based journal clubs work or not. We have gathered our information from personal experience, a systematic search of the literature, and stories we have been told by colleagues and members of the evidence-based health care mailing list (see Acknowledgments). One of us (PG) runs an evidence-based journal club in general practice; the other (RSP) runs an evidence-based journal club in the pediatric department of a teaching hospital (2) and facilitates journal club meetings for pharmacists. While running these disparate events, we, quite separately, stumbled on some of the same tricks and traps, many of which are supported by the findings of a large survey of the factors that predict the life span of (any) journal club (3).

Organizing journal club sessions

The structure of successful evidence-based journal clubs varies. Commonly, the clubs run in a cycle. Our own medical journal clubs run over the same 2-session cycle (Figure). The cycle may be weekly, but other time frames are possible. The last 10 to 15 minutes of each session are spent discussing participants’ real clinical problems and defining the structured clinical questions that would help address these problems. A process of moderated “voting” on the questions selects the most popular ones, and then someone is assigned the literature search as homework. The first 45 minutes of the next session are spent appraising and applying the studies felt to represent the best answers to the questions raised in the previous session. If there is a wide range of studies, the work may be spread across several sessions.

Initially, we ran the clubs over a 3-session cycle (question, search, and appraisal) (Figure), with each cycle including a review of the search strategy. However, this review led to boredom, particularly if there were problems along the way. Assigning a facilitator to assist with searches during the week (between sessions) helped a little. Searching, however, may be the weakest part of the evidence-based journal club experience (4), so it could be that de-emphasizing it in this way is counterproductive.

There are many variations on this structure. Some clubs run on a 3- or 4-session cycle (with different combinations of question generation, search, appraisal, and presentation of a critically appraised topic [CAT]). Other clubs decide which question from which study will be discussed before the session, then distribute the article and critical appraisal worksheet to the participants. Each minigroup of participants (2 to 3 people) is allocated a part of the appraisal as their task, and the club begins by collating the answers to kick-start discussion. Another hospital-based group runs a “reverse journal club”: The presenter asks a clinical question and then asks the audience what type of study design would best answer the question. This question-and-answer process builds the framework to critically appraise the chosen article. The preselected article is then handed out, and the appraisal is virtually complete.

Yet another approach is a presentation in which the speaker guides the audience from the clinical scenario, through the question formulation and search strategy, to an appraisal, and then generates a CAT, which is made available on a Web site (5). Finally, a recent development is a “virtual” journal club on the Web, for which a good example exists in pediatric critical care. In this model, participants sign up to do the primary appraisal of an article, and the discussion is run as moderated comments attached to the appraisal. The great advantage of this model is the number of participants and the diversity of their locations and time zones.

Assigning roles

Running a journal club involves allocating several roles. In addition to the presenter, the group needs a facilitator to help the discussion along and focus the group on its task. A scribe is helpful in recording the discussions of the group, including creating a CAT. A host is helpful for introducing new members (and passing around snacks!). Someone also needs to provide administrative support, such as making copies of the article and critical appraisal sheets. How these roles are filled differs among groups, but the most successful groups have a fixed facilitator, who organizes the other roles. Some groups have a flipchart scribe who facilitates discussion; others have a member using a data projector and CATmaker (www.cebm.net/downloads.asp) who takes notes and builds the group CAT. The nature of the group (e.g., hierarchies, location, critical appraisal knowledge, and skill mix) affects how the roles are distributed.

Traps

We found a few things that didn’t work in an evidence-based journal club, and a number of things that probably helped a lot. One thing that we, and others, have found difficult is trying single-handedly to induce a traditional journal club to perform critical appraisals as a small-group learning session. In one case, humiliation followed (5). Enthusiasm needs to be combined with facilitation skills and an appropriate structure. Especially with clinicians who are new to the processes of evidence-based medicine, pressure to finish the paper leads people to skip the appraisal and focus on the results, as they would in a traditional setting.

Sending out articles before the journal club meeting seems to have mixed, but mostly negative, results. In our experience, expecting people to read articles independently before a regular meeting (and bring their copies with them) is a waste of time and paper. At most, 20% read the paper. If you then leave time for the rest of the people to scan the paper, the ones who have already read it get annoyed. If you leave no time to read the study, then most people are left adrift and are less likely to return.

Tricks

On the other hand, a number of tricks seemed to help. Answering individual questions is central to both education and motivation. But make sure in your early sessions that you have a “planted” scenario or question in your group. Early on, people seem keen to come up with questions focused on the rare, unusual, and wonderful diagnoses they have bumped into rather than questions about their everyday practice. Being 1 step ahead with prepared dilemmas and questions about asthma, diarrhea, or earache helps a great deal. If your group votes on which question to choose, you can summarize the clinical questions and add comments about the likelihood of success to sway public opinion.

Providing food at an educational meeting improves attendance (3); once people have turned up, it’s much easier to try to turn them on to whatever the topic is.

To improve attendance, use good signposting: when and where the club meets, what the topics are, and their probable relevance to everyday work.

Start your sessions with a review of the clinical question, and allow 5 minutes to scan through the chosen articles.

In addition to having enough copies of the week’s article, having a backup article (or articles) in your bag is essential. There will be times when a good question with a good search leads to no articles, or to 1 article with a 3-week lag time in getting a copy from the library. Having nothing to do can kill momentum, and people will drop out of the club. A store of little gems goes a long way toward counteracting this. The articles we have stockpiled include those with good clinical information, great teaching points to help with the methodology of appraisal, and pages from current issues of Evidence-Based Medicine.

If the article being reviewed seems only vaguely related to the question, take the opportunity to critically appraise the article’s methodology more deeply to try to get some learning out of the session. It is useful to have photocopies of 1-page appraisal tools or the EBM validity criteria to pass around.

Create a learning logbook of CATs as you go along, on a computer if possible. This gives your club a tangible product and a reference to reread when the question is asked again in a month and no one can remember the answer.

Finally, it’s useful to end the session by asking everyone for their clinical “bottom line.” You might even want to follow this up with group decisions on actions needed to implement the evidence (e.g., put up a flowchart or buy the necessary equipment) and possible monitoring items (e.g., proportion of patients on aspirin or podiatry referrals).

The common themes in successful journal clubs seem to be that they are truly question-driven and appraisal-focused and seek to generate a written record (often as a CAT, or sometimes a BET [Best Evidence Topic; www.bestbets.org]). Enthusiasm and relevance (and free food and drink) all seem to encourage clinicians to take part in these educational events.

Journal club principles

Focus on the current real patient problems of most interest to the group.

Bring questions, a sense of humor, and good food.

Distribute (and redistribute) the time, place, topics, and roles.

Bring enough copies for everyone of the week’s article and a backup.

Keep handy several copies of quick (1-page) appraisal tools.

Keep a log of questions asked and answered.

Finish with the group’s bottom line and any follow-up actions (e.g., tools, flowcharts, audits, further searches).

Acknowledgments

We thank the following individuals among others who have helped with our research: Anne-Marie Bagnall, Mike Bennett, Mike Crilley, Kev Hopayian, Rod Jackson, Barry Markovitz, Victor Montori, John Nixon, and Mike Smith.

Robert S. Phillips, MA, BM, BCh, MRCPCH
Centre for Evidence-Based Medicine
Oxford, England, UK

Paul Glasziou, MBBS, PhD
Centre for Evidence-Based Medicine
Oxford, England, UK


References

1. Ebbert JO, Montori VM, Schultz HJ. The journal club in postgraduate medical education: a systematic review. Med Teach. 2001;23:455-61. [PubMed ID: 12098365]

2. Phillips B, Butter J, Collins C. Journal club. Acta Paediatr. 2001;90:592. [PubMed ID: 11430726]

3. Sidorov J. How are internal medicine residency journal clubs organized, and what makes them successful? Arch Intern Med. 1995;155:1193-7. [PubMed ID: 7763125]

4. Coomarasamy A, Latthe P, Papaioannou S, et al. Critical appraisal in clinical practice: sometimes irrelevant, occasionally invalid. J R Soc Med. 2001;94:573-7. [PubMed ID: 11691894]

5. Thayyil S. Are we ready for evidence based medicine? e-letter. Arch Dis Child. 2002.



Figure. Alternative sequences for journal club sessions: 2-cycle (upper) and 3-cycle (lower) structures.
