Why do we care about the foundations of evidence-based medicine (“EBM”)?

Photo via Latvian Ethnographic Open Air Museum

by Samantha Copeland

CauseHealth has been pushing buttons all over the place lately, as we pursue our goal of critiquing current frameworks in EBM by raising some deep questions about its ontology*. We aren't the only ones: at recent events, we have found that people from various communities in medicine are ready and willing to raise, and engage with, some tough questions about what evidence is and how it might best be used. Reflecting on two recent events we attended, in this blog post I want to think a little bit about the relationship between understanding the foundations of EBM and putting the ideals of EBM to work for us in medical practice.

On May 12, the CauseHealth team hosted a group of keen physiotherapists at a workshop in Nottingham, where leaders of the physio community and philosophers came together to talk about how EBM works, and how it fails to work, in practice. Last Wednesday, several members of the CauseHealth crowd attended a workshop held by the University of Oslo's Institute for Health and Society on 'The foundations of evidence-based medicine.' At both these events, the same debate arose: we want to criticize EBM because of the ideals it proposes, for instance that quantitative evidence is always better than qualitative evidence, but those ideals don't seem to reflect the actual practice of EBM or the actual methods employed by practitioners. Rather, in medical practice, be it the everyday practice of physiotherapists or the theoretical work done by the people who try to create better guidelines, EBM is seen as including all kinds of evidence, depending on what works best for the job that needs to be done at the time. Many don't think of EBM as inflexible, and they argue that they are already doing what the critics of EBM say they should do: incorporating multiple forms of evidence and improving their methods as they go.

There seems to be a disconnect between those who want to critique EBM as a theory and those who are proponents of EBM as a practice. And both sides of this debate seem to be right.

So what is going on? Is EBM accurately described as following a ‘hierarchy of evidence’, or does it appropriately include sources of evidence such as clinical expertise? What version of EBM is the right one? Are critics just being ‘armchair philosophers’ who are ignoring the realities of practice? What about all the practitioners who attend these workshops and present their frustrations with EBM guidelines and restrictions on research?

As Trish Greenhalgh pointed out in Oslo, the ideals of EBM sneak into practice in 'insidious' ways: the preference for quantitative methods leads to the rejection of non-conforming articles by leading journals and of research proposals by funding agencies. Another point raised in discussion last week gave a hint as to how this insidious creep takes place: several speakers and audience members commented on the fact that it is much easier today to produce, disseminate and justify quantitative research than it is to research, and put to use, evidence about patient and social values, or about how evidence is actually being used in practice.

Quantitative methods fit with our current system, and in turn our current system makes it ever easier to do quantitative research. But this does not always have to be the case. The ongoing debate about whether the proponents and practitioners of EBM are being closed-minded about evidence attests to the fact that changes are happening all the time: despite hierarchies being taught in textbooks and detailed guidelines given for practice, clinical and professional experience teaches practitioners over time how to negotiate, manipulate and move beyond the limits of those hierarchies and guidelines.

So EBM can be seen both as a way to evaluate evidence, abstractly, and as a way to approach the use of evidence—of whatever kind—in practice. The man credited with founding the EBM movement, David Sackett, defended EBM against its first round of critics: “The argument that ‘everyone already is doing it’ falls before evidence of striking variations in both the integration of patient values into our clinical behaviour and in the rates with which clinicians provide interventions to their patients. The difficulties that clinicians face in keeping abreast of all the medical advances reported in primary journals are obvious…” There are two sides to the coin. We need a way to assess what the ‘best’ evidence for a given purpose is, but we also need to find ways to put that evidence to use: and these two things are not always the same, and sometimes they may even seem irreconcilable.


EBM is a lot messier than this, said speakers in both Nottingham and Oslo

As always, then, the job of the philosopher here is to question. We need to understand better what evidence is, and how the answer to that question may differ depending on the context we are in. Variations in what practitioners are doing, and in how they think about what they are doing, show how important that context is. If medical research is going to produce useful evidence, then decisions about funding and publication need to be guided less by discussion of what the best kind of evidence is, and more by discussion of what kinds of evidence would be useful. Clinicians shouldn't have to do so much work to make the evidence fit their context, and so the two versions of EBM need to be brought closer together. We care about the foundations of EBM because they influence the kinds of evidence that are made available for use in the practice of EBM.

In the words of Tracy Bury, who spoke at the CauseHealth Physio workshop, medicine is no longer about ‘gurus’—now, we assess the way things are done, not who is doing them. The practice and values of medicine have changed, thanks to EBM. I refer readers to an excellent editorial written by historians of EBM, Timo Bolt and Frank Huisman. They suggest that while EBM in its original formulation was excellent for promoting research that helped to eliminate many bad practices in medicine, it is the wrong approach if we want to “procure good medicine. For that we need something else – such as qualitative research, shared decision-making, clinical expertise and patient values and much more.”

It has become obvious to all that the practice of medicine should be evidence-based. But now we also need to change the way we approach the very concept of evidence, rather than just distinguishing between theory and practice. The discussions happening at these workshops and others show there is a need to start over, beginning with healthcare as a whole (not just doctors and researchers): looking at what works and asking why it works, when it works, and how we can get more of it through better research.


* ‘ontology’ in this case points us to some of the foundational ideas of EBM. The definition of ontology is, traditionally, the ‘study of the nature of being’. The way we are using it in CauseHealth is to point out how the ways that research is done and the kinds of results preferred by proponents of EBM—for instance, the practice of randomized controlled trials and the preference for quantitative methods—are grounded in some pretty specific ideas about what kinds of things medicine is concerned with. That is, people think we should do research in a certain way because there is a certain kind of ‘being’ that is the subject of that research. In the case of EBM, quantitative research tends to be preferred when the medical subject is thought of as chiefly biomedical—a kind of subject that can be described with numbers.

Author: CauseHealth

CauseHealth - Causation, Complexity and Evidence in Health Sciences
