Chapter 1
Evidence-based Practice
One exhibition was doing ear candling for $30. The people selling this said that the suction created by the candle ‘cleared your mind and sinuses.’ I questioned them enough to establish that they meant this literally and believed the ear was an opening from the brain and sinuses. The woman running the booth stated, ‘It cleans the whole head, brains and all – they’re all connected you know.’ The candling was performed on a table at the front of the booth, so the curious sight of a person lying there with a burning candle sticking out of his ear drew many spectators. During the procedure, a gray mixture of soot and wax drippings collected on a pie plate under the candle. It did not look like melted candle wax, but was quite foul in appearance. Customers were told that these were the ‘impurities’ of which they had been cleansed, and many went around proudly showing them off, comparing their debris to that of others, and making knowing comments.
Two investigators tested candles to see whether the wax accumulated after burning came entirely from the candle or included wax that came from the ear. To do this they burned candles with the tip (a) inside the ear, (b) outside the ear, so the wax dripped into a bowl of water, and (c) inside the ear but with a tube in place that would permit ear wax to move into the tube but would block candle wax from moving downward. They demonstrated that all residue originated from the candle and that no ear wax was removed from the ear. (Roazen, 2007)
Introduction
Evidence has of course been collected and used by medical practitioners and others for a very long time, but it has not always been applied systematically. Sometimes tradition and personal experience have become a dead weight, stifling changes to practice even when the evidence indicates strongly that change is needed. Gambrill remarks:
Consider the experience of Ignaz Semmelweiss who, around 1840, discovered that the death rate of mothers from childbed fever markedly decreased if surgeons washed their hands before delivering babies. [Yet] cleanliness was not taken seriously by the medical profession until the end of the century. (Gambrill 1999: 2)
However, it is perhaps not surprising that when ‘evidence-based practice’ (EBP) did emerge, it did so in medicine, which remains its core locus. It is generally accepted that the concept of evidence-based practice was first developed at McMaster University in Canada in the early 1990s, broadening out in the mid-1990s to ‘evidence-based healthcare’. There are various definitions, although a widely accepted one is:
Evidence-based practice (EBP) is an approach to health care wherein health professionals use the best evidence possible, i.e. the most appropriate information available, to make clinical decisions for individual patients. EBP values, enhances and builds on clinical expertise, knowledge of disease mechanisms, and pathophysiology. It involves complex and conscientious decision-making based not only on the available evidence but also on patient characteristics, situations, and preferences. It recognizes that health care is individualized and ever changing and involves uncertainties and probabilities. Ultimately EBP is the formalization of the care process that the best clinicians have practiced for generations. (McKibbon 1998: 396)
EBP is now accepted in medicine and healthcare world-wide. In a number of countries, it forms a mandatory basis for practice. It is backed up by a wide range of services, such as the Cochrane Collaboration (http://www.cochrane.org/), which provides access to systematic reviews of the medical literature. These reviews search the literature for relevant papers on any particular medical intervention (for example, an operation, a course of treatment, screening for the early stages of a disease or advice on a particular problem), consider each study carefully against predefined criteria, aggregate data (where possible) on outcomes into a meta-analysis, summarise the findings and provide a discussion of the implications by experts. Healthcare practitioners world-wide use these reviews to ensure that they are up to date with the latest best practice.
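To make the aggregation step concrete, the sketch below shows, in Python, the fixed-effect inverse-variance pooling that underlies many meta-analyses: each study’s effect estimate is weighted by the inverse of its variance, so that more precise studies count for more. The study labels and figures are invented for illustration, and a real systematic review involves far more, including predefined inclusion criteria, risk-of-bias appraisal, heterogeneity testing and, often, random-effects models.

    import math

    # Each entry: (study label, effect estimate, standard error).
    # These figures are hypothetical, for illustration only.
    studies = [
        ("Trial A", -0.42, 0.21),
        ("Trial B", -0.10, 0.15),
        ("Trial C", -0.35, 0.30),
    ]

    # Inverse-variance weights: more precise studies (smaller standard
    # errors) contribute more to the pooled estimate.
    weights = [1 / se ** 2 for _, _, se in studies]
    pooled = sum(w * est for (_, est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))

    # 95% confidence interval under a normal approximation.
    low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"Pooled effect: {pooled:.3f} (95% CI {low:.3f} to {high:.3f})")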
The ‘gold standard’ for research studies is the Randomised Controlled Trial (RCT), which uses a process designed to exclude, as far as possible, sources of bias and thus to establish a clear cause-and-effect relationship between the intervention and the outcome – for example, between the taking of a drug and the change in a medical condition. In an RCT there is, first of all, a control group: trials of a new medicine, say, involve giving one set of patients the drug while another set is given a placebo. As far as possible the two groups need to be identical in their characteristics, so the sample is randomly split between them. The trials are also double-blind. A researcher who knows which is the real drug and which is the placebo numbers each dose randomly, maintaining a master list of which is which. A different researcher or doctor administers the doses and records each patient’s name or other details against the dose number, but has no way of knowing which doses are the drug and which the placebo, and therefore cannot tell the patient or even reveal this information unwittingly, say by facial expression or gesture. The outcomes are monitored and the data passed back to the first researcher, who can assign them to drug or placebo from the master list. The results should thus be free of bias introduced by knowledge of which procedure has been used.
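The allocation and blinding procedure just described can be sketched in a few lines of Python. This is a toy illustration with invented names and a placeholder outcome measure, not a trial-management system: the point is simply that the administering clinician sees only opaque dose codes, while the master list linking codes to drug or placebo stays with the first researcher until analysis.

    import random

    patients = [f"patient-{i:02d}" for i in range(1, 11)]

    # First researcher: randomise the allocation, then label every dose
    # with an opaque code. Only this master list links a code to its arm.
    random.shuffle(patients)
    half = len(patients) // 2
    master = {}  # dose code -> (patient, arm); kept private until analysis
    for i, patient in enumerate(patients, start=1):
        arm = "drug" if i <= half else "placebo"
        master[f"dose-{i:03d}"] = (patient, arm)

    # Second researcher: sees only dose codes and patient names, never
    # the arm, and so cannot reveal the allocation, even unwittingly.
    blinded = [(code, patient) for code, (patient, _arm) in master.items()]

    # Outcomes are recorded against dose codes; random placeholders here.
    outcomes = {code: random.choice(["improved", "no change"])
                for code, _patient in blinded}

    # First researcher unblinds at analysis time using the master list.
    improved = {"drug": 0, "placebo": 0}
    for code, result in outcomes.items():
        if result == "improved":
            improved[master[code][1]] += 1
    print(improved)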
Controlled trials are not a modern invention. Indeed one is recorded in the Book of Daniel, which was written no later than the mid-second century BCE:
Daniel resolved not to defile himself with the royal food and wine, and he asked the chief official for permission not to defile himself this way. Now God had caused the official to show favour and sympathy to Daniel, but the official told Daniel, ‘I am afraid of my lord the king, who has assigned your food and drink. Why should he see you looking worse than the other young men your age? The king would then have my head because of you.’ Daniel then said to the guard whom the chief official had appointed over Daniel, Hananiah, Mishael and Azariah, ‘Please test your servants for ten days: Give us nothing but vegetables to eat and water to drink. Then compare our appearance with that of the young men who eat the royal food, and treat your servants in accordance with what you see.’ So he agreed to this and tested them for ten days. At the end of the ten days they looked healthier and better nourished than any of the young men who ate the royal food. So the guard took away their choice food and the wine they were to drink and gave them vegetables instead. (Daniel 1:8–16, New International Version)
While that experiment would not meet modern standards (the participants knew which group they were in, for a start!), it does illustrate that the idea behind RCTs is an ancient one. In modern times, credit for the widespread acceptance of the technique belongs to Sir Austin Bradford Hill, who published a landmark paper in 1952. More recently, it has been acknowledged that while RCTs remain critical, the key issue is that the method chosen should be the most appropriate for the question being investigated, and a wider range of techniques can quite properly be applied. For example, ethical considerations dictate that patients should not be given a placebo where it is known that their condition is likely to worsen if they are denied the drug itself.
EBP as Process
It is important to recognise that EBP is not just about the evidence itself, but also encompasses the process by which the evidence is gathered and applied. Five steps have been identified as necessary if it is to be effective in terms of patient outcomes (Straus et al. 2005):
1. translation of uncertainty, which may be about a wide range of problems, into an answerable question (a sketch of the PICO device often used for this follows the list),
2. systematic retrieval of the best available evidence,
3. critical appraisal of that evidence for validity, relevance, and applicability,
4. application of the results in practice (here EBP practitioners have recognised the importance of integrating the ‘hard’ evidence gathered with clinical judgement and patient values), and
5. evaluation of performance – the effectiveness and efficiency of the process undertaken.
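As a concrete illustration of step 1 (flagged in the list above), the sketch below uses the PICO device (Patient/Problem, Intervention, Comparison, Outcome) that EBP commonly employs to turn a vague uncertainty into an answerable, searchable question. The example question and the naive search string are illustrative assumptions, not a prescribed syntax.

    from dataclasses import dataclass

    @dataclass
    class PicoQuestion:
        patient: str       # who or what is the question about?
        intervention: str  # what course of action is being considered?
        comparison: str    # compared with what alternative?
        outcome: str       # what effect matters to the patient?

        def search_terms(self) -> str:
            # A naive conjunction of the four elements, as a starting
            # point for the systematic retrieval in step 2.
            return " AND ".join(
                [self.patient, self.intervention, self.comparison, self.outcome]
            )

    question = PicoQuestion(
        patient="adults with ear wax build-up",
        intervention="ear candling",
        comparison="irrigation or no treatment",
        outcome="wax removal and adverse effects",
    )
    print(question.search_terms())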
Thus the framing of the question and the search for relevant, valid, reliable and applicable evidence to illuminate it are critical, as is the review of performance once the evidence has been applied in practice. EBP therefore expects an iterative approach in which practitioners reflect on and learn from their experience.
Beyond Medicine
It is hardly surprising that a paradigm which has proved so popular and become so widespread in medicine should be taken up by other professional disciplines. Some of these are allied to medicine, such as nursing and physiotherapy. Some are associated with it, as with librarianship where information practitioners in health services were charged with locating the medical literature and started to apply the concept of EBP to their own discipline, spawning evidence-based librarianship (EBL) and evidence-based information practice (EBIP). Some are professional disciplines within the social sciences, such as education and social work, which were seeking more systematic ways to address pressing issues of professional practice. Others are of a more general nature, as when EBP is applied to management. Others again are specialised fields. Ray, for example, writes:
It is too easy for professionals to use their specialist knowledge as a means of control. Architecture has been particularly bad at setting out the evidence from which a decision can be taken and at continuing to build that body of evidence from experience in use. [But] the move towards research-based design and evidence-based practice is beginning to be accepted by sections of the profession. (Ray, 2005: xvi)
The growing interest in EBP outside healthcare has not, however, seen the overwhelming take-up that it achieved in its original home. It is not that the idea of basing decisions on reliable evidence is rejected – far from it. But often the complexity of the social setting and the intricacies of contextual factors make it much harder to evaluate the usefulness of a particular piece of evidence in a particular setting. As Coe (1999) remarks in a manifesto for evidence-based education:
Education is so complex that slight and subtle differences in context may make all the difference to the effects of some change in policy or practice. One of the things that makes teaching so interesting is that what works for me may not work for you, and even may not work for me the next time I try it. This makes it seem unlikely that the kinds of simple, universal strategies much beloved by governments will produce the improvements intended, although only by trying them can we really be sure. However, this complexity does not mean that there can never be any worthwhile evidence about anything. A more useful kind of evidence would be that which seeks to throw light on exactly which features of the context are important, to find the conditions on which the outcome depends. When we have this kind of evidence we will understand better which strategies are likely to be most effective in each situation.
Before exploring in more detail the kinds of evidence which are needed to make sense of complexity and context, it is useful to examine just why EBP has succeeded in becoming so popular in its home discipline of medicine. It is striking that such a major change in the dominant paradigm of medical practice occurred so quickly and spread so rapidly. Why should this have happened? In seeking to answer this question we need to look far and wide, at broad societal trends as well as at the subject itself. Ten areas of exploration suggest themselves.
EBP and Professional Status
Over many years, medical practitioners had elevated the status of their profession until the doctor had become one of the most important members of the community. This reminiscence would be typical:
When I was young, we were poor as church mice and we were living in a country town. And there was a sort of holy trinity: there was the headmaster of the school and the bank manager and the local GP (general practitioner) – God-almighty, you know! And it would not matter what sort of clod he was, he was ‘Doctor’. When I was a very small child, if my grandparents, for example, had cause to call ‘Doctor’, the thing to do was to put out a fresh cake of soap and a basin and bring in water for ‘Doctor’. (quoted in Lupton, 1997: 480)
However, by the final quarter of the twentieth century, this societal respect was beginning to break down. In large part this came about through broader changes in society, with shifting attitudes to authority and a much-increased willingness to question accepted norms. In the case of doctors, however, it was also influenced by some well-publicised cases which demonstrated that ‘Doctor’ certainly didn’t always know best. As a result we have at times returned to a more medieval characterisation of doctors, typified by Ben Franklin’s cynical observation, ‘God heals and the Doctors take the Fee.’ Or in Matthew Prior’s lines,
You tell your doctor, that y’are ill
And what does he, but write a bill. (Porter 1994: 1714)
To counter such cynicism, the medical profession has gone to great lengths to protect its reputation by creating robust mechanisms to ensure that only the ‘best’ evidence is used. Where medical practitioners fail in this requirement, and particularly where they neglect available evidence as to the most efficacious procedure, they are challenged, if necessary admonished and in extreme cases struck off the medical register. Public respect for the profession is reinforced by the perception that such mechanisms exist. Even though the public may not be familiar with the concepts of RCTs and EBP, there is an appreciation that in general, perceived exceptions aside, professional standards are applied. EBP underpins the assurances which the public can be given and thus acts to reinforce the notion of the professionalism of the medical practitioner.
The Scientific Research Base
Medicine has developed, and continues to expand, a highly sophisticated and extensive body of scientific research with extensive quality-assurance processes, based on peer review and on mechanisms like the RCT. Comparison between the results of different studies through the process of systematic review has enabled the development of a number of authoritative sources of advice, such as the Cochrane Collaboration referred to above.
It is important to stress that the modern dominance of scientific evidence over folk wisdom, tradition, divine revelation and individual experience is part of a process which has been in place since the Enlightenment of the eighteenth century. The period between the Glorious Revolution in England and the French Revolution (and including the American Revolution), with its emphasis on the role of reason as the basis for authority, saw in the West a ferment of new ideas and attitudes. This revolutionary movement, which took as its foundation Immanuel Kant’s motto, sapere aude (dare to know), constructed a new meta-narrative which privileged scientific knowledge. It mixed classical learning with a willingness to challenge accepted wisdom and formed the dominant values of Western society for the next two hundred years – indeed, we still live with it. As Gay puts it graphically in his major work, The Enlightenment: an Interpretation:
The Enlightenment, then, was a single army with a single banner, with a large central corps, a right and a left wing, daring scouts, and lame stragglers. And it enlisted soldiers who did not call themselves philosophes but who were their teachers, intimates, or disciples… The Enlightenment was a volatile mixture of classicism, impiety, and science. (Gay, 1966: 6)
This is the context within which EBP has risen to prominence. The Enlightenment value of reason has resulted in huge scientific advances that have benefited all of humankind, albeit with many unintended consequences which we must now face, among them climate change. However, few suggest that such problems mean that we should turn our backs on the scientific endeavour. Rather, the consensus is that we need to redouble our efforts to gather the evidence needed to make wise and far-sighted decisions for the benefit of all. EBP is one manifestation of this imperative.
Information and Communications Technologies
The ability which information and communications technologies (ICTs) provide to generate huge amounts of information, to process it and to deliver it selectively to a world-wide audience has clearly transformed the practice of medicine. Not only are findings from across the world more widely available, but new discoveries can be communicated rapidly to virtually every medical practitioner on the planet. There can be little doubt that the combination of different ICTs (for example, software for data capture and analysis, pre-print and post-print archives, Web publishing, alerting services) has resulted in a much bigger, and more rapidly changing, literature. Purely in quantitative terms, it is reported that
… between 1978 to 1985 and 1994 to 2001, the annual number of MEDLINE articles increased 46%, from an average of 272,344 to 442,756 per year, and the total number of pages increased from 1.88 million pages per year during 1978 to 1985 to 2.79 million pages per year between 1994 to 2001. (Druss and Marcus 2005: 500)
ICTs also enable individual practitioners to manage their access to, and thus use of, evidence. Where in the past medical practitioners might subscribe to a small selection of journals, such as the British Medical Journal or The Lancet, they can now access selected papers from any of tens of thousands of different journals with near-instantaneous delivery (always allowing, of course, for the vagaries of subscription deals). Systematic reviews are similarly available, so that the best available evidence for the appropriate diagnosis or intervention can be identified readily, even when the symptoms are unfamiliar. Furthermore, the aggregation and analysis of data on interventions are greatly facilitated by ICTs, especially when the basic data – patient records – are available in electronic form. All of these advantages lead to greater use of recorded information, which becomes the content on which EBP can be based. The fact that EBP rose to prominence at exactly the same time as end-user computer systems became commonplace is surely no accident.
Information Overload
The increase in the amount of information published, together with developments in ICTs, has led to the commonly experienced phenomenon of information overload. Not only is an almost limitless range of information available for access on the Web, but much information is pushed to the individual through email, instant messaging, SMS text, voicemail, RSS feeds, aggregators and, of course, paper. It is not only the amount of such information which causes problems: because the various systems are independent of one another, it can be very difficult to sequence messages. For example, is the latest information to be found in the voicemail I just listened to or the email that has just appeared on my screen? How can I sensibly store all these different messages in ways which create meaningful order from them? Why is this message labelled ‘urgent’, and is it really?