There is a big problem with evidence-based practice (and why it isn’t worth it for most people)…

As a pedlar of evidence-based practice, I haven’t found this an easy post to write, but after years of work in this area, and having built a service around evidence-based practice, I have come to a conclusion: there is a really big evidence-based practice problem that people outside of health and technology haven’t acknowledged.

In short, there is a big problem with evidence-based practice.

 

 

People don’t want evidence-based practice

 

The problem with evidence-based practice is that, outside of areas like healthcare, aviation and technology, most people in organisations don’t care about having research evidence for almost anything they do. That doesn’t mean they are not interested in research; they are just not that interested in using the research to change how they do things. Period.

 


 

Evidence-based practice is a bit of a fail

 

Ok, so let’s look at what people at the top of the EBP tree say. Some recognise the problem…

Professor Rob Briner, Professor of Organizational Psychology at Queen Mary University of London, has a hobby horse: almost every conference he attends is populated by evidence-based practice converts, people with skin in the game. People like researchers, organisational development professionals, academics, or practitioners turned academics.

 

[Image: Professor Rob Briner]

 

Very few conferences have significant numbers of managers or workers from the ‘shop floor’ or operations.

Outside of health and technology, attendance at such events, and engagement generally, appears to be largely confined to the academically minded converts.

 

 

So why?

 

Considering the point of EBP is to engage operational people in organisations, to bring the research to them to help them make better decisions and work better, this appears to be a bit of a fail.

To underline that, have a look at this…

I posted this: https://www.linkedin.com/pulse/6-important-factors-ensure-change-success-david-wilkinson on LinkedIn.

At this moment in time the post has had 1 like, 2 comments and 1 share. Ok, not great figures.

 


[Image: screenshot of the LinkedIn post]

 

 

The first comment was this:

 

[Image: screenshot of the first comment]

 

 

As you can see, not exactly a ringing endorsement for evidence-based practice!

… and it got more likes than my original post!

 

 

So who does engage with EBP?

 

Well, largely, but not wholly, occupations that have close ties with academia. I lecture in the Medical Sciences Division at the University of Oxford, and a fair bit of my teaching is conducted in the hospitals in and around Oxford.

The professors and many of the lecturers are mainly, but not wholly, practising medics. The University and the hospitals are closely intertwined; the John Radcliffe (or JR) is part of the Oxford University Hospitals group. The same applies at the other university in Oxford, Oxford Brookes University, which looks after nursing, a discipline that is also heavily evidence-based.

 

 

[Image: Oxford University Hospitals]

 

 

A close academic-practice connection – really close

 

In fact they are so closely allied that it is impossible to distinguish between the academic side and the practice side. This, I believe, is an important part of the picture. The practitioners are the researchers and teachers, and the researchers and teachers are often practitioners or ex-practitioners. It is hard to slide a sheet of paper between the two.

My lectures and sessions are in the operational context. My students are often on duty. They are doing the do, as they say. The people I teach are the next batch of lecturers and researchers, but they are the practitioners. Little separation.

Now, this doesn’t mean that we don’t have practitioners who don’t teach, or researchers and lecturers who don’t practise. We do. Lots, in fact.

 

But, and this is important –

 

 

The teachers and researchers are embedded in the culture of practice

and

the practitioners and operational staff are embedded in a culture of evidence and research.

So is it just a question of culture?

Well there is something else…

So what about the sectors, industries and organisations where EBP isn’t taking off?

Part of the problem in the areas, sectors, industries and organisations where EBP isn’t taking off is this…

Who cares about using evidence for anything?

Largely, outside a few industries, organisations and the odd zealous individual or group, people don’t give a flying fig about evidence-based practice, what it means or even what it is about.
The majority of people in organisations are getting on with it and, to be honest, just don’t need something else to make things harder or more complicated.
I can feel the evidence-based practitioners shouting “But, but, but…” about now. So look at the evidence. Are the people in your organisation battering your door down, chattering with excitement, saying things like “please give us more about evidence-based practice” or “OMG, I am just desperate for some more evidence about this”? Well? Need more evidence?
Search volumes
Ok, let’s look at some search volumes for evidence-based practice. Here we turn to some data from Google. Here are the search trends for some topics…
Evidence-Based Practice
This is the search trend data for the term evidence-based practice over the last 5 years:
[Chart: Google Trends, ‘evidence-based practice’, last 5 years]
It looks reasonably healthy. If anything there has been a slight decrease.
Let’s widen the search to see how many more searches are being made…
Evidence-Based
Here is a comparison between the term evidence-based and evidence-based practice.
[Chart: Google Trends, ‘evidence-based’ vs ‘evidence-based practice’]
As we would expect with a broader term, there are higher search volumes. Oh, the big dips are Christmas, by the way… which says something about who is doing all the searching. More on this later.
The search for research
So let’s see. I suppose this whole EBP thing is part of the search for research. So let’s add research to the search volumes and see what happens…
[Chart: Google Trends, ‘research’ vs ‘evidence-based’ vs ‘evidence-based practice’]
As we might expect with something as broad as the term ‘research’, it has even higher search volumes. However, something interesting has happened: the term ‘evidence-based practice’ disappears comparatively and the term ‘evidence-based’ almost flatlines. EBP really is a bit of a backwater compared to the idea of research.
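(If you want to check or reproduce these comparisons yourself, one quick way is to query Google Trends programmatically. The sketch below uses the unofficial pytrends Python library purely as an illustration; it is not the tool used to produce the charts in this post, and the keyword list and timeframe are just example choices.)

# A minimal sketch, assuming the unofficial pytrends library is installed (pip install pytrends)
from pytrends.request import TrendReq

pytrends = TrendReq(hl='en-GB', tz=0)

# Compare relative search interest in a handful of terms over the last 5 years
terms = ['evidence-based practice', 'evidence-based', 'research', 'management']
pytrends.build_payload(kw_list=terms, timeframe='today 5-y')

interest = pytrends.interest_over_time()  # pandas DataFrame, one column per term

# Google Trends scales interest 0-100 within each query, so the per-term means give
# a rough sense of how the terms compare with one another
print(interest[terms].mean().sort_values(ascending=False))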
So let’s try to add a bit more perspective.
Management is searched for more than research, and way more than EBP.
[Chart: Google Trends, ‘management’ vs ‘research’ vs ‘evidence-based practice’]
And the weather is a lot more interesting than research, and as far as evidence-based practice is concerned…
And as for porn…
[Chart: Google Trends, ‘porn’ vs the other terms]
…no competition!
Yup, people are a million times more interested in porn than in evidence-based practice. People just don’t give a monkey’s toss about evidence-based practice, or you, or what you are doing. When the weather is more interesting than what you are trying to do, you have a problem….
But why? Well here is a clue…
So I thought: I wonder what is more popular than the weather, more popular than things like Miley Cyrus, Donald Trump, the news and, yes, even more popular than sex and porn?
It’s this….
How to

[Chart: Google Trends, ‘how to’]

People want to know how to do things… practical things.
Eh?
But isn’t the bottom line of evidence-based practice about how to do stuff?
Yes.
So what’s the disconnect?
Wrong language – wrong planet
Because you are talking the wrong language. In fact, you are not just talking a different language to normal people and organisations, you are a different species on a different planet, in a different universe… and guess what?
Talking louder isn’t going to fix this one.
Let me show you something…. let me show you how bad this is…
See this?
[Figure: Briner’s (2014) hierarchy of evidence]

 

So before I start on this, let me first make an apology… to Rob Briner. What you see here is a symptom of the cause of the lack of engagement.

Ok, so let me start by saying I am not critiquing Rob or his message. I actually agree with most (not all) of what he says. Hey, this is academia. It’s all about the argument.

I get this in terms of rigour. It’s not without its problems – more of that later… but this roughly represents the state of rigour in EBP.

But compared to porn and the weather it’s a bit freaking boring… no, actually, it’s a lot freaking boring, unless you are an active researcher,

and that is part of the problem – but before I advance this, let’s look at something…

The point Rob makes with this is that the lower down the hierarchy we go, the more bias… well, that’s not quite true; actually, it’s an increased chance or risk of bias.

 

 

[Figure: the hierarchy of evidence, risk of bias increasing towards the bottom]

 

 

As we go up the hierarchy, the chance of identifying the factors involved in causality also increases:

 

 

[Figure: the hierarchy of evidence, confidence about causality increasing towards the top]

 

 

Ok all good yes?

So why is this a pyramid? Why is there more opinion, anecdote and the like, and fewer systematic reviews?

Well a couple of reasons…

 

 

Research dependencies 

 

The first thing is that the research at the top depends on the stuff at the bottom.

You can’t do a systematic review if there isn’t anything to review. Research works up the hierarchy. So, someone notices something, says “that’s odd” or “that’s interesting”, and mentions it to someone else. They do a case study or write a blog. That gets picked up by someone doing research who says “let’s see what evidence there is for this”, and so on up the ladder.

 

[Figure: research dependencies, evidence flowing up the hierarchy]

 

In effect, the things at the top DEPEND on the flow up the hierarchy from below them. As I said, you can’t have a systematic review if you haven’t got anything to review. So look at this…

 

 

How people make decisions at work

 

[Chart: CIPD figures, how frequently different sources of evidence are used]

 

This comes from the CIPD HR Outlook Winter 2016/17 report and is the result of a survey of 629 senior HR professionals; it looks at how frequently they use each source of evidence when making business decisions.

 

 

80% of operational decision-making

All of these… 80% of this chart are the very things that sit on the bottom rung of Briner’s hierarchy of evidence.

 

Research-based decisions

 

And then you look at the numbers of people using these…

 

[Chart: how frequently research is used in decision-making]

 

 

The figures tell a powerful story, and they speak for themselves:

at the bottom, 22% use research always or often, and

at the top, 77% use personal experience always or often.

 


 

It’s not even close…

 

 

Intuition beats evidence-based practice into a cocked hat…

 

Even intuition gets a better press than research!

 

[Chart: how frequently intuition is used in decision-making]

 

But why? Are these uneducated, ignorant people?

I don’t think so. Many will have degrees and higher degrees such as a Master’s. Remember, this survey was of managers and leaders…

It’s not a lack of education…

But I think it may be something else…

 

 

The world’s oldest profession

 

The world’s oldest profession…. and it’s not what you are thinking…

 

It’s the biggest industry on earth – bar none.

Storytelling

[Image: storytelling]

 

Every book,

every film,

every news item,

and just about every Facebook post and every tweet is a story.

Social media are all about stories…

 

 

We live in a world of anecdotes and stories

 

 

What do you notice at the bottom?

 

[Figure: Briner’s (2014) hierarchy of evidence, again]

 

Stories…

We as humans are preprogrammed to tell and listen to stories. You probably spend more time every single day of your life listening to, creating and telling stories than on all other activities put together.

So why are we surprised that this is the number one source of evidence for work-based decisions?

 

So, if we want to get people interested in evidence-based practice, hitting people over the head, telling them their evidence is crap, or that there is no evidence for what they are doing, isn’t really going to win many friends or influence them…

maybe, just maybe, we should be flipping this on its head…

 

[Figure: the hierarchy of evidence, flipped on its head]

 

 

It’s a funnel. A funnel of engagement. In the entrepreneurial side of my world there are things called marketing funnels…

 

 

[Figure: a marketing funnel]

 

 

Awareness, interest, consideration, intent, evaluation, purchase… all that has to happen before anyone will buy your product or idea.

 

People aren’t going to buy evidence-based practice until they are convinced it will solve their problems.

 

But, and it’s a BIG but… you are not selling evidence-based practice though… remember this?

 

[Image: monkeys]

 

You are better off talking about porn or the weather rather than evidence-based anything…

PEOPLE AREN’T INTERESTED in EBP!

 

So don’t even try to sell it. There is no point….

 

…at first at least.

 

 

But we do know what they want,

what they really really want…

 

 


 

…inform business decisions…

Even more than porn and the weather…

 

 

How to do stuff

 

Stuff like… How to make good decisions!

 

 

STOP beating people up and for the love of …

STOP selling Evidence-Based Practice

 

 

Use these…

 

 


 

Use stories, case studies and anecdotes to capture people.

Get their interest first.

 

 


 

 

Tell stories, capture interest, stop being stuck up about systematic reviews, this method or that method… that’s for later…

Research methods, for most people,

are boring

and

most people don’t care!

You ain’t going to capture anyone like that.

That’s for the people who buy evidence-based practice…

but before then…


 

go out and have some fun,

slay a few dragons,

help people to do stuff bit by bit.

Slowly slowly draw people in.

 

That’s it! I hope this has been interesting but, more than anything, I hope it has sold you on the bottom of the hierarchy: the vital importance of storytelling, anecdotes and case studies in selling evidence-based practice.

 

 


David Wilkinson

David Wilkinson is the Editor-in-Chief of The Oxford Review. He is also acknowledged to be one of the world's leading experts in dealing with ambiguity and uncertainty and in developing emotional resilience. David teaches and conducts research at a number of universities, including the University of Oxford (Medical Sciences Division), Cardiff University, Oxford Brookes University School of Business and many more. He has worked with many organisations as a consultant and executive coach, including Schroders, where he coaches and runs their leadership and management programmes, Royal Mail, Aimia, Hyundai, the RAF, the Pentagon and the governments of the UK, US, Saudi Arabia, Oman and Yemen, for example. In 2010 he developed the world's first and only model and programme for developing emotional resilience across entire populations and organisations, which has since become known as the Fear to Flow model and is the subject of his next book. In 2012 he drove a 1973 VW across six countries in Southern Africa whilst collecting money for charity and conducting on-the-ground charity work, including developing emotional literacy in children and orphans in Africa, among other activities. He is the author of The Ambiguity Advantage: What great leaders are great at, published by Palgrave Macmillan. See more: About David | David's Wikipedia Page

  • Pam says:

    Great article. I remember feeling a huge sense of relief when I realised I could do a qualitative research study as part of my MSc without the need to get my head around quant and stats! I’ve worked as an organisational consultant to business for more than 25 years. In practice, have I noticed a big shift in clients demanding evidence? Not so much. It’s useful to quote research, and even to volunteer to do pilot research first, but once you’ve sold in they are not that interested. In fact, since the 90s I have seen a reluctance in businesses to invest in their own research (e.g. in-house, well-researched competency models gave way to off-the-shelf, tweaked models from anywhere). Organisations are often not investing in their people’s development and are relying on Google searches to find out how to design their organisation, develop their people, and recruit them very cheaply. So in some ways I see things going the other way. But hey, that’s just anecdotal personal experience, so do take this with a pinch of salt. To the uninitiated, statistics/evidence-phobic reader I would recommend reading Richard Nisbett’s Mindware. It introduces stats, probability and questioning techniques in easily accessible ways. Highly recommended.

    • Thanks for this Pam. Yes Mindware is a good book. Interestingly we are seeing quite a rise in interest in things like The Oxford Review and other research based sources. The police and managers in particular are embracing EBP in greater numbers than a few years ago. The pendulum may be swinging.

  • Jon says:

    Good stuff. Thank you for a very enjoyable and informative read!

  • Leon Casaubon says:

    Most people who try to implement new things related to measurement based on statistics find that the tests they conduct can lead to evidence, yet the tests themselves may not be complete enough. I have spent quite some time implementing rehabilitative programs for offenders. While the evidence-based programs (EBP) are good, I believe there is much more validity in outcomes for a simple factor that is much more difficult to measure – how much the care givers care. I hope we continue to progress on EBP. It certainly is not for everyone.

  • David Swaddle says:

    Is it surprising that people use personal experience more than evidence based research? I don’t think so.

    The first time somebody does something new they’ll usually find somebody else to ask. That person’s ideas may or may not be from EBR. If they can’t find somebody to ask, then they’ll usually try and find out how to do what they want to do, which these days means a web search. When they find a web page, Wikipedia article, or book, it might be based on EBR or might not. The searcher might be good at evaluating the quality of the search results and pick something from EBR. Let’s say it’s 50:50 whether they’ve found something based on good evidence (likely it’s a lot lower chance).

    Next time the person goes to do the same thing they’ll probably do it based on the experience of the last time they did it. To most people their ‘how to’ is now based on personal experience (#1) not EBR. If a colleague comes to this person for advice then that colleague will say their answer was based on the judgement of an experienced colleague (#2) rather than EBR.

    Marketers know all about these patterns – that’s why celebrity endorsements are so important. Dr Dre puts his name on headphones then they must be great headphones because Dr Dre is all about sound. Before you know it people like me, who’ve never worn a pair of Beats, know that they’re meant to be really good. Viral marketing kicks in and soon you’re making billions and selling out to Apple.

    If you want to spread EBR in organisations you work with then think about who are the rockstars. Work with them to spread the messages in the EBR.

    People don’t have to know why stories are great (they are) – all they have to know is that stories are great. If {insert favourite leader/educator/corporate guru} says that stories are great to spread good practice then their followers will soon be using stories to spread good practice without knowing the EBR it’s based on. Unfortunately the same pattern applies to snake oil.

    So don’t despair. Be very happy that 1 in 5 business people will usually try and use scientific evidence first hand. That’s a good number and means that researchers aren’t as isolated from business as they could be.

  • Dr Henry Hornstein says:

    This article and the few comments that have been posted are nothing more than rationalizations for maintaining the status quo. Consultants should do more than give clients what they think they want and do things the way they have always been done.

  • Morgan David says:

    I have set up a consulting business aimed at applying evidence-based social psychology techniques to issues organisations face. As you may guess, I cannot really go in the opposite direction: managers, and people in general, don’t care much about the methods and techniques you can use, they want results! Whether you use carefully designed randomized controlled trials or tarot cards isn’t of interest to most of them (and believe me, the situation is even more critical down here in France…). The good news is, in my opinion, that things are starting to change, but at a ridiculously slow pace. Actually, the more investment (financial or otherwise) is at stake, the more people may want to rely on solid evidence, and this is where we come in. Despite this, and even if, on average, EBP is more welcome, the split between “rational decision-makers” and people influenced by pseudo-scientific disciplines (have you ever heard of geobiology???) becomes wider over time.
    To conclude, building elegant storytelling around our methods and practices is of course essential and necessary, for all the reasons described above. That may sound frustrating to us, as our reliance on EBP is our distinct hallmark. I used to make a point of highlighting it until I realized that it had no additional convincing power and may even put people off… So let’s keep on emphasizing the very results our methods can lead to (quantitative evidence may sometimes be useful), and also pointing out the absence of results that pseudo-scientific methods get. That may be easier to do in social psychology than in other fields, though it is worth trying. In addition, it makes you focus on what your client will get rather than the technique you’ll use…!

  • It is a good article. Even in higher vocational education, where I work, most people are not really interested in EBP. It can happen that on day 1 a researcher gives a presentation about the best ways of testing students’ knowledge, complete with research data, and also about things that are definitely not a good idea. On day 2 someone from the ‘policy department’ comes along with exactly the opposite views, with no evidence. But ‘he believes in them’. Or is this just happening in the Netherlands?

  • Nick Shackleton-Jones says:

    I enjoyed this piece. If I had to summarise it, you seem to say that business (and business people) are persuaded by narratives rather than research.

    Although this is certainly true in part, I think it misses a fundamental point: the situation is more analogous to a bridge-builder and a physicist.

    The bridge-builder is tasked with using available materials to build a structure that will carry people over the bridge. The physicist points out that reliable research regarding the performance of the materials has yet to be carried out. The builder shrugs and builds the bridge anyway.

    The builder is not contemptuous of research – he may well have several degrees in physics. He will also have many years of practical experience. His challenge is to build something that works today, that his academic colleagues can explain tomorrow (not the other way round). His practical know-how is possibly decades ahead of the research. (This point is made by Nassim Taleb in his book ‘Antifragile’ under the heading ‘teaching birds how to fly’. Taleb uses the example of the jet engine, which was the result of tinkering and was only subsequently explained scientifically.)

    Of course you object ‘but do these approaches (engagement, talent etc.) REALLY work?’. Yes, they do – so far as the business is concerned. If it is experimental conditions you are looking for, that will be to satisfy your own curiosity. A business is rarely a controlled experiment; with the result that ‘findings’ don’t translate readily from one to the other.

    If you want a sound-byte, maybe it is time to consider ‘practice-based evidence’?

  • Andrew Abraham says:

    I like what you’re saying and largely agree. However, even EBP is an academically coined phrase. Of course people operate on evidence (as you actually address); our beliefs and values are formed through evidence. The issue is that it is not the sort of evidence that academics typically like. I often despair of articles where the author gets to the end of a ‘good science’ piece and chucks in some lame one-liner that says something like ‘and *insert name of practitioner that the researcher thinks they’re writing for but actually has bugger all clue about the reality of* should therefore *insert some patronising, massively decontextualized suggestion*’.

    I and my colleagues do seem to get some success working with ‘practitioners’ in trying to add ‘criticalness’ to their judgement and decision making. I believe we get this success for four core reasons:
    1.We try to understand the world they operate in and the challenges they face
    2. We acknowledge they have got access to far more evidence based on their experiences than we can ever point them to in the literature
    3. We offer theories which we think relate to their field, have some evidence to support their connection to that field but then challenge practitioners to see if these theories help them make ‘better’ sense of their evidence (experiences) in order to encourage them to think more clearly and deeply about what they are doing
    4. We offer a schematic that we think helps them see the wood for the trees and therefore bring some coherence to their process and practice.

    Funnily enough, we don’t yet have ‘evidence’ to support or challenge these assertions, but hopefully we will soon.

  • Jonathan Cormack says:

    I enjoyed this article. I like the vote for storytelling. I would add that, in my experience, involving the operations folks deeply in any study is important. Putting them in charge of data collection and analysis and in forming conclusions arising from this. Much easier than “selling” in research findings afterwards.

    An idea combining both storytelling and line involvement is a great method called “most significant change”. As well as promoting involvement it also lends itself to a strengths based approach as opposed to the common deficit orientated research models.

  • Marie-France says:

    Yup agree with everything! in fact, to most people, an anecdote is ‘evidence-based’ since it actually happened and to someone I know or known to the person telling me the anecdote to boot….. It doesn’t matter that it only happened once in a blue moon! if it worked like that for them it’ll work like that for me and I’m willing to give it a try! it’s what most people use when arguing a point in any social setting – anecdotes!
    In fact just turn Fig. 10 over and there’s your pyramid!
    and don’t get me started on correlation vs. cause and effect!
