The research-practice paradox


If research already shows what works well, why isn’t it put into classroom practice?

 

A little while ago, I was asked to take part in a Guardian online panel on evidence-based education. Here are some of the questions asked:

 

1. How do we work out what works best, and why it works best?

 

2. What are the best routes to getting research into practice?
How can we help teachers learn lessons from research that are useful for their teaching practice?


 

3. What role should school leaders play in applying and developing education research for evidence-based teaching practice?

 

The panel consisted of Emily Yeomans, grants manager at the £125 million Education Endowment Foundation; Ben Durbin, head of impact at the National Foundation for Educational Research (NFER); Mike Bell, secretary of the 4,500-member Evidence Based Teachers Network (EBTN); and others, including contributions from Geoff Petty, author of Evidence Based Teaching. Thanks to Holly Welham for organising it. Here are some of the comments:

 

1. What works best and why

 

“Nearly everything suggested in education can claim to be ‘evidence-based’ because, in research, nearly everything works!
The trick is to ask the next question: ‘how well does it work relative to everything else?’
When we ask this, we find that the debate is dominated by methods which work but have only a tiny effect, while much more effective approaches, like ‘linking to prior knowledge’, are little used.”

 

“It’s not just what works, but what works best; not just what works best, but why.”

 

“The greatest contribution science can make is in explaining why some teaching methods work much better than others.”


 

2. Research into Practice

 

“The teaching profession needs an evidence-informed, scientific approach to what really works in the classroom. We need to become a more critical profession, challenging the status quo.”

 

“It’s great that funding is now available for new research, but we need to be careful not to feel we need to ‘sit back and wait’. While there is lots to learn, there is also lots which we already know. It would probably take a whole teaching career to become proficient in the methods we already know work well.”


“Almost no teacher has the time to read research papers – and they should avoid doing so (unless they have piles of time) – because they can only read a few and cannot get the general picture from all the evidence.
That’s why meta-studies are so useful for teachers.”
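As an aside on why meta-studies are such a useful compression of the evidence, here is a minimal sketch of fixed-effect inverse-variance pooling, the standard calculation behind most meta-analyses. The effect sizes and variances below are hypothetical illustrative numbers, not real study data:

```python
# Fixed-effect meta-analysis: weight each study's effect size by the
# inverse of its variance, so more precise studies count for more.
# Numbers are hypothetical, for illustration only.

def pooled_effect(effects, variances):
    weights = [1 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Three hypothetical trials of the same intervention:
effects = [0.6, 0.3, 0.45]      # standardised mean differences (Cohen's d)
variances = [0.04, 0.02, 0.08]  # squared standard errors

print(round(pooled_effect(effects, variances), 3))  # → 0.407
```

A teacher reading the three trials separately would see effects from 0.3 to 0.6; the meta-analytic summary gives one defensible figure, which is exactly the "general picture" the comment describes.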

 

“Trialling things on a large scale and communicating the results means that not all teachers have to invent things for themselves. Practitioners can use the toolkit as a starting point for decision making so they don’t need to trawl through the original research.”

 

“Ask teachers what the problems are;
 ask them what they want to aspire to;
give them the research evidence in that direction; 
let them choose what research to use to adapt; 
let them experiment with this whilst talking with colleagues about how their experiment is going.”

 

3. School Leaders

 

“The first thing leaders should do is to introduce their staff to the existing knowledge – what we already know works well. Then teachers need to try out the proven methods in their own teaching. The research shows that teachers will need to practice with a method for at least 6 months and share ideas.”

 

“A theme seems to be that we already know what works well in education.
From a teacher’s viewpoint, this is the research-practice paradox: though we might already know what works well from the research, in practice school leaders aren’t even very good at avoiding what doesn’t work.
For instance, research suggests graded observations are unreliable and inconsistent; yet many schools persist in using them for performance management, and are now locking grading into PRP (performance-related pay) policies.”

 

“Teachers – would be great to hear any suggestions for how organisations like NFER can better support you in finding relevant research and putting it into practice.”

 

“The scary thing is that there seem to be a dozen organisations who claim to be making evidence available to teachers, but which few teachers have even heard of. Several make the mistake of linking teachers with research papers, rather than with the big picture: the 10-20 things which work best, the 10 most common myths of low-effect methods or policies. The most helpful thing which NFER could do would be to join with all the others and become teacher-centred: a National Institute for Clinical Excellence for education. The fact that this hasn’t happened is a concern.
If the funds which maintain these organisations were directed to communicate the big-evidence-picture to classroom teachers, the job would be done in a couple of years.
EBTN would then be redundant.”


 

“We need the NFER to support, fund and publish practical summaries from emerging teacher-led initiatives like ResearchEd and the Touchpaper Problems”.

 

The way forward

 

What’s the way out of the research-practice paradox? One of the most promising avenues is the independent convergence of a century of scientific advances in cognitive psychology with decades of statistical meta-analysis – to clarify once and for all what works best in teaching.

 

Next weekend at the Midlands ResearchEd, Daisy Christodoulou, Katie Ashford, Kris Boulton and I are speaking about applying the insights from cognitive science to classroom practice.

 


It’s time for teachers to set the research agenda.


About Joe Kirby

English teacher, education blogger
This entry was posted in System.

15 Responses to The research-practice paradox

  1. The problem is further back though isn’t it – teachers are trained to be technicians, they are not given the proper tools to be researchers or analysers of research. And I would suggest that what teachers are trained to do is mostly not based on sound research evidence, it’s whatever suits the political agenda at the time. Educational research tends to be quite disjointed too, it is not cumulative in the way that medical research is, and it is not conducted by the practitioners of it in the same way as medicine is. I feel that to look at teachers as being the ones to ensure that classroom practice is evidence-based is to look in the wrong place; it’s a sticking plaster approach.

    Sorry if I haven’t articulated that very well, I haven’t had my first cup of coffee yet and I’m rushing because my kids are calling me…

    • S Horsch says:

I agree. Analysis of student product is a weakness of educator training. It is a painful irony, to me at least, that as educators we evaluate work constantly, yet we struggle to determine whether a pile of evidence, ideally student work, has demonstrated a promising new pedagogical idea.

My best guess as to why this might be is that, while evaluating work, we are too often alone in making decisions we were never actually trained to make, and too embarrassed (?) to admit that we may have made mistakes in student preparation and planning. The curriculum then forces us to move on and not address these errors. I am wondering if curriculum, except in its most vague form, is nonsense.

      Exacerbating this issue is that too often new initiatives are linked with the promotion of leaders and would be leaders. This is a grave problem when no evidence of effectiveness is measured. It is part of the “culture of nonsense” in education.

      But then again-what do I know?

  2. Reblogged this on cmachadolearns and commented:
    I have mixed feelings about this one. Should we be providing teachers with proven best practice research or should we be teaching them to be curious and think like scholars?

  3. 3rsplus says:

    The widely held conceit of the educational research community is that the reports of their inquiry (wishfully termed “evidence”) can directly affect classroom practice–for the better. (What the “better” is, isn’t specified, which is part of the problem).

    The belief has been promoted to teachers and citizenry as “evidence-based decision making”, despite the replicated evidence that the belief “doesn’t work.” The research community views the failure as an absence of “implementation fidelity” on the part of “teachers” and anyone else involved–anyone but the researchers, who have gone on researching other matters that will yield publishable papers.

Teachers, in turn, attribute any instructional failures to the students, their parents, or “society” – to anything but instruction.

There was a time, between the early 1960s and the late 1990s, when educational research (largely psychological) and classroom practice in England and the US were much more directly coupled, but that’s a whole other story; that was then and this is now.

So what should we do? Well, first look at “evidence.” While I wholeheartedly endorse “cognitive science,” the Hattie-Yates book, “Visible Learning and the Science of How We Learn,” the ResearchEd initiative, and the upcoming Midlands seminar, the evidence indicates that it’s unrealistic to expect to “clarify once and for all what works best in teaching.” Science and technology inherently preclude that possibility. Neither is it reasonable to expect “teachers to set the research agenda”, partly for the reasons Vanessa-Jane Chapman gives in her comment, and for others besides. It hasn’t happened in the past, and it’s not in the cards for the future.

The best bet in my view is to pursue Natural Educational Experiments, and I’ve described one that provides a prototype of the methodology involved:
http://ssrn.com/abstract=2356004
In this methodology the “research” is the practice, so the “paradox” is resolved. Teachers are inherently principal players in the research, but it’s instruction, not teachers, that is “accountable.” “What works” is transparent and replicable without relying on fancy statistics which only a few researchers understand.

There may well be other operational avenues to pursue; I haven’t heard of them, but others will certainly open up in the future.

  4. liam.greenslade@gmail.com says:

    Hi darling

    Thought you might find this interesting

    Lx

    Sent from Surface

  5. Gary Jones says:

    Hi Joe

    This is a super post, which I have really enjoyed reading.

In my current role as Interim Head of an FE college, we have undertaken a large-scale action research project to consider ways of improving vocational pedagogy. Whilst the design of the project is evidence-based, particularly in the way it draws upon ‘cutting-edge’ theory and the support for colleagues taking part, I’m not sure we are going to get evidence-based outcomes from the 80 or so action research reports currently being written by colleagues.

    PS
We gave up on (in fact, never started) graded lesson observations, as we never thought they were reliable or valid, and have instead used a coaching model which focuses on teacher development

  6. Jackdaw LTC says:

    Hi Joe,
    Thanks very much for this stimulating post.
    I agree that practitioners lack time to study research findings. What’s the best way forward? Should they engage in research themselves perhaps, or read meta-surveys? I would say there’s room for teacher educators to make a difference here.
    For example, I conduct professional development for secondary teachers in Hong Kong and I see part of my role as an intermediary between research and practice. I endeavor to make research findings easily digestible and help teachers to reflect on application of research findings bearing in mind their learning environments.
I still see the “technical skills” of teaching that Vanessa-Jane mentions as vital, though. Without those skills, I fear that implementation of strategies that are reported to have a high impact on learning, e.g. providing feedback to learners, will be undermined. This is a phenomenon I have witnessed in higher education, where professors of all disciplines are encouraged to be “scholarly” about their teaching but lack basic classroom management skills.
    Thanks again for a great post!
    Peter

  7. Pingback: A guide to this blog | Pragmatic Education

  8. Pingback: Lesson Study so far….. | Meols Cop High School

  9. suzrn says:

    Hi Joe, thank you for a very interesting and informing post. I would like to offer another perspective as a Registered Nurse who is trying to instill evidence-based research protocols in my large, urban, pediatric hospital in California.

When you said, “The first thing leaders should do is to introduce their staff to the existing knowledge – what we already know works well. Then teachers need to try out the proven methods in their own teaching. The research shows that teachers will need to practice with a method for at least 6 months and share ideas,” I couldn’t agree more. My teaching environment in my hospital is a living testament to this comment, as we are in the midst of teaching a new evidence-based practice model to our nursing staff. We have selected the Johns Hopkins Nursing Evidence-Based Practice Model, which is a problem-solving approach to clinical decision making, accompanied by user-friendly tools to guide individual and group use. We are currently piloting the project with 50 new graduate nurses in our facility. It has been met with some mixed reactions, and the naysayers who are always opposed to change are resisting our efforts. However, with persistence and continued use of this model over time, it will be the way our hospital “does business” and will become the standard for using evidence-based research in our nursing practice. We have developed a core group of nurses (about 50) who have taken on this challenge to be the mentors and developers throughout our staff. You talked about a tool kit in your post, and we have found this enormously helpful in our education roll-out of this model.

Thanks for an informative blog; it is amazing how much passion you can ignite in someone when you give them the tools and the encouragement. Some of our nurses have wonderful ideas about how they want to change practice, and now they feel empowered to do it.

  10. Pingback: ‘Teachers hit out at teacher training’… or do they? | Pragmatic Education


  12. Pingback: The Signal & The Noise: The Blogosphere in 2014 | Pragmatic Education

  13. David Macfarlane says:

During my ITT I have found that time is spent looking at the theory, with elements of good practice, but then we are sent out on placement with no real idea of the full picture or of practice, before having to plan sequences of lessons and teach them to a class of primary school children. Surely there is a need for ITT to ensure that students going out on placement have the tools required to fulfil what is asked of them on those placements, because at present this is not proving the case in my experience.
