Reflections on Research ED 2013

Yesterday I spent the day at Dulwich College for researchED 2013. Although the various sessions covered a huge amount of ground, a recurrent theme was the need to build stronger links between those carrying out research in education and the teaching community. By simply bringing these two groups together for the day, the conference itself was an admirable first step in that process.

The desirability of this process is unquestionable. Our education system and the students passing through it would be better served if the boundary between the research establishment and the teaching profession were more porous. One of the things that I found most useful about the conference was that it highlighted some of the obstacles to making this happen. From the research side, a lot of what makes it to publication is, to be blunt, pretty rubbish. It can often be accessed only by subscription and is just as prone to impenetrable jargon and obtuseness as any other academic discipline. For teachers, a lack of time to engage with the literature is often combined with (understandable) scepticism about claims of the latest evidence-based practice.

Discussion also focussed on previous attempts to bridge the gap. Particular attention was given to the EEF’s Teaching and Learning Toolkit, which is designed to be an accessible summary of education research to help schools allocate resources, particularly the pupil premium. The toolkit is probably the most impressive attempt so far to promote the use of research in schools. It allows schools to compare the effectiveness and cost of a range of different approaches. The graph below gives an overview of how the toolkit rates each approach.

[Graph: the EEF Teaching and Learning Toolkit’s rating of each approach by effectiveness and cost]

By using this graph and the more detailed information provided in the toolkit, schools can very quickly identify some of the more effective ways of allocating resources to improve attainment. This is clearly preferable to making decisions based on no evidence at all. However, it is still a long way from truly aligning research with practice. Aside from the obvious methodological problems of comparing interventions with varying evidence bases on a single scale, it is inevitable that in translating complex research into such a user-friendly format a lot of important detail gets lost along the way. To take the common example of Assessment for Learning (AfL), it would be almost impossible to find a school that does not claim to be implementing AfL. Yet I would be willing to bet that no more than 1 in 10 are getting anywhere near the gains that its position at the top left of the graph suggests they should. Indeed, if you were to add error bars to the graph to account for the range of effects depending on how each approach is implemented (or even how difficult each approach is to implement well), you would end up with significant overlap between them.

None of this is intended to deny the value of things like the toolkit. It does, however, highlight a potential tension in some of the discussions that took place at the conference. If we want what happens in the classroom to be informed by the research, we need to take account of all of the nuances and caveats that it contains. Yet many of the suggested solutions – such as Ben Goldacre’s journal club – involved converting research findings into bite-sized chunks that can easily be implemented by schools and teachers. Inevitably, much of the important detail will be lost in this process, undermining the ultimate objective of research-informed practice.

I was incredibly impressed by the dedication to this objective shown by those who attended. But to make it happen we need to scale up our ambition to match our rhetoric. Settling for basing practice on bullet-point summaries and star ratings is not good enough. We need to aim for much deeper integration between the research community and schools. Twenty years from now, why could we not be in a position where every school has a senior member of staff from an academic background whose sole responsibility is to ensure that classroom practice is not just evidence-informed but evidence-led?
