The Literacy Octopus: Communicating and Engaging with Research

The two ‘Literacy Octopus’ trials - named after their multi-armed design - drew on a wide range of evidence-based resources and events designed to support the teaching and learning of literacy at Key Stage 2. They were delivered by four partners with extensive expertise in education research, and included printed and online research summaries, evidence-based practice guides, webinars, face-to-face CPD events, and access to online tools. The trials were funded jointly with the Department for Education and the Mayor’s London Schools Excellence Fund.

Subject: Literacy
Key stage: Key Stage 2

EEF Summary

Translating evidence-based resources into real change in the classroom is difficult. Simply sending resources to schools is often thought to have no effect, even when those resources are high quality and evidence-based. However, there have been specific examples where putting hard copies of practical guides in the hands of professionals appeared to make a difference to practice, at minimal cost. The first trial therefore tested whether sending schools high-quality, evidence-based resources in a range of formats could have an impact on pupil outcomes.

The second trial tested whether combining the provision of resources with light-touch support on how to use them would have greater impact. Some schools were simply sent evidence-based resources, while others received the resources along with simple additional support, such as invitations to seminars on applying the resources in the classroom. In addition to pupil outcomes, this trial measured teachers' use of research, to help us understand the impact on teacher behaviour.

Neither trial found evidence of improved literacy attainment at Key Stage 2, and the second trial found no increase in teachers' use of research. These high-quality trials tested a wide range of interventions in a large number of schools. The results suggest that, in general, light-touch interventions and resources alone are unlikely to make a difference.

This has important implications for organisations which are using evidence to improve teaching and learning. At EEF we have already started to take a more intensive approach to scale-up, with more structured support for schools and a greater focus on implementation. Our campaigns and Research Schools network are good examples of this. Importantly, we will continue to evaluate the impact of this activity.

The interventions

The first trial tested the following interventions. The Research Results section below shows the results for each.

  • 1A ‘Improving Reading’ guide - Evidence-based printed booklet Improving Reading: A Guide for Teachers. From CEM, part of Durham University.
  • 2A Evidence updates & website - Regular printed and electronic materials, including the Better Evidence-based Education magazine and the ‘Best Evidence in Brief’ email, and access to a searchable database, Evidence 4 Impact. From IEE at the University of York.
  • 3A Webinar - A link to the archived webinar and materials from a conference on research evidence relating to KS2 literacy. From ResearchEd, in collaboration with NatCen.
  • 4A Teaching How2s website - Access to the Teaching How2s website, including evidence-based visual guides on CPD. From Campaign for Learning/Train Visual.

The second trial tested the following interventions. The Research Results section below shows the results for each.

  • 1B Evidence updates & website - Regular printed and electronic materials, including the Better Evidence-based Education magazine and the ‘Best Evidence in Brief’ email, and access to a searchable database, Evidence 4 Impact. From IEE at the University of York.
  • 2B Evidence updates & website + event - Intervention 1B plus invitation to an event on evidence-based literacy programmes.
  • 3B Teaching How2s website - Access to the Teaching How2s website, including evidence-based visual guides on CPD. From Campaign for Learning/Train Visual.
  • 4B Teaching How2s website + intro event - Intervention 3B plus invitation to a one-day introduction to using the Teaching How2s website, and updates on using the visual guides.
  • 5B ‘Improving Reading’ guide & emails - Hard copy evidence-based KS2 literacy teaching materials, including evidence-based printed booklet Improving Reading: A Guide for Teachers and monthly classroom activity posters. From CEM, part of Durham University.
  • 6B ‘Improving Reading’ & emails + CPD session - Intervention 5B plus invitation to one twilight CPD session.
  • 7B ‘Improving Reading’ & emails + CPD sessions & tools - Intervention 6B plus invitation to one further twilight CPD session, use of pupil diagnostic tools, and teacher peer observation between sessions.
  • 8B Conference - Invitation to free Saturday conference on research evidence relating to Key Stage 2 literacy. From ResearchEd, in collaboration with NatCen.
  • 9B Conference + webinars - Intervention 8B plus invitation to two webinars to provide support on applying the research from the conference in schools.

Research Results

Results are reported as additional months' progress, first for all pupils and then, in brackets, for pupils who had ever been eligible for Free School Meals (FSM). Evidence strength ratings apply to the headline analyses only (see the Evaluation Conclusions below); they are not applicable (N/A) to the FSM subgroup results.

First trial

  • 1A ‘Improving Reading’ guide - 0 months' progress (FSM: 0)
  • 2A Evidence updates & website - 0 months' progress (FSM: 0)
  • 3A Webinar - 0 months' progress (FSM: 0)
  • 4A Teaching How2s website - 0 months' progress (FSM: 0)

Second trial

  • 1B Evidence updates & website - 0 months' progress (FSM: 0)
  • 2B Evidence updates & website + event - 0 months' progress (FSM: -1)
  • 3B Teaching How2s website - 0 months' progress (FSM: -1)
  • 4B Teaching How2s website + intro event - 0 months' progress (FSM: 0)
  • 5B ‘Improving Reading’ guide & emails - 0 months' progress (FSM: 0)
  • 6B ‘Improving Reading’ & emails + CPD session - 0 months' progress (FSM: 0)
  • 7B ‘Improving Reading’ & emails + CPD sessions & tools - 0 months' progress (FSM: +1)
  • 8B Conference - 0 months' progress (FSM: 0)
  • 9B Conference + webinars - 0 months' progress (FSM: 0)

Were the schools in the trial similar to my school?

There were 823 schools in the first trial, and 12,500 in the second trial. 

Because the samples were very large, they were broadly representative of schools across England. For example, the proportion of children who had ever been eligible for Free School Meals was 29% in both trials, in line with the national average for primary schools.

Could I implement this in my school?

Several of the interventions from these trials are available from the delivery partners:

Delivered by: SLT, Teachers, TAs
Participant group: Whole School
Intervention length: 5 Terms

How much will it cost?

All interventions were very low cost - an average of around £3 per pupil per year over three years. The cost of teacher time was not measured, but finding time was a challenge for some of the schools involved, both for initial engagement and for implementing evidence-based strategies.

Cost per pupil: £3
No. of Teachers/TAs: Variable
Training time per staff member: Variable

Evaluation info

Schools: 13,000
Pupils: -
Key Stage: Key Stage 2
Start date: June 2014
End date: December 2017
Type of trial: Effectiveness Trial

Evaluation Conclusions

  1. The first trial found no evidence that Literacy Octopus passive dissemination interventions improved pupils’ Key Stage 2 English scores compared with the control group. The five padlock security rating means we have very high confidence in this result. 

  2. These findings suggest that simply disseminating research summaries and evidence-based resources to schools is not an effective way for research organisations to support schools to improve pupil outcomes. 

  3. It is likely that these materials formed a small part of the information received by schools during this time. It is possible that if schools had support to navigate and prioritise that information, greater impact could be achieved. Alternatively, schools may need more support in transforming such materials into actual change in the classroom. 

  4. The evaluation team have analysed the Key Stage 2 results from a further year following the project to explore if there are longer-term effects of the interventions. These results are published in addenda to the main reports. 

  5. The second trial found no evidence that any of the interventions improved pupils’ Key Stage 2 English scores. The five padlock security rating means we have high confidence in this result. 

  6. There was no evidence of impact on any of the six teacher Research Use Measures used in this trial. However, we have limited confidence in this result, given the low response rate to the questionnaires designed to capture these outcomes and the fact that some measures were only moderately reliable.

  7. Schools’ level of engagement varied: six out of ten schools did not engage to the level expected by the providers, although a small proportion engaged to a greater extent than expected (for example by hosting CPD sessions). Reasons for not engaging included lack of time, the timing and location of events, and a preference for face-to-face support rather than online or remote formats only.

  8. Teachers felt research evidence was most effectively communicated when it was interactive, accessible, relevant, included a balanced and credible discussion of the evidence, and focused on how to apply the evidence in practice. Where schools went on to implement changes in light of the interventions, these came about through mechanisms such as in-school collaboration, further enquiry, and trying out, reviewing, adapting, and embedding approaches. 

  9. The lack of impact across the different interventions suggests that simply communicating research evidence to schools is not enough to improve outcomes. How easily the presented evidence can be used in practice, and the conditions in schools for implementing evidence-based change, might be just as important. Further research should assess whether interventions can transform evidence into practical action and develop supportive implementation conditions in schools.


