RCT evaluates the Maximising the Impact of Teaching Assistants programme
In the first randomised controlled trial (RCT) studying the efficacy of a whole-school intervention aimed at improving how schools deploy teaching assistants (TAs) in everyday classrooms, researchers found that the Maximising the Impact of Teaching Assistants (MITA) programme had an effect on TA deployment, preparation, and interactions, and on pupil engagement. However, this did not translate into an impact on pupil attainment.
What is the issue?
Maximising the Impact of Teaching Assistants (MITA) is a programme that aims to develop the ways in which teaching assistants (TAs) can improve pupils’ outcomes by working in whole-class settings. It provides training for all staff in the school, delivered at different stages, on the role and potential impact of TAs.
Some prior research found that lower-achieving pupils receiving the most support from TAs made less progress than similar pupils receiving less TA support. After controlling for pupil characteristics, it was the decisions made by school leaders and teachers about TAs, and not TAs’ own actions, that best explained this pattern. These findings suggest that deploying, training and preparing TAs more effectively could reverse the effect.
Recent trials funded by organisations such as the Education Endowment Foundation (EEF) have also shown that TAs can have a positive impact on outcomes when deployed to deliver structured interventions. However, no evaluated programmes have yet helped schools use TAs in other ways, such as in whole-class settings.
How did we help?
RAND Europe, in partnership with the University of Cambridge, was commissioned by the EEF to conduct an independent evaluation of MITA. The trial evaluated the impact of the intervention using reading attainment for pupils in Year 3 and Year 6 as the primary outcome. Pupils’ maths attainment and engagement with learning, as well as changes in TA deployment and preparation, were analysed as secondary outcomes.
The researchers analysed data from 124 schools that were allocated either to receive MITA or to act as a control group. A mixed-methods implementation and process evaluation was undertaken to capture compliance with the programme’s main activities, explore barriers and facilitators, and understand practice in control schools. It included observations of the input for school leaders, staff training and consultancy visits in individual schools, and continued into the second year to see whether intervention schools were able to maintain their new practices in the absence of further support.
What did we find?
- There is no evidence that MITA has an impact on reading outcomes for pupils in Years 3 and 6. This result has a high security rating.
- MITA has a moderate positive impact on pupil engagement: pupils in MITA schools were more engaged than pupils in control schools. However, several schools were unable to complete the measure, so the analysis drew on a smaller number of schools, which limits the security of these findings.
- There is evidence that staff in MITA schools changed their behaviour in line with MITA principles, based on a measure of change in practice compared with control schools. Although this evidence is limited by a small sample and by the use of a new measure that has not been tested more widely, the behaviour change is corroborated by evidence from teacher and TA surveys, interviews, and classroom observations.
- During the trial, control schools made substantial efforts to improve TA deployment in line with many of MITA’s key recommendations. However, an analysis of teacher and TA survey responses between the start and end of the trial suggests that these efforts did not translate into changes in behaviour.
- Interviews in case study schools indicate that senior leadership and staff buy-in are fundamental to effective implementation of MITA. Staff turnover at the senior leadership and classroom levels is a potential barrier to embedding MITA principles in the longer term.