Educational practice is currently informed by many different research approaches: qualitative, quantitative, short-term, longitudinal, lab-based, school-based, teacher-led and researcher-led, to name a few. All of these approaches should continue to have a place in the evolution of education policy.

Randomised Controlled Trials (RCTs) also have a place among the many tools available to inform education policy. Their strength lies in evaluating potential interventions before widespread roll-out. The hope is that, as in the medical sphere, RCTs will test the effectiveness of specific approaches on specific cohorts and will facilitate the rigorous evolution of educational practice to the benefit of all involved. Using one project as an example, this article takes a closer look at how RCTs can be successfully implemented in schools and how the results can benefit researchers, teachers and learners.

The STAR (Speechreading Training and Reading) Project, funded by the Wellcome Trust and led by Dr Mairead MacSweeney at University College London, is an RCT testing whether lipreading training increases the phonological awareness and reading skills of deaf learners aged 5-7. The lipreading training is delivered through a fun online game that learners use for 10 minutes a day, four days a week, for 12 weeks; a well-matched maths training game serves as the control condition.

RCTs are not a replacement for other types of research, but rather an extension of them. With STAR, both the lipreading and the maths training draw on insights from previous research on the development of these skills. Among other benefits, the STAR study is assimilating findings from lab-based research and translating them into a complete front-line intervention.

With STAR, the delivery mechanism – an online game – is integral to the design and success of the RCT. The adaptive training game is accessed by learners from their own classroom in a way that requires minimal teacher support and minimises disruption to the rest of the class. Importantly, the online element ensures learners complete the training in an ecologically valid setting, while still providing researchers with detailed, real-time performance data.

If the training is shown to support deaf children’s early reading development, there are several options for how researchers and teachers might proceed.

Researchers could adapt the games to answer more specific questions. What are the characteristics (age, language level, attention skills) of the learners who benefit most from the training? Which game mechanics (e.g. type of feedback) maximally improve performance? Does the choice of vocabulary affect performance? Does lipreading training also help dyslexic learners with phoneme discrimination? Answers to these questions can further inform targeting of the intervention, classroom practice and future lab-based research.

The research tool – the STAR game – does not prescribe how teachers help deaf students learn to read. Knowing that lipreading training can help deaf learners empowers teachers, should they wish, to find their own solutions for supporting their students.

On the other hand, online games can provide a cost-effective mechanism for delivering specialist, effective training to specific cohorts of learners in their classroom. Historically, educational software has not been scrutinised to the same extent as teaching or medical interventions to ensure it achieves its desired aims. One area where RCTs and pilot studies can be of significant value in education is in demonstrating the effectiveness of educational software, and other learning aids, before they are procured by schools. Ultimately, policy makers, teachers, parents and learners should be confident that limited budgets are spent only on interventions that have been shown to be effective. The process of bringing a new pharmaceutical product to market may provide a useful framework for bringing interventions from the lab to the general public.

Jo Evershed, CEO, Cauldron.sc
Dr Hannah Pimperton, Research Associate, UCL