Engagement: Just because they’re busy doesn’t mean they’re learning anything.

I’ve long thought that one of the weakest proxy indicators of effective learning is engagement, and yet it’s a term persistently used by school leaders (and some researchers) as one of the most important measures of quality. In fact, many of the things we’ve traditionally associated with effective teachers may not be indicative of students actually learning anything at all.

At the #ascl2015 conference last Friday, the always engaging Professor Rob Coe gave a talk entitled ‘From Evidence to Great Teaching’ and reiterated this claim. Take the following slide – how many ‘outstanding’ lessons have been awarded that grade on the basis of this checklist?


Slide: Prof. Rob Coe, ‘From Evidence to Great Teaching’, ASCL, 20 Mar 2015

Now these all seem like key elements of a successful classroom, so what’s the problem? And, more specifically, why is engagement such a poor proxy indicator – surely the busier they are, the more they are learning?

This paradox is explored by Graham Nuthall in his book ‘The Hidden Lives of Learners’ (2007), in which he writes:

“Our research shows that students can be busiest and most involved with material they already know. In most of the classrooms we have studied, each student already knows about 40-50% of what the teacher is teaching.” (p. 24)

Nuthall’s work shows that students are far more likely to get stuck into tasks they’re comfortable with and already know how to do, as opposed to the more uncomfortable enterprise of grappling with uncertainty and indeterminate tasks. A good example of this, as Alex Quigley has pointed out, is that engagement in the form of the seemingly visible activity of highlighting is often “little more than colouring in.” Furthermore, teachers are more than happy to sanction that kind of activity in the name of fulfilling that all-important ‘engagement’ proxy indicator so prevalent in lesson observation forms.

The other difficulty is the now constant exhortation for students to be ‘motivated’ (often at the expense of subject knowledge and depth), but motivation in itself is not enough. Nuthall writes that:

“Learning requires motivation, but motivation does not necessarily lead to learning.” (p. 35)

Motivation and engagement are vital elements in learning, but it seems to be what they are used in conjunction with that determines impact. It is right to motivate students – but motivated to do what? If they are being motivated to do the types of tasks they already know how to do, or to focus on the mere performing of superficial tasks at the expense of the assimilation of complex knowledge, then the whole enterprise may be a waste of time.

Learning is in many cases invisible, as David Didau has outlined many times, and it is certainly not linear but rather more nebulous in actuality. As Prof. Coe reminds us, ‘learning happens when people have to think hard’ – but unfortunately there is no easy way of measuring this, so what does he suggest is effective in terms of evidencing quality?

Ultimately he argues that it comes down to a more nuanced set of practitioner/student skills, habits and conditions that are very difficult to observe, never mind measure: things like “selecting, integrating, orchestrating, adapting, monitoring, responding,” which are contingent on “context, history, personalities, relationships” and which all work together to create impact and initiate effective learning. So while engagement and motivation are important elements in learning, they should be seen as part of a far more complex conglomerate of factors that traditional lesson observations have little hope of finding in a 20-minute drive-by.

This is where a more robust climate of research and reflective practice can inform judgements. It’s true that more time for teachers to be critically reflective will improve judgements, but we also need to be more explicit about precisely what it is we are looking for, and accept that often the most apparent classroom element may also be the most misleading.

Slides: Prof. Rob Coe, ‘From Evidence to Great Teaching’, ASCL, 20 Mar 2015

Nuthall, G. (2007). The Hidden Lives of Learners. Wellington: New Zealand Council for Educational Research Press.

Education Research: Cognitive Psychology Can’t Be The Only Game in Town

As head of research, I have spent a huge amount of time this past year reading papers and increasingly coming up against terms like “inter-rater reliability” or “box-and-whisker plot.” (The latter sounds like some sort of racy cat-based detective novel.) The majority of papers I seem to be reading are from the field of cognitive psychology, and whilst they provide fascinating insights into the workings of the brain and have deeply enhanced my understanding of how we actually interpret sensory data, I feel we are losing sight of something.


 “8 out of 10 education researchers prefer whiskers.”

Despite doing a course in statistical methods in the first year of my PhD in education, I’m often left cold. Whilst I can work with the abstracts and conclusions, I find I often struggle with the very methodological terms used to justify the claims made. Someone recently told me that to work in education research you should ideally have a degree in cognitive psychology and statistical methods. My response was that unless he had read Homer, Plato, Socrates, Shakespeare or Locke then he shouldn’t be allowed near a classroom. (Unreasonable, I know, but it sounded good at the time.)

A key element of education research is about representation. You are attempting to represent a process (and an unfathomably complex one at that) and then test particular approaches or observe specific phenomena. Using solely an empirical method to represent and describe this deeply complex relational phenomenon can seem akin to “measuring a transistor to make sense of a joke in a YouTube video.” (Eagleman)

In the education research arena, I find myself more and more listening to people who are not so much talking about this complex process but rather lecturing us on how they have simply measured a transistor. I’m always reminded of Chris Morris tricking Noel Edmonds on Brass Eye into telling us that the “made-up” drug “cake” affects a part of the brain known as “Shatner’s Bassoon.”

It is great that there have been so many advances in our understanding of how the brain works and its relationship to student learning, but there sometimes seems to be an absence of discourse about to what end this information is useful, or how exactly it empowers children. There are some fantastic practitioners in cognitive psychology, such as Nick Rose and Mark Healy, who take findings from the field and then apply them in insightful and erudite ways informed by other disciplines, but it feels that in the case of many others we are becoming literally “brainwashed.”

Why is this field so dominant now? Is it commensurate with a school culture that now seems to audit itself solely in terms of quantitative ‘measurable’ data? There is much more to be said on this, and I just wanted to put down some brief thoughts here, but to my mind there are four areas in education research – philosophy, anthropology, sociology and, of course, psychology – and at the moment I worry that we seem to have placed all our bets on the latter.

The scourge of motivational posters and the problem with pop psychology in the classroom

Fifteen years ago I watched David Brent give this masterclass in motivation. This was before I started teaching, and when I entered the profession I was horrified to learn that this kind of thing appeared to be embedded in so much of education, from the Monday morning assembly to the top-down CPD session. I remember attending a leadership training day that featured one exercise that was, almost word for word, a carbon copy of the hotel role-play scene where Brent ‘fazes’ the trainer.

Nowhere is this pseudo-profundity more alive today than in social media, and the weapon of choice for this kind of thing is the motivational poster. More than ever, we seem to be drowning under a tidal wave of guff exhorting both pupil and teacher to ‘reach for the stars’ and ‘be all that you can be.’ While seemingly benign and well intentioned, these missives in mediocrity signal a larger shift towards the trivial and sit alongside a set of approaches that may well be doing more harm than good.

Carol Dweck’s work on Growth Mindsets is often mentioned in relation to interventions aimed at shifting student self-perception, but as with a lot of promising areas, the transition from research to practice is often a dysfunctional one. The hallways of many schools are now festooned with the obligatory mindset motivational posters and “failure walls” (I’ve always wondered about these; they’re like a 12-step recovery programme with 11 steps missing), with whole-school assemblies exhorting kids to embrace failure and choose a more positive mindset, often reductively misrepresented as “you can achieve anything if you believe.”


This type of thing is obviously well intentioned, but beyond symbolising a culture that privileges the media soundbite over critical reflection, it does, I think, signify an increasing shift towards psychological interventions aimed at changing student self-perception, and represents a somewhat base and quite reductive approach to an extremely complex set of issues. Done well, certain interventions can be highly effective, as in the case of coaching or the aforementioned promising field of Growth Mindsets. Done poorly, however, they can be not only confusing for students but can take up valuable time and resources that might actually improve student self-perception in a far more powerful way. On a more serious level, Nick Rose has written about the worrying rise of soft psychotherapy in schools and warns that these interventions may be a poor substitute for woefully inadequate mental health provision for children.

There are two central issues with these generalised attempts at manipulating students’ perception of themselves. Firstly, student self-concept is both multi-dimensional and hierarchical (Marsh et al., 1983; Muijs, 1997). A student might have a very positive concept of self in English but a very negative one in Maths. Secondly, student self-concept is both academic and non-academic and can be broadly categorised into seven subareas, such as physical ability/appearance and peer relations as well as academic ability (Shavelson, 1986). So trying to manipulate these domain-specific issues through ‘all-purpose’ positive interventions aimed at boosting general self-esteem is likely to be ineffective.

The other major issue here is that we may have got things back to front. Research shows that while there is a strong correlation between self-perception and achievement, the effect of achievement on self-perception is stronger than the other way round (Guay, Marsh and Boivin, 2003). It may well be the case that using time and resources to improve student academic achievement is a better agent of psychological change than psychological interventions themselves. Daniel Muijs and David Reynolds (2011) note that:

“At the end of the day, the research reviewed shows that the effect of achievement on self-concept is stronger than the effect of self-concept on achievement.”

So there is a strong case that students who are taught well (surprise, surprise) and given clear and achievable paths to academic success will develop a more positive perception of themselves than those given unproven interventions such as the kind of pop psychology churned out in so much of school life. A key question, then, is why so much time and energy is invested in it.

One of the best initiatives I’ve seen is from a school in New York which uses blocked periods in the school week, called ‘Lab Time’, when both teachers and pupils are free and the onus is on the students to book appointments with particular teachers to go over work they have missed, didn’t understand or just need to improve on. This gives pupils a real sense of agency, responsibility and choice, and a series of opportunities to address their own problems. How much time do we waste on assemblies, tutorials and numerous interventions that are costly, time-intensive and ultimately ineffective? Would an approach like this not only give pupils more chance of improving academic achievement but also, concomitantly, their own self-perception?


Motivational posters are a “daily boost of inspiration” for some and vomit-inducing for the rest of us, but they also encourage us to take complex ideas and reduce them to something utterly trivial, seemingly life-changing and often far removed from their original premise. These are complex ideas that should be given the time and space for critical reflection; we should resist the urge to summarise them into a soundbite. Education research in particular shouldn’t be represented as some kind of ersatz profundity summarised in a single sentence; it should embrace Keats’s notion of negative capability and seek a richer, more complex and ultimately elegant elucidation of these difficult ideas that we hope will improve student experience.

As my old English literature tutor Prof. Chris Baldick once quipped in a lecture “Men are from Mars, women are from Venus and pop psychology is from Uranus.”

Why has practitioner research had such little impact in schools?

One thing that has always baffled me is how school leaders have marginalised staff involved in research or practitioner inquiry. If a teacher wants to do an MA or PhD in an area related to their own professional development, they are often given little or no financial or time support. Certainly research has not been a central part of the mission of being a classroom teacher; it has in effect been seen as an expensive hobby.

Many teachers I’ve spoken to have essentially felt like rogue agents, “pale students of unhallowed arts” wielding dangerous knowledge. Their work is not aligned with a whole school focus and very little of it is even linked with their own professional development.


How many senior leadership meetings feature the phrase “What does the research say?”, or even take the position that it might be something useful? In my experience, many younger staff who want to do research are not sure exactly what it is they want to research but just want to improve and be more reflective about their practice. Why aren’t school leaders harnessing that kind of enthusiasm towards whole-school improvement?


Whole school research focus.

In order to maximise the impact of school-led research, we need to move towards a model where there is a common focus of inquiry that has many stakeholders. One way of doing that is to:

- Establish an issue to be solved that is aligned with the whole-school vision.
- Work with an HEI to survey the literature around this area and help design methodologically robust approaches.
- Embed opportunities for practitioner research across all departments/faculties, not just a self-selected few.
- Involve the student body in this focus using student journal clubs.
- Establish a research centre to act as a conduit.
- Build in time for staff to conduct research and disseminate findings.

If we are going to maximise the impact of research in schools, then it needs to be more than a clandestine bunch of mavericks practising some kind of weird alchemy that no-one even understands (especially themselves).

Opportunities need to be given for practitioner-led research that is aligned with a clearly defined whole-school vision of improvement, one that is well communicated and in which all staff feel they can have an impact.

Podcast no. 6 – Glenn Whitman “Can you change the culture of a school through research?”

In October I went to visit Glenn Whitman at St. Andrew’s School near Washington. He is the Dean of Studies there, but is also a force of nature who runs an in-house research centre that has completely transformed the culture of the school. I asked him how he did it.


Glenn Whitman



For more on Glenn see him speak here at the Center for American Progress.