Why teachers need to embrace complexity in education
1. Silver bullets, magic wands and tooth fairies:
Why teachers and education researchers
need to embrace complexity (and how)
James Mannion
Director, Rethinking Education
Associate, UCL Institute of Education
PhD candidate, University of Cambridge
james@rethinking-ed.org @rethinking_ed
2.
3. “Learning to Learn. It isn’t even a thing. We’ve
been hoaxed. Again!”
“The hipsters are selling snake oil on this one,
whether they know it or not.”
‘Learning to Learn to Learn to Learn…’
4.
5. Who is the emperor?
1. Ben Goldacre?
2. Michael Gove? Nick Gibb?
3. The Education Endowment Foundation?
4. The Randomised Controlled Trial
(in education?)
5. All of the above?
6. Some things I might talk about
• A journey into complexity
• What is a complex intervention?
• Complex interventions in education (or the lack thereof)
• What is complexity theory and what does it have to do with education?
• Some problems with the EEF toolkit
• Some problems with the EEF RCT programme
• So things are complex? Where do we go from here?
8. What is a complex intervention?
• An intervention with a number of interacting components
• Takes account of the number and difficulty of behaviours of those delivering
and/or receiving the intervention
• A number of groups or organisational levels may be targeted
• Generates a number of different kinds of outcomes
• A degree of flexibility or tailoring of the intervention is permitted, to
respond to contextual nuance
– Craig et al., BMJ (2008)
Riley (1999): LEA effectiveness research
• 5 important factors identified
• Individually, each predicts ~35% of the variation
• Taken together: >50%
• Marginal gains theory
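The arithmetic behind that claim is worth seeing: five correlated predictors can each explain ~35% of an outcome individually, yet together explain well over 50% rather than 5 × 35%, because their contributions overlap. A minimal simulation (illustrative numbers only, not Riley's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# A shared latent factor (think "school quality") makes the five predictors
# correlate, so their individual contributions to the outcome overlap heavily.
latent = rng.normal(size=n)
factors = np.column_stack([latent + rng.normal(size=n) for _ in range(5)])
outcome = latent + rng.normal(scale=0.6, size=n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

individual = [r_squared(factors[:, [i]], outcome) for i in range(5)]
combined = r_squared(factors, outcome)
print([round(r, 2) for r in individual], round(combined, 2))
```

Each factor alone explains roughly a third of the variation; all five together explain well over half, but far less than the naive sum of the individual figures.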
9. What is a complex intervention?
“Complex interventions are widely used in the health service, in public health
practice, and in areas of social policy such as education, transport and
housing that have important health consequences. Conventionally defined as
interventions with several interacting components, they present a number of
special problems for evaluators, in addition to the practical and methodological
difficulties that any successful evaluation must overcome. Many of the extra
problems relate to the difficulty of standardising the design and delivery of the
interventions, their sensitivity to features of the local context, the
organisational and logistical difficulty of applying experimental methods to
service or policy change, and the length and complexity of the causal chains
linking intervention with outcome.”
- UK Medical Research Council (2006)
10. An example of a complex intervention
• Pain is a complex, subjective experience that involves sensory,
emotional and behavioural factors associated with actual or
potential tissue damage.
• Postoperative pain is influenced by many factors:
• Global – e.g. personality, age, gender, cultural background,
pre-existing pain syndromes, genetic make-up, type of surgery
• Specific – e.g. fear, anxiety, depression, anger, coping
• Monotherapy (e.g. morphine) is very limited, with side effects
at high doses
• Multimodal analgesia uses a combination of analgesics and
techniques that act on multiple pain pathways
11.
12. Learning Skills at Sea View: a 5-year impact evaluation
L2L as a complex intervention
• 2010: secondary comprehensive, Y7 'Learning to Learn' curriculum
• Competitive selection process: a team of 5 committed people
• Combined approach:
• Taught explicitly through timetabled lessons (Y7, 8, 9)
• Embedded throughout the school (tutor time, lessons, rewards, CPD)
• Decided against 'buying in' an existing programme / materials
• Instead, reasoning from first principles…
13. Components of L2L at Sea View, with supporting literature

Years | Component of L2L at Sea View | Supporting literature
Y7, 8 | Self-regulation (project-based learning) | Barron & Darling-Hammond (2008); Dignath et al. (2008); Hung (2008)
Y7, 8 | Collaboration (paired and group; familiar and unfamiliar) | Howe (2009, 2010); Slavin (2010); Laughlin, Hatch, Silver & Boh (2006)
Y7, 8, 9 | Oracy (paired talk, P4C, formal debates, public speaking…) | Littleton & Mercer (2013); Gorard et al. (2015); Topping & Trickey (2007a, 2007b)
Y7, 8, 9 | Formative assessment (comment-based feedback) | Black & Wiliam (1998); Fuchs & Fuchs (1986); Hattie (1992); Higgins, Kokotsaki & Coe (2012)
Y7, 8, 9 | Metacognition (language of learning, plenaries, journals) | Chiu (1998); Haller, Child & Walberg (1988); Whitebread & Pino Pasternak (2010)
Y8 | Personal effectiveness (organisational skills) | Harrison, James & Last (2012)
Y9 | Thinking & reasoning skills (critical thinking, debating) | Halpern (1998); Moseley et al. (2005)
Whole-school | Growth mindset (shared language of learning) | Claxton et al. (2011); Dweck (2006); Perkins (1995)
Whole-school | Transfer (managed approach) | Engle (2006); Hipkins & Cowie (2014)
Whole-school | Action research as CPD (collaborative inquiry) | Bell et al. (2010); Crippin et al. (2010); Joyce & Showers (2002); Timperley et al. (2007)
14. The golden thread
1. What is learning?
• Acquisition of knowledge and skills
• A change in Long Term Memory
2. What is learning to learn?
• The process of becoming a more effective learner
• A taught course, and a joined-up, whole-school approach to teaching and learning
3. What is transfer?
• A vital ingredient in ensuring that skills and dispositions do not remain ‘context-bound’
• A combination of transfer out (of L2L) and transfer in (to other subjects)
4. How will this lead to improved outcomes?
• More effective learners should be able to access higher grades in subject assessments
• If subject learning doesn’t improve, L2L obviously isn’t achieving what it set out to do
15. Learning Skills at Sea View: outcomes
After 1 year:
• Gains in CAT scores (pre vs post Year 7)
• Fewer behaviour points (vs. previous cohort)
• Curriculum expanded into Y8
After 3 years:
• Improved attainment
(all subjects combined)
• Closing of the Pupil Premium
gap (from the bottom up)
18. Learning Skills at Sea View
(5 year outcomes)
• Best set of GCSE results in the school’s history
• By far the greatest reduction in Pupil Premium gap of any school in
the city (in a year when the gap increased across the city as a
whole)
Also a range of qualitative measures:
• Student interviews
• Teacher interviews
• Students’ Learning Journal entries
• Student questionnaires (purpose-designed and existing psychometric
measures: gains in personal growth, curiosity & exploration)
• Ofsted reports, HMI feedback
• Observational data
19. Complex interventions in medicine and education
Search: "complex intervention" OR "multimodal intervention"
• PubMed (medical search engine): 1,086 hits
• ERIC (eric.ed.gov, education search engine): 129 hits
• Interestingly, a number of the PubMed hits relate to
medical / health education
• Whereas…
20. [Bar chart: number of papers (0–40) by topic – clinical (children); special educational needs; research methods / evaluation; education of health…; clinical (adult); social work; school attendance; literacy interventions; education technology; sex education; increasing STEM/science…; test anxiety; cooking/gardening…; artist in residence; use of data; mathematics; procrastination; behaviour management]
Of the 129 education papers on complex / multimodal interventions…
• SEND and literacy papers often use 'multimodal' to mean 'multisensory'
• Several use 'multimodal' to mean multiple modalities (individual, group, family, etc.)
• Some are theoretical papers
• The most explicit use of "complex intervention" is Ann Brown's (1992) paper
'Design experiments: theoretical and methodological challenges in creating
complex interventions in classroom settings'
• Complex settings (i.e. classrooms): Brown expresses "the need for new and
complex methodologies to capture the systemic nature of learning, teaching
and assessment" (p. 174)
21. What is complexity, and what does
it have to do with education?
My (layperson) notion of complexity
1. Metaphysical complexity (Searle, 1995)
2. Multifactorial complexity (Rumsfeld, whenever that was)
3. Methodological complexity (Brown, 1992)
22. What is complexity theory, and what
does it have to do with education?
• Related to chaos theory – the study of dynamic, deterministic systems (i.e.
future behaviour is fully determined by initial conditions) in which tiny
differences in initial conditions yield wildly divergent future behaviour
(the butterfly effect)
• Complexity theory – the study of non-deterministic, dynamic networks.
They contain a number of constituent parts that interact with and adapt to
each other over time. A key feature is the concept of emergence (e.g. the
emergence of consciousness from brain jelly)
• Complexity theory gives us a lens through which we can consider how
complex, adaptive systems (CASs) work.
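The butterfly effect mentioned above is easy to demonstrate numerically. A minimal sketch using the logistic map (a standard textbook example of deterministic chaos, not anything from the slides):

```python
# The butterfly effect in one line of arithmetic: the logistic map
# x -> r * x * (1 - x) is fully deterministic, yet at r = 4 two starting
# points that differ by one part in ten billion soon diverge completely.
def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2 + 1e-10)

early_gap = abs(a[5] - b[5])                                # still microscopic
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))  # order-1 divergence
print(early_gap, late_gap)
```

Identical rule, near-identical starting points, completely different outcomes: exactly the property that makes long-range prediction in such systems hopeless.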
23. What is a 'complex adaptive system'?
Cilliers (1998) characterises a CAS as having:
• A large number of elements with many interactions
• Interactions which are non-linear, i.e. small causes can have large
effects and vice versa
• Interactions which lead to feedback loops, both positive and negative –
also things like 'lock-in' (e.g. the QWERTY keyboard)
• An 'open' system, interacting with elements in external
environments beyond the immediate system
• Elements which interact with their environment, making the identification
of boundaries difficult
• A system which is far from equilibrium and therefore needs a constant
energy flow to operate
• The importance of history: past processes play a role in forming the
present, often unpredictably
• Each element acting only on local information, rather than information
from the whole system
24. So what?
• Students, groups of students, classrooms, year groups, schools, academy
chains, the wider system etc – can be seen as complex adaptive systems.
• When it comes to educational change, complexity theory tells us that we
should intervene at every possible level, and from every possible angle. It
points us toward integrated service delivery.
• This means coming at the problem from as many angles as you can – and not
using phrases like 'levers of change' and 'silver bullets'.
• In a complex, dynamic, non-linear system, there’s no such thing as a magic
bullet. There is no single lever of change that will do what you want it to do.
Professor Mark Mason (Hong Kong Institute of Education)
25. What's this got to do with the EEF?
1. The Teaching and Learning Toolkit
2. Hundreds of RCTs
26. Some problems with the Teaching
and Learning Toolkit
1. Learning measured in months – a linear (not to mention
nonsensical) metaphor
2. Based on the notion of a causal, linear input-output model: "do this, and you'll
get X months of progress" ("More akin to pig farming than the complex
endeavour of education" – Biesta)
3. Ranks in a league table, but does not compare like with like. The studies
behind it vary enormously.
4. It presents education as a series of abstract concepts and interventions –
bullets of varying quality – and overlooks the most important things – people,
and the importance of implementation.
5. Promoting such a simple understanding of educational change is likely to do
as much harm as good (e.g. feedback)
27. Top of the league!
Distribution of effect sizes from 607 feedback-
based interventions (Kluger & DeNisi, 1996)
28. Using Randomised Controlled
Trials in education
“Schools and classrooms are complex systems, and within them it is very hard to
distinguish cause from effect. What is required is sound research to determine causal
relationships between teaching strategies and pupil progress. This is where Randomised
Controlled Trials (RCTs) have such an important role to play. They enable us to test out
high-promising ideas in the real world of the classroom, giving us a robust estimate of
the impact of a programme by comparing the outcomes of students who received it with
a control group of students who didn’t.
Six years ago, RCTs were almost unheard of in education in this country. Since 2011, the
EEF has committed funding for rigorous and independent evaluations of over 130
different programmes.
The evidence generated by robust trials means teachers today are in a much better
position to judge what is likely to work and not work in their classroom than they were
even a few years ago.”
Sir Kevan Collins (EEF Chief Exec, 2017; Impact magazine)
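The basic comparison the quote describes can be sketched in a few lines. A hypothetical two-arm trial (all numbers made up: 100 students per arm, a true effect of 0.2 standard deviations), estimating the standardised effect size:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trial: 100 students per arm; the intervention truly shifts
# scores by 0.2 SD (a fairly typical education effect size); everything
# else is ordinary between-student noise.
control = rng.normal(loc=0.0, scale=1.0, size=100)
treatment = rng.normal(loc=0.2, scale=1.0, size=100)

diff = treatment.mean() - control.mean()
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
cohens_d = diff / pooled_sd  # standardised effect size (Cohen's d)
print(round(cohens_d, 2))
```

Note that with 100 students per arm the standard error of the difference is around 0.14 SD – the same order as the effect being estimated, which is worth keeping in mind when reading what follows.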
29. Some problems with using Randomised
controlled trials in education
• RCT vs RDBPCT
• Randomised, Double Blind, Placebo Controlled Trial
• Single blind = the patient doesn’t know whether they’re getting the drug or the placebo
• Double blind = the medics don’t know either (the drugs are coded)
• Placebo-controlled = the control patients receive a sugar pill
These two features of medical trials attempt to overcome the many confounding
“effects” that can arise when you run a trial
• Placebo effect – taking a sugar pill can have a large positive effect
• Hawthorne effect – being studied improves performance
• Pygmalion effect – higher expectations become a self-fulfilling prophecy
• John Henry effect – the opposite of Hawthorne (the control group tries harder)
• Novelty effect – novelty is exciting, for a while
• Experimenter effect – can be positive or negative
AND MANY OTHERS!
30. Some problems with using Randomised
controlled trials in education
• To double blind an education trial, the
teachers would literally have to not
know what they were doing!
• To placebo-control an education trial, you'd have to have a
school doing busy work that you were sure would have zero
impact on their learning. So all we're left with is the RCT
• The 'control' arm in an education trial is not very controlled at all – 25 control schools
would have 25 different control states – each dynamic, complex and unpredictable –
not a homogeneous point of comparison. Also, control schools are often given much
less money (or none) compared with treatment schools.
• Randomisation can also be problematic – e.g. in the EEF Philosophy for Children trial,
by pure bad luck the treatment group had much lower prior attainment, so the
subsequent gains could just as easily be explained by regression to the mean.
• Other methodological concerns about EEF trials: the exclusive focus on maths and
English results, some small effect sizes reported favourably, protocol not followed /
changed after the event…
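Regression to the mean, as invoked above, can be shown with a short simulation (illustrative numbers only, nothing to do with the actual P4C data): select the lowest scorers on a noisy measure, and they "improve" on remeasurement with no intervention at all.

```python
import numpy as np

rng = np.random.default_rng(2)

# Each school's measured attainment = stable true level + year-to-year noise.
true_level = rng.normal(size=10_000)
year1 = true_level + rng.normal(size=10_000)
year2 = true_level + rng.normal(size=10_000)

# Select the bottom 20% on the year-1 measure (low "prior attainment")...
lowest = year1 < np.quantile(year1, 0.2)

# ...and they score higher in year 2 purely by chance: no treatment involved.
apparent_gain = year2[lowest].mean() - year1[lowest].mean()
print(round(apparent_gain, 2))
```

Any group selected for an extreme score on a noisy measure will, on average, score closer to the mean next time – which is why gains in a low-prior-attainment treatment group are ambiguous evidence.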
31. Some problems with using Randomised
controlled trials in education
RCTs in education are based on a set
of faulty assumptions:
• Learning is linear (it isn’t)
• Learning responds to an input-output model (it doesn’t)
• Schools are stable and behave in predictable ways
(they aren’t, and they don’t)
• You can determine causality from a positive RCT result (you can’t)
• You can isolate a single factor within a complex system and
determine its relationship to an outcome (you can’t)
• Implementation across the treatment arms is comparable
(high fidelity is extremely rare)
32. Some problems with using Randomised
controlled trials in education
They tell you nothing about implementation.
All an RCT can tell you is whether an effect would be likely to have
happened anyway, in the absence of intervention X.
To the extent that this is useful, it only tells you whether intervention X
“worked” – there is no guarantee as to whether it will work in your context.
The difference between good and bad implementation of an
intervention (e.g. feedback) is far greater than any effect size determined by
an education RCT.
So, what’s the point of them? Are they worth the money?
What can be said for the use of RCTs in education?
33. Some objections to the critique of using
Randomised Controlled Trials in education
• "But the EEF have found some positive results!"
If you measure a bunch of stuff, it usually follows a
normal (bell-shaped) distribution curve. Measure 100
blades of grass and 5% of them will be freakishly tall.
To date the EEF has completed 76 evaluations of
interventions. Of these, 13 are deemed "promising",
including the P4C trial.
Given the methodological issues with doing RCTs in
education, how much faith can we have in these
findings? Could there be false positives and false
negatives in this data set?
• “At least they’re trying. What do you propose? Should we just throw
our hands in the air and say ‘it’s all so complex’?”
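The blades-of-grass point above can be made concrete with a simulation: run 76 trials (the evaluation count from the slide; arm sizes and the significance test are my own assumptions) of interventions with zero true effect, and count how many clear a conventional 5% significance bar anyway.

```python
import numpy as np

rng = np.random.default_rng(3)

n_trials, n_per_arm = 76, 100
false_positives = 0
for _ in range(n_trials):
    control = rng.normal(size=n_per_arm)
    treatment = rng.normal(size=n_per_arm)  # same distribution: zero true effect
    diff = treatment.mean() - control.mean()
    se = np.sqrt(control.var(ddof=1) / n_per_arm
                 + treatment.var(ddof=1) / n_per_arm)
    if diff / se > 1.645:  # one-sided 5% test: the result looks "promising"
        false_positives += 1

# A handful of spurious "promising" results is expected by luck alone.
print(false_positives)
```

At a 5% threshold the expected number of spurious positives among 76 null trials is about 4 – which is the force of the question: how many of 13 "promising" findings are real?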
34. Where does all this leave us?
We can learn from medicine – but with
regard to complexity, not RCTs
School leaders & teachers
• Train all school leaders in implementation science
• Train all school leaders in robust, multi-level, in-school impact evaluation (in the
absence of which, we have no idea whether what we’re doing is having an
impact, making no difference – or making things worse)
• School leaders working collaboratively across schools as (very) critical friends
• Embed notions of integrated service design – multiple parts pulling in the same
direction, working in harmony
• Metaphor of a symphony (e.g. free schools – School21, Michaela)
• Figure out how to get existing schools to become more harmonious
Education researchers
• It is possible to do RCTs of complex interventions (see MRC guidelines)
• Use a wide spectrum of research approaches – researchers working in
multidisciplinary teams (like they do at Harvard Medical School)
• Develop a language of complexity that is accessible to teachers. Complexity
does not have to be complicated!
35. Silver bullets, magic wands and tooth fairies:
Why teachers and education researchers
need to embrace complexity (and how)
James Mannion
Director, Rethinking Education
Associate, UCL Institute of Education
PhD candidate, University of Cambridge
james@rethinking-ed.org @rethinking_ed