Improving reading from Clackmannanshire to West Dunbartonshire

In the 1990s, two different studies began tracking the outcomes of reading interventions in Scottish schools. One, run by Joyce Watson and Rhona Johnston, then at the University of St Andrews, started in 1992/3 in schools in Clackmannanshire, which hugs the River Forth just to the east of Stirling. The other began in 1998 in West Dunbartonshire, west of Glasgow, with the Clyde on one side and Loch Lomond on the other. It was led by Tommy MacKay, an educational psychologist with West Dunbartonshire Council, who also lectured in psychology at the University of Strathclyde.

I’ve blogged about the Clackmannanshire study in more detail here. It was an experiment involving 13 schools and 300 children divided into three groups, taught to read using synthetic phonics, analytic phonics or analytic phonics plus phonemic awareness. The researchers measured and compared the outcomes.

The West Dunbartonshire study had a more complex design, involving five different studies and ten strands of intervention over ten years in all pre-schools and primary schools in the local authority area (48 schools and 60 000 children). As in Clackmannanshire, analytic phonics was used as a control for the synthetic phonics experimental group. The study also had an explicit aim: to eradicate functional illiteracy in school leavers in West Dunbartonshire. It very nearly succeeded: Achieving the Vision, the final report, shows that by the time the study finished in 2007 only three children were deemed functionally illiterate. (Thanks to @SaraJPeden on Twitter for the link.)

Five studies, ten strands of intervention

The main study was a multiple-component intervention using a cross-lagged design. The four supporting studies were:

  • Synthetic phonics study (18 schools)
  • Attitudes study (24 children from earlier RCT)
  • Declaration study (12 nurseries & primaries in another education authority area)
  • Individual support study (24 secondary pupils).

The West Dunbartonshire study was unusual in that it addressed multiple factors already known to impact on reading attainment, but that are often sidelined in interventions focusing on the mechanics of reading. The ten strands were (p.14):

Strand 1: Phonological awareness and the alphabet

Strand 2: A strong and structured phonics emphasis

Strand 3: Extra classroom help in the early years

Strand 4: Fostering a ‘literacy environment’ in school and community

Strand 5: Raising teacher awareness through focused assessment

Strand 6: Increased time spent on key aspects of reading

Strand 7: Identification of and support for children who are failing

Strand 8: Lessons from research in interactive learning

Strand 9: Home support for encouraging literacy

Strand 10: Changing attitudes, values and expectations

Another unusual feature was that the researchers were looking not only for statistically significant improvements in reading, but also for improvements that were significant in a wider sense:

statistical significance must be viewed in terms of wider questions that were primarily social, cultural and political rather than scientific – questions about whether lives were being changed as a result of the intervention; questions about whether children would leave school with the skills needed for a successful career in a knowledge society; questions about whether ‘significant’ results actually meant significant to the participants in the research or only to the researcher. (p.16)

The researchers also recognised the importance of ownership of the project throughout the local community, everyone “from the leader of the Council to the parents and the children themselves identifying with it and owning it as their own project” (p.7).

In addition, they were aware that a project following students through their entire school career would need to survive inevitable organisational challenges. Despite the fact that West Dunbartonshire was the second poorest council in Scotland, the local authority committed to continue funding the project:

The intervention had to continue and to succeed through virtually every major change or turmoil taking place in its midst – including a total restructuring of the educational directorate, together with significant changes in the Council. (p.46)

Results

The results won’t surprise anyone familiar with the impact of synthetic phonics: there were significant improvements in reading ability in children in the experimental group. What was remarkable was the impact of the programme on children who didn’t participate. Raw scores for pre-school assessments improved noticeably between 1997 and 2006, and there were many reports from parents that the intervention had stimulated interest in reading in older siblings.

One of the most striking results was that at the end of the study, there were only three pupils in secondary schools in the local authority area with reading ages below the level of functional literacy (p.31). That’s impressive when compared to the 17% of school leavers in England considered functionally illiterate. So why hasn’t the West Dunbartonshire programme been rolled out nationwide? Three factors need to be considered in order to answer that question.

1. What is functional literacy?

The 17% figure for functional illiteracy amongst school leavers is often presented as ‘shocking’ or a ‘failure’ on the part of the education system. These claims are valid only if those making them have evidence that higher levels of school-leaver literacy are attainable. The evidence cited often includes literacy levels in other countries or studies showing very high percentages of children being able to decode after following a systematic synthetic phonics (SSP) programme. Such evidence is akin to comparing apples and oranges because:

– Many languages are orthographically more transparent than English (there’s a more direct correspondence between graphemes and phonemes). The functional illiteracy figure of 17% (or thereabouts) holds for the English-speaking world, not just England, and has done so since at least the end of WW2 – and probably earlier, given literacy levels in older adults. (See Rashid & Brooks (2010) and McGuinness (1998).)

– Both the Clackmannanshire and West Dunbartonshire studies resulted in high levels of decoding ability. Results were less stellar when it came to comprehension.

– It depends what you mean by functional literacy. This was a challenge faced by Rashid & Brooks in their review; measures of functional literacy have varied, making it difficult to identify trends across time.

In the West Dunbartonshire study, children identified as having significant reading difficulties followed an intensive three-month individual support programme in early 2003. This involved 91 children in P7, 12 in P6 and 1 in P5. By 2007, 12 pupils at secondary level had still not reached functional literacy levels; their reading ages ranged between 6y 9m and 8y 10m (p.31). By June 2007, only three children had scores below the level of functional literacy. (Two others missed the final assessment.)

The level of functional literacy used in the West Dunbartonshire study was a reading age of at least 9y 6m using the Neale Analysis of Reading Ability (NARA-II). I couldn’t find an example online, but there’s a summary here. The tasks are rather different to the level 1 tasks in the National Adult Literacy Survey carried out in the USA in 1992 (NCES, p.86).

A reading/comprehension age of 9y 6m is sufficient for getting by in adult life: reading a tabloid newspaper, say, or filling in simple forms. Whether it’s sufficient for doing well in GCSEs (reading age 15y 7m), getting a decent job in later life, or having a good understanding of how the world works is another matter.

2. What were the costs and benefits?

Overall, the study cost £13 per student per year, or 0.5% of the local authority’s education budget (p.46), which doesn’t sound very much. But for 60 000 students over a ten-year period it adds up to almost £8m, a significant sum. I couldn’t find details of the overall reading abilities of secondary school students when the study finished in 2007, and haven’t yet tracked down any follow-up studies showing the impact of the interventions on the local community.
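A rough check of that £8m figure (my arithmetic, not a calculation given in the report):

£13 per student per year × 60 000 students × 10 years = £7 800 000 ≈ £8m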

Also, we don’t know what difference the study would have made to adult literacy levels in the area. Adult literacy levels are usually presented as averages and, in the case of the US National Adult Literacy Survey, included people with disabilities. Many children with disabilities in West Dunbartonshire would have been attending special schools, and the study appears to have involved only mainstream schools. Whether the impact of the study is sufficient to persuade cash-strapped local authorities to invest in it is unclear.

3. Could the interventions be implemented nationwide?

One of the strengths of Achieving the Vision is that it explores the limitations of the study in some detail (p.38ff); the researchers were well aware of the challenges that would have to be met for the intervention to achieve its aims. These included issues with funding: the local Council, although supportive, was working within a different funding framework from the Scottish Executive Education Department. The funding issues had a knock-on impact on staff seconded to the project, who had no guarantee of employment once the initial funding ran out. The study was further affected by industrial action and by local authority restructuring. How many projects would have access to the foresight, tenacity and collaborative abilities of those leading the West Dunbartonshire initiative?

Conclusion

The aim of the West Dunbartonshire initiative was to eradicate functional illiteracy in an entire local authority area. The study effectively succeeded in doing so – in mainstream schools, and if functional illiteracy is taken to mean a reading/comprehension age below 9y 6m. Synthetic phonics played a key role, and it is frequently advocated as a remedy for functional illiteracy in school leavers and in the adult population. The West Dunbartonshire study shows, pretty conclusively, that synthetic phonics, plus individual support, plus a comprehensive local authority-backed focus on reading, can result in significant improvements in reading ability in secondary school students. Does it eradicate functional illiteracy in school leavers or in the adult population? We don’t know.

References

Johnston, R. & Watson, J. (2005). The Effects of Synthetic Phonics Teaching on Reading and Spelling Attainment: A Seven Year Longitudinal Study. The Scottish Executive. http://www.gov.scot/Resource/Doc/36496/0023582.pdf

MacKay, T. (2007). Achieving the Vision: The Final Research Report of the West Dunbartonshire Literacy Initiative.

McGuinness, D. (1998). Why Children Can’t Read and What We Can Do About It. Penguin.

NCES (1993). Adult Literacy in America. National Center for Education Statistics.

Rashid, S. & Brooks, G. (2010). The levels of attainment in literacy and numeracy of 13- to 19-year-olds in England, 1948–2009. National Research and Development Centre for Adult Literacy and Numeracy.



Science, postmodernism and the real world

In a recent blogpost, Postmodernism is killing the social sciences, Eoin Lenihan recommends that the social sciences rely on the scientific method “to produce useful and reliable evidence, or objective truths”. Broadly, I agree with Eoin, but had reservations about the ‘objective truths’ he refers to. In response to a comment on Twitter I noted:

[screenshot of my tweet]

which was similar to a point made by Eoin, “postmodernism originally was a useful criticism of the Scientific Method or dominant narratives and a reminder of the importance of accounting for the subjective experiences of different people and groups.”

Ben Littlewood took issue with me:

[screenshot of Ben’s tweet]

In the discussion that followed I said science couldn’t claim to know anything for sure. Ben took issue with that too. The test question he asked repeatedly – his ‘simple question’ – was whether it is certain that the earth isn’t flat:

[screenshots of Ben’s tweets]

For Ben, that certainty is a matter of objective fact:

[screenshot of Ben’s tweet]

Twitter isn’t the best medium for a discussion of this kind, and I suspect Ben and I might have misunderstood each other. So here, I’m setting out what I think. I’d be interested in what he (and Eoin) has to say.

reason and observation

Something that has perplexed philosophers for millennia is what our senses can tell us about the world. Our senses tell us there’s a real world out there, that it’s knowable, and that we all experience it in more or less the same way. But our senses can deceive us, we can be mistaken in our reasoning, and different people can experience the same event in different ways. So how do we resolve the tension between figuring out what’s actually out there and what we perceive to be out there, between reason and observation, rationalism and empiricism?

Human beings (even philosophers) aren’t great at dealing with uncertainty, so philosophers have tended to favour one pole of the reason–observation axis over the other. As Karl Popper observes in his introduction to Conjectures and Refutations, some (e.g. Plato, Descartes, Spinoza, Leibniz) have opted for the rationalist view, in contrast to the empiricism of, for example, Aristotle, Bacon, Locke, Berkeley, Hume and Mill. (I refer to Popper throughout this post because of his focus on the context and outcomes of the scientific method.)

The difficulty with both perspectives, as Popper points out, is that philosophers have generally come down on one side or the other: either reason trumps observation or vice versa. But the real world isn’t like that; both our reason and our observations tend to be flawed, and both are needed to work out what’s actually out there, so there’s no point trying to decide which is superior. The scientific method developed largely to avoid the errors we tend to make in reasoning and observation.

hypotheses and observations

The scientific method tests hypotheses against observations. If the hypothesis doesn’t fit the observations, we can eliminate it from our enquiries.

It’s relatively easy to rule out a specific hypothesis, because we’re matching only one hypothesis at a time to observations. It’s much more difficult to come up with an hypothesis that turns out to be a good fit with observations, because our existing knowledge is always incomplete; there might be observations about which we currently have no knowledge.
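To illustrate the asymmetry, here’s a minimal sketch in Python (my example, using the classic ‘all swans are white’ hypothesis, not anything from the discussion above): one contrary observation eliminates an hypothesis, but no run of consistent observations proves it true.

```python
def refuted(hypothesis, observations):
    """Return True if any observation contradicts the hypothesis."""
    return any(not hypothesis(obs) for obs in observations)

# The hypothesis under test: all swans are white.
def all_swans_white(swan_colour):
    return swan_colour == "white"

# Consistent observations don't prove the hypothesis; they merely fail to refute it.
print(refuted(all_swans_white, ["white", "white", "white"]))  # False: survives, for now
# A single black swan rules it out.
print(refuted(all_swans_white, ["white", "white", "black"]))  # True: eliminated
```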

If an hypothesis is a good fit with our observations, we can make a working assumption that the hypothesis is true – but it’s only a working assumption. So the conclusions science draws from hypotheses and observations have varying degrees of certainty. We have a high degree of certainty that the earth isn’t flat, we have very little certainty about what causes schizophrenia, and what will happen as a consequence of climate change falls somewhere between the two.

Given the high degree of certainty we have that the earth isn’t flat, why not just say, as Ben does, that we’re certain about it and call it an objective fact? Because doing so, in a discussion about the scientific method and postmodernism, opens a can of pointless worms. Here are some of them.

– What level of certainty would make a conclusion ‘certain’? 100%, 75%, 51%?

– How would we determine the level of certainty? It would be feasible to put a number on an evaluation of the evidence (for and against), but that would get us into the kind of arguments about methodology that have surrounded p values. And would an hypothesis with 80% support be considered certain, while a competing hypothesis with only 75% support was prematurely eliminated? (See the sketch after this list.)

– Who would decide whether a conclusion was certain or not? You could bet your bottom dollar it wouldn’t be the people at the receiving end of a morally suspect idea that had nonetheless reached an arbitrary certainty threshold. The same questions apply to deciding whether something is a ‘fact’ or not.

– Then there’s ‘objectivity’. Ironically, there’s a high degree of certainty that objectivity, in reasoning and observation, is challenging for us even when armed with the scientific method.
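To make the threshold problem concrete, here’s a minimal sketch in Python (the support figures are invented for illustration, not taken from any study) of how an arbitrary certainty cut-off forces opposite verdicts on two hypotheses whose evidential support barely differs:

```python
# Hypothetical levels of evidential support for two competing hypotheses.
support = {"hypothesis A": 0.80, "hypothesis B": 0.75}

# An arbitrary cut-off for declaring a conclusion 'certain'.
CERTAINTY_THRESHOLD = 0.80

for name, level in support.items():
    verdict = "declared certain" if level >= CERTAINTY_THRESHOLD else "eliminated"
    print(f"{name}: {level:.0%} support -> {verdict}")
```

Nudge the threshold down to 0.75 and both hypotheses survive: the verdicts depend entirely on where the line is drawn.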

life in the real world

All these problematic worms can be avoided by not making claims about ‘100% certainty’ and ‘objective facts’ in the first place. Because it’s so complex, and because our knowledge about it is incomplete, the real world isn’t a 100%-certain-objective-fact kind of a place. Scientists are accustomed to working with margins of error and probabilities that would likely give philosophers and pure mathematicians sleepless nights. As Popper implies in The Open Society and its Enemies, the human craving for certainty has led to a great deal of knowledge of what’s actually out there, but also to a preoccupation with precise definitions and the worst excesses of scholasticism – “treating what is vague as if it were precise”.*

I declined to answer Ben’s ‘simple question’ because, in the context of the discussion, it’s the wrong kind of question. It raises further questions about what is meant by certainty, objectivity and facts, to which a yes/no answer can’t do justice. I suspect that if I’d said ‘yes, it is certain that the earth isn’t flat’, Ben would have said ‘there you are, science can be certain about things’ and the can of pointless worms would have been opened. Which brings me to my comment about postmodernism: that the root cause of postmodernism was the belief that science can produce objective truth.

postmodernism, science and objective truth

The 19th and 20th centuries were characterised by movements in thinking that were in large part reactions against previous movements. The urbanisation and mechanisation of the industrial revolution prompted Romanticism. Positivism (belief in verification using the scientific method) was in part a reaction to Romanticism, as was Modernism (questioning and rejecting traditional assumptions). Postmodernism, with its emphasis on scepticism and relativism, was, in turn, a reaction to Modernism and Positivism, which is why I think claims about objective truth (as distinct from the scientific method per se) are a root cause of postmodernism.

I would agree with Eoin that postmodernism, taken to its logical conclusion, has had a hugely detrimental impact on the social sciences. At the heart of the problem, however, is not postmodernism as such, but the logical-conclusion bit. That’s because the real world isn’t a logical-conclusion kind of a place either. I can’t locate where he says it, but at one point Popper points out that the world of philosophers and mathematicians (and, I would add, many postmodernists) isn’t like the real world. Philosophy and mathematics are highly abstracted fields; philosophers and mathematicians explore principles abstracted from the real world. That’s OK as far as it goes. Clearing away messy real-world complications and looking at abstracted principles has resulted in some very useful outcomes.

It’s when philosophers and mathematicians start inappropriately imposing on the real world ideas such as precise definitions, objective truths, facts, logical conclusions, and pervasive scepticism and relativism that things go awry, because the real world isn’t a place where you can always define things precisely, be objective, discover objective truths, follow things to their logical conclusion, or be thoroughly sceptical and relativistic. Philosophy and mathematics have made some major contributions to the scientific method, obviously, but they are not the scientific method. The job of the scientific method is to reduce the risk of errors, not to reveal objective truths about the world. It might do that, but if we can’t be sure whether it has or not, it’s pointless to make such claims. It’s equally pointless to conclude that if we can’t know anything for certain, everything must be equally uncertain, or that if everything is relative, everything has equal weight. It isn’t and it doesn’t.

My understanding of the scientific method is that it has to be fit for purpose: good enough to do its job. Not being able to define everything exactly, or to arrive at conclusively objective truths, facts and logical conclusions, doesn’t mean that we can be sure of nothing. Nor does it mean that anything goes, or that some sort of ‘balance’ between positivism and postmodernism is required.

We can, instead, evaluate the evidence, work with whatever conclusions appear reasonably certain, and correct errors as they become apparent. The simple expedient of acknowledging that the real world is complex and messy (but not intractably so), and that the scientific method can, at best, produce a best guess at what’s actually out there, bypasses pointless arguments about exact definitions, objectivity, truth and logicality. I’d be interested to know what Ben thinks.

Note

* Popper is quoting F. P. Ramsey, a close friend of Wittgenstein (The Open Society and its Enemies, vol. II, p. 11).

References

Popper, K. (2003). The Open Society and its Enemies, vol. II: Hegel and Marx. Routledge (first published 1945).

Popper, K. (2002). Conjectures and Refutations. Routledge (first published 1963).