seven myths about education: the myths

Well, I’ve finally been and gone and read Daisy Christodoulou’s book Seven Myths about Education. Overall, her argument goes as follows:

• the English education system is dominated by a certain set of ideas
• the ideas can be epitomised as seven ‘myths’
• cognitive science demonstrates that the myths are wrong.

Broadly speaking, a challenge to the dominant orthodoxy of the education system is certainly overdue and cognitive science is a good place to start. But when it comes down to specifics I felt that Daisy’s analysis of the ideas, her understanding of the grounds for challenging them, and the conclusions she draws don’t stand up to scrutiny. The discrepancy between the surface plausibility of the arguments and their underlying structure would explain why this book has been both lauded and criticised. Whether you laud it or criticise it will depend on the level at which you read it.

the English education system is dominated by a certain set of ideas

The evidence from theory and practice the author sets out supports her thesis that some ideas predominate in educational theory and that teachers are encouraged, if not pressurised, into implementing those ideas. But that’s not all there is to it; there are things missing from the analysis. The English education system is complex, so the quality of education students get is dependent on a range of factors. These include not only the ideas that shape the content of teacher training, the content of the curriculum and the criteria used in Ofsted inspections, but the structure of the system itself, the framework of accountability and expectations about what the system should achieve. No author could tackle everything in one book, of course, but the ideas that shape teacher training and practice need to be assessed in the context of the system as a whole, so a brief explanation of Daisy’s view of the other factors would have been helpful.

the ideas can be epitomised as seven ‘myths’

The myths are:

1. facts prevent understanding
2. teacher-led instruction is passive
3. the 21st century fundamentally changes everything
4. you can always just look it up
5. we should teach transferable skills
6. projects and activities are the best way to learn
7. teaching knowledge is just indoctrination

The structure of the book is clear; one chapter is devoted to each myth and each of the myth chapters is divided into three sections – ‘theoretical evidence’, ‘modern practice’ and ‘why is it a myth?’ Unfortunately the same degree of clarity doesn’t apply to the analysis of the ideas. Three tendencies muddy the water:

• a failure to make a clear distinction between theory, opinion and practice
• treating ideas that bear a passing resemblance to a myth as equivalent to the myth itself
• assuming that subscribing to an idea that resembles one myth implies subscribing to other myths.

a distinction between theory, opinion and practice

For some myths (3, 4, 5 and 6) the only difference between the theoretical evidence and the modern practice described is that the two sections contain different quotations – the sources are the same. This might be because the myths in question don’t have a theoretical basis; we’re not told. But given the author’s claim that she’s interested in tracing ideas (p.6), her failure to identify the roots of some of the myths is disappointing. An exploration of their origins might have shed some light on why they’ve been adopted.

ideas that bear a passing resemblance to a myth equated with the myth itself

For most of the myths, several examples of theory and practice are about ideas related to the myth, not the myth itself. For example, questioning the reliability or validity of facts is equated to ‘facts prevent understanding’; calling for holistic and coherent curriculum content to ‘projects and activities are the best way to learn’; and advocating a degree of autonomy in learning to ‘teaching knowledge is just indoctrination’. This conflation would account for the ‘illogical’ criticism Daisy complains about on her blog – people claiming that the myths don’t exist whilst simultaneously agreeing that she has found examples of them presented as best practice. If several related but different ideas are being conflated and treated as one, it’s not surprising that confusion has followed.

subscribing to an idea that resembles one myth implies subscribing to other myths

In several chapters the theoretical evidence refers to myths and related ideas other than the one the chapter purports to be about. The theoretical evidence for myth 2, ‘teacher-led instruction is passive’, refers to children’s difficulties with constant questions and with learning to read, interdisciplinary learning and the power relationship between pupil and teacher, rather than passivity. Evidence for myth 7, ‘teaching knowledge is just indoctrination’, includes questioning the objectivity of facts and advocating interdisciplinary activities and projects, rather than teachers indoctrinating children.

You could argue that people who subscribe to one myth (or ideas related to it) often do subscribe to other myths (or ideas related to them). But the author’s case rests on evidence of the prevalence of seven quite specific ideas. She also claims to trace those ideas from theory to practice (p.6). Her case would have been stronger if she’d been able to do that with more precision.

Daisy locates the origin of all the myths in postmodernism. She says:

“Postmodernism is sceptical about the value of truth and knowledge, and many of these myths have at their heart a deep scepticism about the value of knowledge. It is for this reason that I begin with myth 1 (facts prevent understanding) and 2 (teacher-led instruction is passive). These could be said to be the foundation myths of all the others discussed in this book.” (p.8)

To illustrate how ideas are handled in this book, it’s worth taking a closer look at one of the foundational myths – myth 1 ‘facts prevent understanding’.

facts prevent understanding

Daisy attempts to demonstrate the theoretical basis of the myth ‘facts prevent understanding’ by quoting from Rousseau, Dewey, Freire and Dickens. But the quotations are actually about ideas other than ‘facts prevent understanding’. Rousseau expected children to learn facts via nature rather than formal schooling, Dewey objected to pedagogical methods that prevented children learning, Freire explicitly objected to the ‘banking’ approach in education because it concealed facts from children (Freire p.83) and Dickens’ concern was that facts alone were being taught.

Despite failing to demonstrate that the four authors actually thought that facts prevent understanding, Daisy refers to a ‘common trope’ among them. “They all set up polar opposites between facts, which are generally seen as bad, and something else, which is generally seen as good” (p.13). But they don’t. According to the evidence cited, what the writers objected to was the way facts were presented in schools. The alternatives they proposed might not be any better, but it doesn’t follow that any of them thought that facts, per se, were bad.

The origins of the myth, according to the author, lie with Rousseau. His emphasis was actually on what children could learn from interactions with the harsh reality of nature as distinct from human interventions that were frequently ineffectual. Although Rousseau’s influence is clearly traceable through to modern educational practice, his underlying idea that understanding is as important as factual knowledge is also exemplified in John Locke (who influenced Rousseau), in the Socratic method and in the books of Proverbs and Ecclesiastes, taking it back to several centuries BC. In other words, a distinction between facts and understanding was around for quite a while before Rousseau appeared on the scene.

Daisy acknowledges “sometimes it is argued that these theorists were not hostile to facts per se, merely to certain prescriptive and artificial methods of learning such facts” (p.13) and says she considers this argument in full in the following chapter. What she actually does in that chapter is to quote Rousseau on endless questions from teachers, children’s curiosity and rote learning, Dewey on the correlation of school subjects, and Freire on the co-construction of learning, none of which says anything about hostility to facts.

She concludes that the national curriculum ‘opposes’ subject content and subject concepts just as Rousseau, Dewey, Freire and Dickens allegedly ‘opposed’ facts with “meaning, understanding, reasoning, significance…imagination or creativity” (p.13). Her evidence from the national curriculum certainly demonstrates a move towards subject concepts at the expense of subject content, but that’s a far cry from propagating the idea that ‘facts prevent understanding’. Yet by the end of the chapter on myth 1, theorists and government agencies are described as ‘sceptical about the value of facts’. By the end of the chapter on myth 2, theorists have become ‘hostile’ to facts. What Daisy does, in effect, is to lump together all ideas that include any reservations whatsoever about factual information, who presents it or how it is presented, and assume that what they all boil down to is a belief that ‘facts prevent understanding’. They don’t, of course.

facts

Facts are a key issue for Daisy. She cites Berger and Luckmann’s The Social Construction of Reality as epitomising the thinking of some educational theorists for whom ‘the very concept of knowledge is problematic’ (p.111), and comments:

“…Berger and Luckmann looked at the way that many of the facts we perceived to be true were in fact social constructions. They did not objectively exist out there somewhere. They were brought into being because we all believed in them, and very often they were buttressed by institutional power” (p.109). (Daisy’s emphasis).

What she doesn’t appear to have thought through is why anyone could see truth, facts and knowledge as problematic. Yet these concepts have had philosophers, historians, lawyers and scientists scratching their heads for centuries. This isn’t because of hostility to facts – all these disciplines actively seek out facts – but because it’s very difficult for human beings to determine what is true and therefore factual. Each of these disciplines is well aware that facts involve degrees of uncertainty and has had to devise ways of evaluating the reliability and validity of evidence behind the facts. The root of the problem isn’t that some people think that facts do not ‘objectively exist out there somewhere’ but that our awareness of what is objectively ‘out there somewhere’ is at the mercy of our perception, which is notoriously unreliable. Ironically, cognitive science has recently begun to identify the mechanisms behind the vagaries of human perception that have been so perplexing for so long.

Much of the information transmitted in schools is backed by pretty solid evidence, so for all intents and purposes we can refer to it as factual; e.g. how photosynthesis works, what happens during volcanic eruptions, where and when the battle of Hastings took place, the rules of algebra. Other information is less certain; how subatomic particles behave, evolution, climate change, the causes of WW1. In the latter examples, trying to determine whether the information is factual or not is unhelpful. It’s more informative to frame it in terms of the reliability and validity of the evidence and what conclusions can be drawn about it. I think Daisy is right that currently these skills might be being introduced prematurely, before children have a sufficient grasp of the data and the structure of the relevant knowledge domain, but sooner or later students need to be introduced to uncertainty in knowledge and how to tackle it. The problematic nature of facts doesn’t mean that all facts are equally problematic. Nor does it mean that they are all equally unproblematic. The factualness of information varies, and students need to know how and why it varies.

The evidence that Daisy presents suggests that social constructivism has had a disproportionate influence on educational theory. That’s not surprising given the importance of social interaction and verbal communication in education; education lends itself to a social constructivist paradigm. But this disproportionate influence has resulted in findings from other knowledge domains relevant to education being overlooked. These include fields relating to child development such as genetics, molecular biology, linguistics and developmental and cognitive psychology, and those relating to structural issues such as organisational psychology and the history of education.

I think Daisy is right to highlight the dominance of certain ideas, but she has oversimplified a complex situation. She’s taken groups of ideas with common themes – such as facts, teacher authority, an integrated curriculum – and assumed that one, often extreme, related idea can exemplify all the ideas in a group. Another oversimplification crops up in relation to cognitive science, the subject of my next post.

the history of reading methods revisited (5)

My response to some of Maggie’s most recent points:

Frank Smith

Maggie: Indeed, he [Smith] was echoing much earlier theorists, such as Huey, in this belief and, of course, by the time he was writing many readers may have been using such strategies because of being taught by Word methods (I’m sticking to my hypothesis!). I can’t find that he has any evidence for his assertion and, as I pointed out, Stanovich and West disproved his theory.

Me: The first five chapters of Snowling & Hulme’s book The Science of Reading are devoted to reviews of work on word recognition processes in reading. Most of the research looks at the ways in which adult, expert readers read. What emerges from these five chapters is that:

• expert readers do not use one single method for reading words; they tend to use rapid whole-word recognition for familiar words and slower, stepwise decoding for unfamiliar words;
• the speed with which they respond to target words increases in response to different types of priming;
• the jury is still out on how reading mechanisms actually work.

It was the fact that expert readers use two strategies that resulted in a plethora of ‘dual route’ models of reading; the first was proposed in the 1920s, but studies of brain-damaged patients had noted this in the 19th century. This is exactly what West and Stanovich found. What they ‘disproved’ was that the use of contextual information by children increased with age and reading ability.
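The two strategies behind those dual-route models can be caricatured in a few lines of code. This is purely an illustrative sketch, not any published model: the mini-lexicon and the grapheme-phoneme rules are invented for the example, and real models are vastly more sophisticated.

```python
# Crude sketch of the 'dual route' idea: familiar words are recognised
# whole (fast lexical route); unfamiliar words are decoded stepwise.

# Hypothetical mini-lexicon of familiar whole words
LEXICON = {"cat", "dog", "the", "read"}

# Hypothetical grapheme-to-phoneme correspondences, vastly simplified
GPC_RULES = {"c": "k", "a": "a", "t": "t", "d": "d", "o": "o", "g": "g"}

def read_word(word: str) -> tuple[str, str]:
    """Return (route used, output) for a written word."""
    if word in LEXICON:
        # Fast, automated whole-word recognition
        return ("lexical", word)
    # Slower, letter-by-letter decoding as the fallback
    phonemes = [GPC_RULES.get(letter, "?") for letter in word]
    return ("sublexical", "-".join(phonemes))

print(read_word("cat"))   # familiar word: lexical route
print(read_word("tac"))   # unfamiliar word: decoded stepwise
```

The point the models capture is simply the ordering: the fast whole-word route is tried first, and stepwise decoding is the fallback when it fails.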

There was a great deal of work on priming effects in reading during the 1970s, so although Smith might have been wrong, he wasn’t just ‘echoing earlier theorists’. He had a PhD in psycholinguistics/cognitive psychology from Harvard, so would have been very familiar with the direction of travel in contemporary reading research.

Your hypothesis, that expert readers were using mixed methods because that’s how they’d been taught to read, might be right. But a more likely explanation is that recognition of complex sensory stimuli (e.g. words) becomes automated and fast if they are encountered frequently, but requires step-by-step analysis if they’re not. That’s how human brains deal with complex sensory stimuli.

There is no question that expert readers use more than one strategy when reading. The question is whether explicitly learning those strategies is the best way for children to learn to read.

the rejection of the alphabetic principle


Me: Maggie says my statement that the alphabetic principle and analytic phonics had been abandoned because they hadn’t been effective for all children ‘makes no sense at all’. If I’m wrong, why were these methods abandoned?

Maggie: I still don’t think it makes any sense. For a start, you give no time scale. When did this abandonment take place? And you are conflating Alphabetic with Analytic which I don’t think is correct (see my earlier comment).

Me: They were abandoned gradually. My PGCE reading tutor, who trained in the 1930s, was keen on analytic phonics but not on ‘flashcards’. I remember spending hours preparing phonics reading activities. Several teachers of her generation that I’ve spoken to took a similar view. They didn’t advocate using analytic phonics ‘systematically, first and only’, but as a support strategy if children were struggling to decode a word. Clearly, the teachers I’ve encountered don’t form a representative sample, but some of them were using analytic phonics until they retired and at least one teacher training college in the UK was teaching students to use it until at least the late 1970s. And this definitely wasn’t ‘alphabetic’; it was phonetic. According to my reading tutor, the alphabetic method was widely perceived as flawed by the 1930s. The consensus amongst these teachers was:

• children use a range of strategies when learning to read
• whatever method of teaching reading is used, some children will learn with little effort and others will struggle
• no one method of teaching reading will be effective for all children, but some methods are more effective than others (which is why they still used analytic phonics).

I’m not saying they are right, but that’s what they thought.

Maggie: Another point is that you are crediting educationists and teachers with a degree of rationality which I don’t think is justified. The widespread acceptance of the Word method, which had no evidence to back it but strong appeals to ‘emotion’ with the language of its denigration of Phonic methods, is a case in point. Boring, laborious, ‘drill & kill’, barren, mechanical, uncomprehending, the list is long (and very familiar). It is a technique promoted today as ‘framing’ (though I might acquit its original users of deliberate use of it). Very easy to be persuaded by the language without really considering the validity of the method it purports to describe.

Me: I think you are not crediting them with enough rationality. The ‘drill and kill’ they were referring to was an approach many teachers resorted to in the early days of state education. Those teachers were often untrained, had to teach large numbers of children of different ages, had few books, were on performance-related pay, used corporal punishment and had been taught themselves through rote learning entire lessons. Complaints about children being able to recite but having no understanding were commonplace in those early days. What has happened over time is that denigrating rote learning everything (justified in my view) has morphed into denigrating rote learning anything (not justified).

Prior to the 1980s, teachers in the UK were left to their own devices about how they did things, and some, at least, took a keen interest in developing their own methods; they didn’t all slavishly follow fashion by any means. I agree that the ‘Word’ method might have been framed emotively, but it’s not true to say there was no evidence to back it.

The evidence was in the form of adult reading strategies. If you’re a teacher who’s seen ‘drill and kill’ not working for all children, then alphabetic and analytic phonics not working for all children, and someone comes along and tells you that scientific research has shown that adults use a range of strategies when reading (and you check out the research and find that indeed it has), and that it would therefore make sense to teach children to use a range of strategies to learn to read, what would you, as a rational person, do?

I think you are seeing claims that adults use a range of reading strategies through the spectacles of the ‘teaching reading’ literature, not through the spectacles of the ‘reading mechanisms’ literature. The body of evidence that supports the idea that adults use a range of strategies in reading is vast. And every teacher will have witnessed children attacking words using a range of strategies. Putting the two ideas together is not unreasonable. It just happens to be wrong, but it wasn’t clear that it was wrong for a very long time.

Maggie: I would also suggest that the discourse of ‘science’, ‘research’, ‘progressive’ would be enough to convince many without them delving too deeply into the evidence. Brain Gym, anybody?

Me: You’re quite right. The point I’m making is that there was robust evidence to support the Word method. But it was robust in respect of people who had learned to read, not those who hadn’t. The way the brain functions after learning something (in adults) doesn’t reflect the way it learns it (in children). But that was by no means clear in the 1970s. There is still a dispute going on about this amongst cognitive scientists.

using a range of cues


Me: The cues I listed are those identified in skilled adult readers in studies carried out predominantly in the post-war period. Maggie’s hypothesis is that the range of cues is an outcome of the way the participants in experiments (often college students) had been taught to read. It’s an interesting hypothesis; it would be great to test it.

Maggie: I stand by it! I have worked with too many children who read exactly as taught by the Searchlights!
I thought I would revisit these ‘cues’ which are supposed to have offered sufficient exposure to auditory and visual patterns to develop automated, fast recognition. They are ‘recognising words by their shape, using key letters, grammar, context and pictures’.

recognising words by their shape: Confounded at once by the fact that many words have the same shape: sack, sick, sock, suck, lack, lick, luck, lock, pock, pick, puck, pack.

using key letters: Would those be the ones that differentiate each word in the above word list?

grammar: Well, I can see how you might ‘predict’ a particular grammatical word form (noun, verb, adjective etc.), but the specific word? By what repeated pattern would you develop automatic recognition of it?

context: I think the same might apply as for grammar. You need a mechanism for recognising the actual word.

pictures: Hm. Very useful for words like oxygen, air, the, gritty, bang, etc.

Me: Again, you are confusing the strategies adults use when reading with the most effective way of teaching children to read. They are two different things. Your examples illustrate very clearly why using multiple cues isn’t a good way of teaching reading. But those inconsistencies don’t stop adults using these cues in their reading. If you don’t have a copy of Snowling and Hulme’s book, get one and read it.

Maggie: In view of Stanovich & West’s findings I would be interested to see any studies which show that skilled adult readers did use the ‘cues’ you listed. (as above)

Me: There’s a vast literature on this. Summarised very well in Snowling and Hulme, which is why I’ve recommended it. Incidentally, a ‘cue’ isn’t a term invented by proponents of the Word method; it’s a perfectly respectable word denoting a signal detected in incoming information that can affect the processing of subsequent information.

Me: In chapter 2 of Stanovich’s book, West and Stanovich report fluent readers’ performance being facilitated by two automated processes: sentence context (essentially semantic priming) and word recognition.

Maggie: I appreciate that but this is described as a feature of fluent, skilled reading. To assume that beginning readers do this spontaneously might be to fall into the same trap as ‘assuming that children could learn by mimicking the behaviour of experts’

Me: In your original post, you said “Stanovich and West showed, in the 70s that these were strategies used by unskilled readers and that skilled readers used decoding strategies for word recognition (this is an extreme simplification of the research Stanovich outlines in ‘Progress in Understanding Reading’) and this has been the conclusion of cognitive scientists over the subsequent decades the validity of these strategies is seriously challenged.”

I think you’ve misunderstood what Stanovich and West (and other cognitive scientists) have shown. The literature shows, pretty conclusively, that fluent readers use word recognition first and decoding if word recognition fails. Sentence context isn’t used as a conscious strategy; it’s subconscious, because the content of the sentence increases access to words that are semantically related. It’s not safe to assume that because experts do something, novices learn by copying them. Nor is it safe to assume that experts use the same strategies they did when learning as novices.

Me: According to chapter 3, fluent readers use phonological recoding if automated word recognition fails.

Maggie: Isn’t that the whole point? Fluent readers didn’t use context, or other ‘cues’, to identify unfamiliar words; they used phonological recoding.

Me: No. The point is that they used it if automated word recognition failed.

Maggie: It is also moot that they use context to predict upcoming words (although I do understand about priming effects). There is also the possibility that rapid, automatic and unconscious decoding is the mechanism of automatic word recognition (Dehaene). Possibly with context confirming that the word is correct? A reading sequence of ‘predicting’, then, presumably, checking for correctness of form and meaning (how? by decoding and blending?) seems like a strange use of processing when decoding gets the form of the word correctly straight away and immediately activates meaning.

Me: It’s possible that rapid, automatic and unconscious decoding is the mechanism of automatic word recognition but work on masking and priming suggests that readers are picking up the visual features of letters and words as well as their auditory features and semantic features. In other words, there are things going on in addition to decoding.

Whether readers use context to predict upcoming words depends on what you mean by ‘predict’. Priming results in some words being more likely than others to occur in a sentence; this isn’t a conscious process of ‘prediction’ but it is a subconscious process of narrowing down the possibilities for what comes next. But in some sentences you could consciously predict what comes next with a high degree of accuracy.

classical liberal education: the downside

You might be wondering why I’m making such a big deal out of Robert Peal’s arguments. After all, as he points out in his responses to critics, opinions are important and categorisation aids discussion. If Robert were simply voicing his personal opinion to get a discussion going, I probably wouldn’t have commented on his book at all. But he’s not just doing that. Progressively Worse was written in his capacity as Education Research Fellow with the think tank Civitas. The book is published by Civitas and the front cover carries a personal endorsement from the Secretary of State for Education, Michael Gove. Civitas also published Toby Young’s pamphlet Prisoners of the Blob. Young is co-founder of the West London Free School, apparently the first free school in the country to sign a funding agreement with the said Secretary of State. Civitas have published a UK version of E.D. Hirsch’s Core Knowledge Sequence and a series of textbooks and teaching resources. Civitas also runs a network of schools and is described by Core Knowledge UK (‘the official partnership in the UK’ – presumably with the Core Knowledge Foundation) as ‘an educational charity’. And that’s what bothers me.

a classical liberal education

As far as I can gather, a relatively small group of people share an opinion that what the English education system needs is a return to a Classical Liberal Education. I experienced one of these myself, although I wasn’t aware of it at the time. (The ‘classical liberal’ label, that is. I was aware of the education). ‘Classical liberal’, like ‘traditional’ and ‘progressive’, is something of a folk category – a label for a loosely defined group of concepts that’s useful for signposting during conversation. But, as I hope I demonstrated in my previous post, folk classifications aren’t generally up to tasks that require more exact specifications, like making comparisons between individual schools or designing a classical liberal curriculum. For tasks like that, you need a more precise definition.

In an article in The Telegraph in 2013, Toby Young says the head of the West London Free School asked the governors for “a relatively short statement of what’s meant by a Classical Liberal Education that could be included in the Staff Handbook”. Young then says “This is a phrase we’ve often bandied about, but never tried to define before – at least, not beyond shorthand phrases like ‘the best that’s been thought and said’”.

That’s a revealing remark. It suggests that the governors of a school apparently offering a classical liberal education hadn’t started with the question “What’s the purpose of education?” or “What do our students need to know and why?” but “What should be included in a classical liberal education?” without attempting to actually define it. The governors eventually came up with the ‘relatively short statement’ requested. Young quotes it in his article and you can read it on the school website here.

The relatively short statement looks relatively long to me (almost 800 words). There’s a lot of ‘we mean this, but not that’. A quotation from Daniel Willingham is juxtaposed with an extract from an essay by Bertrand Russell. The necessity for all this explanation suggests that a classical liberal education isn’t easy to define and maybe it would have been better to have side-stepped the definition completely and simply pointed interested parties to a summary of the school curriculum so they could assess it for themselves.

the conversation of mankind

The difficulty in defining a classical liberal education appears to revolve around a core sticking point; what constitutes “the best and most important work in both the humanities and the sciences.” This criterion is derived from a phrase in an essay on culture by Matthew Arnold, the 19th century poet, who was also a school inspector. Arnold summarises culture as the ‘best which has been thought and said’. Few people are actually going to disagree with that as a broad aim for what should be taught in schools, but as both Robert Peal and the West London Free School point out, deciding what constitutes ‘the best and most important work’ is not a straightforward task, especially where the humanities are concerned.

The West London Free School statement concludes that what should be in the curriculum is “the background knowledge taken for granted by writers who address the intellectually engaged layman – the shared frames of reference for public discourse in modern liberal democracies”. Discourse, discussion and conversation are frequently mentioned by advocates of a classical liberal education. Clearly there are good reasons why it’s desirable for everyone to have “the background knowledge taken for granted by writers who address the intellectually engaged layman”. Whether all writers take the same background knowledge for granted, and who they consider to be an ‘intellectually engaged layman’ is another matter. This focus on the communication of ideas appears to originate in Michael Oakeshott’s reference to ‘the conversation of mankind’ (Peal, p. 209).

Earlier this week, by chance, I came across a televised seminar hosted by Nuffield College, Oxford, on the results of the recent European elections. The seminar was the first of its kind, an experiment, and one in my view that’s well worth repeating. I learned more about European politics in two hours than I have in the past two years. One recurring theme in the discussion was the ‘rise of the meritocracy’, a term coined in the 1950s by Michael Young, Toby Young’s father. Vernon Bogdanor, whose former students include David Cameron and Toby Young, suggested that one of the reasons why UKIP and other anti-establishment parties were so successful in the recent election was because they were voted for by people who felt they’d been completely ignored by the meritocracy. The meritocracy are those who have benefited from higher education and whose decisions shape not only the knowledge that writers take for granted, but most people’s standard of living and quality of life.

Clearly, there are good reasons why everyone should be able to participate in the ‘conversation of mankind’. But human lives do not consist solely of engaging “fruitfully in conversation and debate – not just about contemporary issues, but also about the universal questions that have been troubling mankind throughout history”, as the West London Free School statement puts it. In order for some people to earn their living conversing and debating as philosophers, academics, politicians or writers, other people have to produce food, manufacture goods and maintain infrastructure. And they need to ensure that those things are done efficiently. It’s only through their doing so that the economy has enough surplus capacity to support philosophers, academics, politicians and writers, or indeed an education system.

enemies of promise

Before I’m dismissed as one of Michael Gove’s ‘Marxist enemies of promise’ I would point out that I’m not suggesting the people who grow food, manufacture goods or maintain an infrastructure don’t need to engage in important conversations and debates. Nor do I mean they don’t need a good education, or that education is only a preparation for getting a job. What I do mean is that those who do most of the conversing and debating should be well aware of what those involved in production, manufacturing and maintenance are up against.

The people who get their hands dirty, work in all weathers, use dangerous materials, and put their lives at risk on a daily basis are working at the interface between human society and the natural world. The natural world isn’t interested in having a conversation, it’s uncompromisingly and unforgivingly getting on with being the natural world. In order to work with it, we all – philosophers, academics, politicians and writers included – need to have a good grasp of how it functions. If we don’t, the conversation of mankind will be pretty limited.

a coherent curriculum

Broadly speaking, the content of the national curriculum, whether informal (prior to 1988) or formal (since 1988), has been based on knowledge ‘trickling down’ from university subject areas. The content of undergraduate courses determines the content of A levels, which in turn informs GCSE content, which in turn informs what younger children are taught. The main problem with a subject-based curriculum is that it isn’t integrated across subject areas. This has implications for students’ understanding of fundamental concepts that straddle several knowledge domains, and it’s this lack of understanding that I suspect has led to the recent emphasis in the national curriculum on knowledge-related ‘skills’ rather than on knowledge itself. I understand why there are calls for a return to a knowledge-based curriculum.

My concern is that framing the alternative in terms of participation in conversation and debate means that what we need to know in order to manage the sometimes nasty, sometimes messy business of maintaining a decent quality of life, will be marginalised. Using cultural references as a criterion means that the resulting curriculum might also lack coherence, since it won’t be based on the deep structure of knowledge, but on the references people make to specific items of knowledge, which isn’t the same thing. And if the curriculum isn’t coherent, that will impact on the sense it makes for students.

not making sense

For example, despite its lengthy explanation of classical liberal education, the West London Free School offers the national curriculum with Latin added, which to me doesn’t look like the same thing at all.

The Michaela Community School’s educational vision is expressed in the Matthew Arnold quote. The school claims to be inspired by Hirsch’s Core Knowledge Sequence and emphasises the importance of cross-curricular links. But it then claims Maths and English are ‘fundamental to all other learning’ (are they?) and tackles History and English, but not other subjects, chronologically. I could find little evidence of a coherent underlying rationale.

I hoped that The Curriculum Centre might shed some light on the matter with its Future Curriculum™, but no joy. The Curriculum Centre is also inspired by Hirsch (and Michael Young) and is critical of the national curriculum but remarkably coy, for a curriculum centre, about what it advocates instead.

Civitas in contrast, has done a lot of work on the cultural references curriculum. It has prepared a UK version of Hirsch’s Core Knowledge Sequence for Years 1-6. Although I can see why schools might have found the UK Core Knowledge Sequence useful, like Hirsch’s original it doesn’t seem especially coherent. For example, Year 1 History begins with the pre-history of Britain in that it covers the Ice Age, Stone Age, Bronze Age and Iron Age, which makes internal sense but completely overlooks the formation of the earth itself, the formation and break up of supercontinents and the migration of early humans, an excellent opportunity to promote an understanding of how physics, chemistry, biology, geography and history are related.

Then, bizarrely, we skip to ‘Kings and Queens’ and a list of disconnected ‘historic events’ from the Magna Carta to the Glorious Revolution, which it’s unlikely anyone, never mind children in Year 1, will be able to properly comprehend without knowing how those events emerged from events that preceded them. Even more bizarrely, we then skip to Prime Ministers (Robert Walpole is singled out for mention) and Symbols and Figures; the Union Jack, Buckingham Palace, 10 Downing Street and the Houses of Parliament.

It’s important that children know what these cultural references refer to, but there’s no reason why teachers shouldn’t just explain them briefly if they happen to be mentioned, rather than including them, out of context, in the core curriculum. Not only is this piecemeal approach to curriculum design based on what one person or group of people consider to be important cultural references, but it’s also unlikely to make sense to children without the requisite pre-existing knowledge.

If, by contrast, we frame the curriculum in terms of students having a good understanding of how the world functions, from sub-atomic particles upwards, our educational framework will be much better integrated. And what needs to be included in the humanities part of the curriculum (if the curriculum must be divided in that way) will no longer be solely a matter of value judgments. It would mean that the criteria for deciding which periods of history are important for students to study, in what order, and which books and plays and poems they focus on, and in what order, would be based on what would best help them understand the world they live in, rather than just understanding what “Times leader writers, heavyweight political commentators and authors of serious books” (Young, p. 34) have to say.

A chronological curriculum, such as the one I used with my own children (I refer to it in more detail here and here) is not only coherent, but makes sense of everything. The only drawback is that if teachers are subject specialists, a bit of work might be required on integrating the curriculum across subject areas. The curriculum’s narrative spine will consist initially of physics, followed by chemistry, then biology, geology and geography – the humanities are relative latecomers in the earth’s history. That doesn’t mean children can’t learn to read until they’ve reached the point where writing was invented, or that they can’t be taught geometry until they’ve covered the ancient Greeks. What it does mean is that simultaneously studying the American Civil War in History, Shakespeare in English, the Renaissance in Art and polyphony in Music, alongside Linnaeus and Tim Burners-Lee (sic) in Science in Year 6, as the UK Core Knowledge Sequence advocates, means that most of the contextual significance of all those things is lost.

It’s clear from Robert Peal’s role, his association with Civitas and his endorsement by Michael Gove, that Progressively Worse isn’t just expounding his personal opinion. Civitas claims “our research seeks out an objective view of standards of education in Britain”. If what Robert presents in his book is what Civitas or, more worryingly, the DfE consider to be an ‘objective view’ of education and that view is influencing educational policy in general and the development of a curriculum in particular, the quality of education in English schools in the next fifty years is unlikely to get progressively better.

folk categorisation and implicit assumptions

In his second response to critics, Robert [Peal] tackles the issue of the false dichotomy. He says;

…categorisation invariably simplifies. This can be seen in all walks of life: music genres; architectural styles; political labels. However, though imprecise, categories are vital in allowing discussion to take place. Those who protest over their skinny lattes that they are far too sophisticated to use such un-nuanced language … are more often than not just trying to shut down debate.

Categorisation does indeed simplify. And it does allow discussion to take place. Grouping together things that have features in common and labelling the groups means we can refer to large numbers of things by their collective labels, rather than having to list all their common features every time we want to discuss them. Whether all categorisation is equally helpful is another matter.

folk categorisation

The human brain categorises things as if that was what it was built for; not surprising really, because grouping things according to their similarities and differences and referring to them by a label is a very effective way of reducing cognitive load.

The things we detect with our senses are categorised by our brains quickly, automatically and pre-verbally (e.g. Haxby, Gobbini & Montgomery, 2004; Greene & Fei-Fei, 2014) – by which I mean that language isn’t necessary in order to form the categories – although language is often involved in categorisation. We also categorise pre-verbally in the sense that babies start to categorise things visually (such as toy trucks and toy animals) at between 7 and 10 months of age, before they acquire language (Younger, 2003). And babies acquire language itself by forming categories.

Once we do start to get the hang of language, we learn about how things are categorised and labelled by the communities we live in; we develop shared ways of categorising things. All human communities have these shared ‘folk’ categorisations, but not all groups categorise the same things in the same way. Nettles and chickweed would have been categorised as vegetables in the middle ages, but to most modern suburban gardeners they are ‘weeds’.

Not all communities agree on the categorisations they use either; political and religious groups are notorious for disagreements about the core features of their categories, who adheres to them and who doesn’t. Nor are folk categorisations equally useful in all circumstances. Describing a politician’s views as ‘right wing’ gives us a rough idea of what her views are likely to be, but doesn’t tell us what she thinks about specific policies.

Biologists have run into problems with folk categorisations too. Mushrooms/toadstools, frogs/toads and horses/ponies are all folk classifications. Biologists could distinguish between individual species of fungi, but grouping those species together as either mushrooms or toadstools was impossible, because the differences between the folk categories ‘mushrooms’ and ‘toadstools’ aren’t clear enough; biologists neatly sidestepped the problem by ignoring the folk distinction and grouping mushrooms and toadstools together as a phylum. The same principle applies to frogs/toads – they form an order of their own. Horses and ponies, by contrast, are members of the same subspecies.

Incidentally 18th and 19th century biologists weren’t categorising these organisms just because of an obsessive interest in taxonomy. Their classification had a very practical purpose – to differentiate between species and identify the relationships between them. In a Europe that was fast running out of natural resources, farmers, manufacturers and doctors all had a keen interest in the plants and animals being brought back from far-flung parts of the world by traders, and accurate identification of different species was vital.

In short, folk categories do allow discussion to take place, but they have limitations. They’re not so useful when one needs to get down to specifics – how are particular MPs likely to vote, or is this fungus toxic or not? The catch is in the two words Robert uses to describe categories – ‘though imprecise’. My complaint about his educational categorisation is not categorisation per se, but its imprecision.

‘though imprecise’

The categories people use for their own convenience don’t always have clear-cut boundaries, nor do they map neatly on to the real world. They don’t always map neatly onto other people’s categories either. Eleanor Rosch’s work on prototype theory shed some light on this. What she found was that people’s mental categories have prototypical features – features that the members of the category share – but not all members of the category have all the prototypical features, and category members can have prototypical features to different extents. For example, the prototypical features of most people’s category {birds} are a beak, wings, feathers and being able to fly. A robin has a beak, wings and feathers and is able to fly, so it’s strongly prototypical of the category {birds}. A penguin can’t fly but uses its wings for swimming, so it’s weakly prototypical, although still a bird.

Mushrooms and toadstools have several prototypical features in common, as do frogs and toads, horses and ponies. The prototypical features that differentiate mushrooms from toadstools, frogs from toads and horses from ponies are the ideas that; toadstools are poisonous and often brightly coloured; toads have a warty skin, sometimes containing toxins; and horses are much larger than ponies. Although these differential features are useful for conversational purposes, they are not helpful for more specific ones such as putting edible fungi on your restaurant menu, using a particular toxin for medicinal purposes or breeding characteristics in or out of horses.
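Rosch’s idea of graded membership can be sketched in a few lines of code. This is a toy illustration only: the feature sets and the scoring rule are invented for the purpose, not taken from Rosch’s experimental work.

```python
# Toy sketch of prototype theory: category membership is graded, not
# all-or-nothing. The features and exemplars here are illustrative.

BIRD_PROTOTYPE = {"beak", "wings", "feathers", "flies"}

def prototypicality(features: set) -> float:
    """Fraction of the category's prototypical features an exemplar shares."""
    return len(features & BIRD_PROTOTYPE) / len(BIRD_PROTOTYPE)

robin = {"beak", "wings", "feathers", "flies"}
penguin = {"beak", "wings", "feathers", "swims"}  # can't fly, but still a bird

print(prototypicality(robin))    # 1.0  - strongly prototypical
print(prototypicality(penguin))  # 0.75 - weakly prototypical
```

The point the sketch makes is that the penguin’s lower score doesn’t expel it from the category; folk categories tolerate partial matches, which is exactly why their boundaries are fuzzy.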

traditional vs progressive education

Traditional and progressive education are both types of education, obviously, so they have some prototypical features in common – teachers, learners, knowledge, schools etc. Robert proposes some core features of progressive education that differentiate it from traditional education; it is child-centred, focuses on skills rather than knowledge, sees strict discipline and moral education as oppressive and assumes that socio-economic background dictates success (pp. 5-8). He distilled these features from what’s been said and written about progressive education over the last fifty years, so it’s likely there’s a high degree of consensus on these core themes. The same might not be true for traditional education. Robert defines it only in terms of its core characteristics being the polar opposite of progressive education, although he appears to include in the category ‘traditional’ a list of other more peripheral features including blazers, badges and ties and class rankings.

Robert says “though imprecise, categories are vital in allowing discussion to take place.” No doubt about that, but if the categories are imprecise the discussion can be distinctly unfruitful. A lot of time and energy can be expended trying to figure out precise definitions and how accurately those definitions map onto the real world. Nor are imprecise categories helpful if we want to do something with them other than have a discussion. Categorising education as ‘traditional’ or ‘progressive’ is fine for referring conversationally to a particular teacher’s pedagogical approach or the type of educational philosophy favoured by a government minister, but those constructs are too complex and too imprecise to be of use in research.

implicit assumptions

An implicit assumption is, by definition, an assumption that isn’t made explicit. Implicit assumptions are sneaky things because if they are used in a discussion, people following the argument often overlook the fact that an implicit assumption is being made. An implicit assumption that’s completely wrong can easily slip by unnoticed. Implicit assumptions get even more sneaky; often the people making the argument aren’t aware of their implicit assumptions either. In the case of mushrooms and toadstools, any biologists who tried to group certain types of fungi into one or other of these categories would be on a hiding to nothing because of an implicit, but wrong, assumption that the fungi could be sorted into one or other of these categories.

Robert’s thesis appears to rest on an implicit assumption that because the state education system in the last fifty years has had shortcomings, some of them serious, and because progressive educational ideas have proliferated during the same period, it follows that progressive ideas must be the cause of the lack of effectiveness. This isn’t even the ever-popular ‘correlation equals causality’ error, because as far as I can see, Robert hasn’t actually established a correlation between progressive ideas and educational effectiveness. He can’t compare current traditional and progressive state schools because traditional state schools are a thing of the past. And he can’t compare current progressive state schools with historical traditional state schools because the relevant data isn’t available. Ironically, what data we do have suggest that numeracy and literacy rates have improved overall during this period. The reliability of the figures is questionable because of grade drift, but numeracy and literacy rates have clearly not plummeted.

What he does implicitly compare is state schools that he sees as broadly progressive, with independent schools that he sees as having “withstood the wilder extremes of the [progressive] movement”. The obvious problem with this comparison is that a progressive educational philosophy is not the only difference between the state and independent sectors.

In my previous post, I agreed with Robert that the education system in England leaves much to be desired, but making an implicit assumption that there’s only one cause and that other possible causes can be ignored is a risky approach to policy development. It would be instructive to compare schools that are effective (however you measure effectiveness) with schools that are less effective, to find out how the latter could be improved. But the differences between them could boil down to some very specific issues relating to the quality of teaching, classroom management, availability of additional support or allocation of budgets, rather than whether the schools take a ‘traditional’ or ‘progressive’ stance overall.

References
Greene, M.R. & Fei-Fei, L. (2014). Visual categorization is automatic and obligatory: Evidence from Stroop-like paradigm. Journal of Vision, 14, article 14.
Haxby, J.V., Gobbini, M. I. & Montgomery, K. (2004). Spatial and temporal distribution of face and object representations in the human brain. In M. S. Gazzaniga (Ed.) The Cognitive Neurosciences (3rd edn.). Cambridge, MA: MIT Press.
Kuhl, P. (2004). Early language acquisition: Cracking the speech code. Nature Reviews Neuroscience, 5, 831-843.
Younger, B. (2003). Parsing objects into categories: Infants’ perception and use of correlated attributes. In Rakison & Oakes (Eds.), Early Category and Concept Development: Making Sense of the Blooming, Buzzing Confusion. Oxford University Press.

no comparison: Progressively Worse

My children’s (relatively recent) experience of the education system was at times perplexing. The curriculum didn’t seem systematic, rigorous or engaging – a bad combination. Teachers didn’t seem to understand why they did what they did – the younger ones, anyway. The older ones rolled their eyes and told me how long they had to go before retirement. ‘Zero-tolerance’ of poor behaviour amounted to stringent sanctions for having the ‘wrong’ hairstyles, but no action on low-level disruption in the classroom. High aspirations took the form of a big push to get borderline children over the ‘average’ threshold for SATs, but left the gifted and talented bored and those with SEN floundering. Did I attribute these phenomena to progressive education? No. I attributed them to a fragmented curriculum, inadequate teacher training, poor behaviour management and a lack of understanding on the part of central government about how systems work, all of which are possible whether progressive or traditional teaching methods are being deployed. In fact the most perplexing school my kids attended didn’t look in the least progressive. The curriculum was inflexible, the teachers were inflexible, there were a lot of rewards and sanctions and an intense focus on test results.

educational reform

Then I started to hear talk of reforming the curriculum, reforming teacher training, giving teachers more professional freedom, improving behaviour to allow teachers to teach, getting rid of the target culture, and of an evidence-based education system. My hopes were raised, but not for long; what we appear to be heading towards instead is a differently-fragmented curriculum, little or no teacher training, shifting the blame for poor behaviour onto parents, changing the targets and an interesting approach to using evidence. It’s the evidence bit that’s really got to me, which is why I’ve been critical of the ‘new traditionalists’ rather than the education system they too are complaining about.

traditional vs progressive

In Progressively Worse Robert Peal predicted that educational commentators would accuse him of a ‘polarising rhetoric’ that establishes ‘false dichotomies’ (p.8). I’m one of them. In his second response to his critics, he tackles the issue of the false dichotomy.

Robert says “A false dichotomy is an either/or choice where some middle ground is actually possible. At no point in Progressively Worse do I offer an either/or choice between progressive and traditional education.” Well, that’s one definition. A false dichotomy can also be something presented as a dichotomy when other options are available – two categories might not be enough. How people form categories is worth exploring in more depth, but in this post I want to ask what progressive education or progressive schools are being compared to.

In his introduction to Progressively Worse Robert identifies four core themes that he says characterise progressive education. It is child-centred, focuses on skills rather than knowledge, sees strict discipline and moral education as oppressive and assumes that socio-economic background dictates success (pp. 5-8). The implication is that traditional education is characterised by the opposites. But Robert doesn’t see progressive and traditional education as either/or choices with no middle ground. He says;

“Such dichotomies (skills/knowledge, child-centred/teacher-led) are perhaps better thought of as sitting at opposite ends of a spectrum. If we are to decide what constitutes a sensible position on each spectrum, we need to appreciate better how far British schools currently gravitate towards the progressive ends. Whilst a wholesale move towards traditionalist modes of education would be harmful, a corrective shift in that direction is desperately needed.” (p.8)

Although this sounds plausible, there’s a problem inherent in this model. Let’s assume that there’s general agreement that Robert’s four core themes do indeed characterise a construct we call ‘progressive education’. Let’s also assume that each of these four themes has been operationalised – we’ve identified what features of a school indicate where they lie on the sliding scale for each of the core themes. Some schools are going to rate high for progressive on each spectrum, or low for progressive on each. Others are going to be somewhere in the middle. But it would still be possible for a particular school to be, say, teacher-led, but focus on skills rather than knowledge, and to have strict discipline but also believe that socio-economic background dictates success – in short, to be strongly progressive on two of the sliding scales but strongly traditional on the other two.

Such a school wouldn’t occupy a ‘sensible position on each spectrum’, but extreme, opposing positions on the different spectra, making it impossible to determine whether the school as a whole could be described as progressive or traditional. And if we can’t decide whether a school is progressive or traditional, it makes it difficult to compare the performance of different types of school – the idea at the heart of Robert’s thesis.
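The problem with collapsing four independent spectra into one label can be made concrete with a small sketch. The dimension names paraphrase Robert’s four themes, and the scores and thresholds are invented for illustration; nothing here is an operationalisation anyone has actually carried out.

```python
# Sketch: each dimension scored 0 (fully traditional) to 1 (fully
# progressive); an overall label is derived from the simple mean.
# Dimension names and scores are hypothetical.

def overall_label(scores: dict, low: float = 0.33, high: float = 0.67) -> str:
    """Collapse several dimensions into one label via the mean score."""
    mean = sum(scores.values()) / len(scores)
    if mean < low:
        return "traditional"
    if mean > high:
        return "progressive"
    return "mixed"

# A school at extreme, opposing ends of the four spectra:
school = {
    "child_centred": 0.1,               # strongly teacher-led
    "skills_over_knowledge": 0.9,       # strongly skills-focused
    "discipline_seen_as_oppressive": 0.1,  # strict discipline
    "background_dictates_success": 0.9,
}

# The mean is 0.5, so the school is labelled "mixed" - exactly the
# variation that matters for comparison is hidden by the single label.
print(overall_label(school))  # mixed
```

A single progressive/traditional label, in other words, is only as informative as the least-varying school it describes; for schools like the hypothetical one above it conveys almost nothing.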

no comparison

Let’s assume we’ve overcome those methodological hurdles and we’ve found a group of schools that are indisputably ‘progressive’. What do we compare them to? In his response to accusations of cherry-picking, Robert says

“I warrant that any historian writing a counter-narrative to Progressively Worse would have a difficult time finding any cherries worth picking. No seminal government document of the period exists which was as traditionalist as Plowden was progressive.”

The overwhelming impression one gets from Robert’s book is that the march of progressivism between 1960 and 2010 was so relentless that there are no ‘traditional’ state schools left, so a comparison in terms of how progressive/traditional specific schools are and the effectiveness of their educational methods can’t be made.

How about comparing current progressive state schools to pre-war ones that were more likely to be traditional? When I asked Robert about this in a comment on his post, he agreed that suitable data weren’t available. We don’t have comparable data on numeracy and literacy, for example, prior to 1948.

The only schools left with which a comparison could be made are those within the independent sector; in his book Robert describes them as being largely “immune to the winds of educational change” and concludes that “they have withstood the wilder extremes of the [progressive] movement.” The problem with making a comparison with independent schools is of course that there are confounding factors involved, such as selection, socio-economic background, parental educational attainment and educational support at home. A comparison wouldn’t be impossible, but it would be a major challenge and because of the confounding factors, the results wouldn’t be robust.

Robert concludes in response to my question about comparisons;

“Anyway, I think you have misunderstood the title, and therefore argument, of Progressively Worse. I am not suggesting that everything was hunky dory until 1965, and schools got ‘progressively worse’. As I write in the introduction, ‘This book is not a call to return to some distant glory, and the world of blackboards, canes and the 11+ is not the future that it proposes.’ What I do argue is that schools which embrace the principles of progressive education are worse. So far as it exists, the historical evidence for this case is compelling.”

He still doesn’t say what progressive schools are worse than. His perception of them as ‘worse’ doesn’t appear to be derived from an evidence-based comparison between real schools, but from historical evidence showing that some progressive schools had to be closed because they were so awful, and that some other progressive schools have low GCSE results. Those are bad things, to be sure, but unless we have comparable data on the closure of traditional state schools or their exam results, we’re not actually making a comparison.

At one level, I have some sympathy for new traditionalists like Robert; I’d like to see a coherent curriculum, more pedagogical rigour, more freedom for teachers to teach and better behaviour in schools. At another level, I’m nonplussed by why he identifies progressive ideas as the main cause of the education system’s shortcomings, and what he presents as ‘evidence’ supporting the need to replace progressive education with … what exactly? Robert doesn’t say, but it’s difficult to avoid the impression that he thinks state schools modelling themselves on independent schools might be the way forward. I agree that the education system in England leaves a good deal to be desired, but that could be due to a badly designed curriculum, inadequate teacher training, poor behaviour management and a lack of government understanding of how systems function, rather than progressive ideas. Modelling state schools on independent schools could still fail to address all of those issues. My concern is that if the evidence being used to justify such a change is derived from poorly defined constructs that aren’t operationalised, the absence of data, and no attempt to eliminate bias, we will simply be spending a lot of money replacing one opinion-based education system with another. We’ve been doing that since 1944 and look where it’s got us. I’m still perplexed.

the venomous data bore

Robert Peal has posted a series of responses to critics of his book Progressively Worse here. The second is on ‘data and dichotomies’. In this post I want to comment on some of the things he says about data and evidence.

when ‘evidence doesn’t work’

Robert* refers back to a previous post entitled ‘When evidence doesn’t work’ summarising several sessions at the ResearchED conference held at Dulwich College last year. He rightly draws attention to the problem of hard-to-measure outcomes, and to which outcomes we decide to measure in the first place. But he appears to conclude that there are some things – ideology, morality, values – that are self-evidently good or bad and that are outside the remit of evidence.

In his response to critics, Robert claims that one reason ‘evidence doesn’t work’ is because “some of the key debates in education are based on value judgements, not efficacy.” This is certainly true – and those key debates have resulted in a massive waste of resources in education over the past 140 years. There’s been little consensus on what long-term outcomes people want from the education system, what short-term outcomes they want, what pedagogies are effective and how effectiveness can be assessed. If a decision as to whether Shakespeare ‘should’ be studied at GCSE is based on value judgements it’s hardly surprising it’s been the subject of heated debate for decades. Robert’s conclusion appears to be that heated debate about value judgements is inevitable because values aren’t things that lend themselves to being treated as evidence. I disagree.

data

I think he draws this conclusion because his view of data is rather limited. Data don’t just consist of ‘things we can easily measure’ like exam results (Robert’s second reason why ‘evidence doesn’t work’). They don’t have to involve measuring things at all; qualitative data can be very informative. Let’s take the benefits of studying Shakespeare in school. Robert asks “Can an RCT tell us, for example, whether secondary school pupils benefit from studying Shakespeare?” If it was carefully controlled it could, though we would have to tackle the question of what outcomes to measure. But randomised controlled trials are only one of many methods for gathering data. Collecting qualitative data from a representative sample of the population about the impact studying Shakespeare had had on their lives could give some insights, not only into whether Shakespeare should be studied in school, but how his work should be studied. And whether people should have the opportunity to undertake some formal study of Shakespeare in later life if they wanted to. People might appreciate actually being asked.

venomous data bore Buprestis octoguttata§

opinion

I don’t know whether Robert sees me as what he refers to as a ‘data bore’, but if he does I accept the epithet as a badge of honour. For the record however, not only have I never let a skinny latte pass my lips, but the word ‘nuanced’ has never done so either (not in public, at least). Nor do I have a “lofty distain for anything so naïve as ‘having an opinion’”.

I’m more than happy for people to have opinions and to express them and for them to be taken into account when education policy is being devised. But not all opinions are equal. They can vary between professional, expert opinion derived from a thorough theoretical knowledge and familiarity with a particular research literature, through well-informed personal opinion, to someone simply liking or not liking something but not having a clue why. I would not want to receive medical treatment based on a vox pop carried out in my doctor’s waiting room, nor do I want a public sector service to be designed on a similar basis. If it is, then the people who voice their opinions most loudly are likely to get what they want, leaving the rest of us, ‘data bores’ included, to work on the damage limitation.

rationality and values

Robert appears to have a deep suspicion of rationality. He says "rational man believes that they can make their way in the world without recourse to the murky business of ideology and morality, or to use a more contemporary term, ‘values’." He also says it was ‘terrific’ to hear Sam Freedman expound the findings of Jonathan Haidt and Daniel Kahneman "about the dominance of the subconscious, emotional part of our minds, over the logical, conscious part." He could add Antonio Damasio to that list. There’s little doubt that our judgement and decision-making are dominated by the subconscious, emotional part of our minds. That doesn’t mean it’s a good thing.

Ideology, morality and values can inspire people to do great things, and rationality can inflict appalling damage, but it’s not always like that. Every significant step that’s ever been taken towards reducing infant mortality, maternal mortality, disease, famine, poverty and conflict, and every technological advance ever made, has involved people using the ‘logical conscious part’ of their minds as well as, or instead of, the ‘subconscious emotional part’. Those steps have sometimes involved a lifetime’s painstaking work in the teeth of bitter opposition. In contrast, many of the victims of ideology, morality and values lie buried where they fell on the world’s battlefields.

Robert’s last point about data is that they are "simply not able to ‘speak for themselves’. Its voice is always mediated by human judgement." That’s not quite the impression given on page 4 of his book, where he refers to a list of statistics he felt showed there was a fundamental problem in British education. In the case of those statistics, ‘the bare figures are hard to ignore’.

Robert is quite right that the voice of the data is always mediated by human judgement, but we have devised ways of interpreting the data that make them less susceptible to bias. The data are perfectly capable of speaking for themselves, if we know how to listen to them. Clearly the researcher, like the historian, suffers from selection bias, but some fields of discourse, unlike history it seems, have developed robust methodologies to address that. The biggest problem faced by the data is that they can’t get a word in edgeways because of all the opinion being voiced.

endnote

According to this tweet from Civitas…

civitas venom

Robert says he has responded to criticism in blogs by Tim Taylor, Guy Woolnough and myself. I’m doubtless biased, but the comment most closely resembling ‘venom’ that I could find was actually in a scurrilous tweet from Debra Kidd, shown in Robert’s third response to his critics. Debra, shockingly for a teacher, uses a four-letter-word to describe Robert’s description of state schools as ‘a persistent source of national embarrassment’. She calls it ‘tosh’. If Civitas thinks that’s venom, it clearly has little experience of academia, politics or the playground. Rather worrying on all counts, if it’s a think tank playing a significant role in education reform.

* I felt we should be on first name terms now we’ve had a one-to-one conversation about statistics.

§ Image courtesy Christian Fischer from Britannica Kids.

It’s not really a venomous data bore; it’s a Metallic wood-boring beetle. It’s not really metallic either; it just looks like it. Nor does the beetle bore wood; its larvae do. Words can be so misleading.

the history of reading methods revisited (4)

And here’s Maggie’s response to my comments, which are in italics.

On reflection, I think I could have signposted the key points I wanted to make more clearly in my post. My reasoning went like this;
1. Until the post-war period reading methods in the UK were dominated by alphabetic/phonics approaches.
2. Despite this, a significant proportion of children didn’t learn to read properly.
3. Current concerns about literacy levels don’t have a clear benchmark – what literacy levels do we expect and why?
4. Although literacy levels have fallen in recent years, the contribution of ‘mixed methods’ to this fall is unclear; other factors are involved.
A few comments on Maggie’s post:
Huey and reading methods
My observation about the use of alphabetic and analytic phonics approaches in the early days of state education in England is based on a fair number of accounts I’ve either heard or read from people who were taught to read in the late 19th/early 20th century. Without exception, they have reported;
• learning the alphabet
• learning letter-sound correspondences
• sounding out unfamiliar words letter-sound by letter-sound

This accords with the account I proposed: that phonics methods persisted in the UK for the early decades of the 20th century. I’d also note, as I have on the RRF board, that my account was something of a gallop through the topic. It was bound to be broad-brush rather than detailed. Of course a variety of practices will have obtained at any period (as they do now), but I was trying to indicate what appeared to be the ‘dominant’ practice at any one time.

I’m well aware that the first-hand accounts I’ve come across don’t form a representative sample, but from what Maggie has distilled from Huey, the accounts don’t appear to be far off the mark for what was happening generally. I concede that sounding out unfamiliar words doesn’t qualify as ‘analytic phonics’, but it’s analytic something – analytic letter-sound correspondence, perhaps?

Modern definitions of ‘analytic’ phonics make it clear that children are taught whole words initially and the words are then ‘analysed’ for their phonic structure. This may not necessarily be at the level of the phoneme; analytic phonics may also include analysis at the syllable level and at ‘onset/rime’ level (the familiar ‘word families’). This practice would seem to be more allied to the Word method (recall that Huey said that phonics could be taught once children had learned to read) than to the ‘Alphabetic’ method. Though, to be honest, it is very difficult to work out from contemporary primers and accounts of instructing/learning reading just how the Alphabetic method was taught. When accounts speak of ‘learning letters’, are letter names being taught, or sound values? When they talk of ‘spelling’ words, are they referring to actually writing words, to saying letter names followed by the whole word (‘see, ay, tee – cat’), or to orally sounding out and blending? Certainly reading primers such as ‘Reading Without Tears’, first published 183?*, are arranged in much the same way as a modern ‘decodable’ book.

However, if the Phonic method which Huey describes is anything like the method Rebecca Pollard outlines (‘Manual of Synthetic Reading and Spelling’, 1897), it is closely akin to the supposedly ‘new’ SP method, in that it taught letter/sound correspondences, decoding and blending, from simple to complex, as did the method outlined by Nellie Dale (‘On the Teaching of English Reading’, 1898).

Montessori
I cited Montessori as an example of the Europe-wide challenge posed by children who struggled at school; I wasn’t referring to her approach to teaching reading specifically. In her book she frequently mentions Itard and Séguin who worked with hearing-impaired children. She applies a number of their techniques, but doesn’t appear to agree with them about everything – she questions Séguin’s approach to writing, for example.

In which case I misunderstood your reason for citing her; I thought it was specifically in relation to teaching reading. Her sections on teaching reading and writing are very interesting. What is striking is that she believed in the ‘developmental’ model, agreeing with Huey’s contention that children should not be taught to read before they were at least 6. She describes how she tried very hard to resist younger children’s appeals to be taught to read and write, but found that after motor-skills training with letter shapes some of them were self-teaching anyway, and delighted with their achievements!

Frank Smith
I haven’t read Smith, but the fact that skilled readers use context and prediction to read the words on the page wasn’t his ‘proposal’. By the 1970s it was a well-documented feature of contextual priming in skilled readers, i.e. skilled adult readers with large spoken vocabularies. From what Maggie has said, the error Smith appears to have made is to assume that children could learn by mimicking the behaviour of experts – a mistake that litters the history of pedagogy.

Indeed, he was echoing much earlier theorists, such as Huey, in this belief and, of course, by the time he was writing many readers may have been using such strategies because of being taught by Word methods (I’m sticking to my hypothesis!). I can’t find that he has any evidence for his assertion and, as I pointed out, Stanovich and West disproved his theory.

Hinshelwood and Orton
Hinshelwood was a British ophthalmologist interested in reading difficulties caused by brain damage. Orton was American, but was a doctor also interested in brain damage and its effect on reading. I can’t see how the work of either of them would have been affected by the use of Whole Word reading methods in US schools, although their work has frequently been referred to as an explanation for reading difficulties.

Orton’s interest famously extended beyond brain-damaged subjects to the study of non-brain-damaged subjects with ‘dyslexia’. At the time he was working, Word methods were predominant in US schools and he implicated these methods as contributing to his subjects’ problems. The Orton-Gillingham structured, systematic phonics programme was developed for helping these dyslexics. It appears to have been innovatory for its period and, believe it or not, from online contacts with US practitioners I understand that because it is SSP it is still fairly contentious in the US today! They express the same frustrations as do SP proponents: if only children were taught the OG way there wouldn’t be so much reading failure in the US!

I am not familiar with Hinshelwood but it’s clear that I shall have to look him up!

the rejection of the alphabetic principle
Maggie says my statement that the alphabetic principle and analytic phonics had been abandoned because they hadn’t been effective for all children ‘makes no sense at all’. If I’m wrong, why were these methods abandoned?

I still don’t think it makes any sense. For a start, you give no time scale. When did this abandonment take place? And you are conflating Alphabetic with Analytic which I don’t think is correct (see my earlier comment).

Another point is that you are crediting educationists and teachers with a degree of rationality which I don’t think is justified. The widespread acceptance of the Word method, which had no evidence to back it but strong appeals to ‘emotion’ with the language of its denigration of Phonic methods, is a case in point. Boring, laborious, ‘drill & kill’, barren, mechanical, uncomprehending, the list is long (and very familiar). It is a technique promoted today as ‘framing’ (though I might acquit its original users of deliberate use of it). Very easy to be persuaded by the language without really considering the validity of the method it purports to describe.

And, of course, there was the lure of modernity. Word methods were advocated by modern educationists as part of progressive educational methods (but let’s not get into an argument about ‘progressive’). I don’t know how much teachers believed that there was some sort of research base for progressive methods, but as Huey sets some store by research (pages and pages on eye movements, for example) and does have an evidence base for some of what he says, I would suggest that it would be taken on trust that it was all evidence-based. I would also suggest that the discourse of ‘science’, ‘research’ and ‘progressive’ would be enough to convince many without them delving too deeply into the evidence. Brain Gym, anybody?

In addition, though my suggestion that ‘official’ advice was followed has been questioned, it might be noted that, in respect of the post-WW2 UK, both the government committee of 1947 and the Bullock Report (1975) firmly endorsed a mixed-methods approach which started from Whole Word and taught phonics if necessary.

It is also interesting that Bullock notes that increasing numbers of children, particularly ‘working class’ children, were entering Junior school (Y2) unable to read. Might one ascribe this to developmentalist theory?

using a range of cues
The cues I listed are those identified in skilled adult readers in studies carried out predominantly in the post-war period. Maggie’s hypothesis is that the range of cues is an outcome of the way the participants in experiments (often college students) had been taught to read. It’s an interesting hypothesis; it would be great to test it.

I stand by it! I have worked with too many children who read exactly as taught by the Searchlights!

I thought I would revisit these ‘cues’, which are supposed to have offered sufficient exposure to auditory and visual patterns to develop automated, fast recognition. They are ‘recognising words by their shape, using key letters, grammar, context and pictures’.

recognising words by their shape, Confounded at once by the fact that many words have the same shape: sack, sick, sock, suck, lack, lick, luck, lock, pock, pick, puck, pack.

using key letters, Would those be the ones that differentiate each word in the above word list?

grammar, Well, I can see how you might ‘predict’ a particular grammatical word form, noun, verb, adjective etc. but the specific word? By what repeated pattern would you develop automatic recognition of it?

context, I think the same might apply as for grammar. You need a mechanism for recognising the actual word.

pictures, Hm. Very useful for words like oxygen, air, the, gritty, bang, etc.

An alternative hypothesis is that the strategies used by skilled adult readers are an outcome of how brains work. Prior information primes neural networks and thus reduces response time, and frequent exposure to auditory and visual patterns such as spoken and written words results in automated, fast recognition.

In view of Stanovich & West’s findings I would be interested to see any studies which show that skilled adult readers did use the ‘cues’ you listed. (as above)

I know we have had discussions about the term ‘natural’, but ultimately reading is a taught skill. If readers use strategies which can be directly related to the strategies they were taught, I cannot see why they should be ascribed to untaught and unconscious exploitation of the brain’s capabilities. I could only accept this hypothesis in the case of self-taught readers. I would be surprised to find the generality of beginning readers developing such strategies spontaneously (i.e. undirected/untaught) when presented with text, though some outliers might. What would you do if presented with a page of unfamiliar script – Hebrew, Arabic, Thai, Chinese – and told to read it without any help whatsoever? And you are 5 years old.

For example, in chapter 2 of Stanovich’s book, West and Stanovich report fluent readers’ performance being facilitated by two automated processes: sentence context (essentially semantic priming) and word recognition.

I appreciate that, but this is described as a feature of fluent, skilled reading. To assume that beginning readers do this spontaneously might be to fall into the same trap as assuming ‘that children could learn by mimicking the behaviour of experts’.

According to chapter 3, fluent readers use phonological recoding if automated word recognition fails.

Isn’t that the whole point? Fluent readers didn’t use context, or other ‘cues’, to identify unfamiliar words; they used phonological recoding.

It is also moot that they use context to predict upcoming words (although I do understand about priming effects). There is also the possibility that rapid, automatic and unconscious decoding is the mechanism of automatic word recognition (Dehaene), possibly with context confirming that the word is correct. A reading sequence of ‘predicting’, then, presumably, checking for correctness of form and meaning (how? by decoding and blending?) seems like a strange use of processing when decoding gets the form of the word right straight away and immediately activates meaning.

educators’ reasoning
I wasn’t saying that the educators’ assessment of alphabetic/phonics methods was right, just that it was what they claimed. Again, if they didn’t think that, why would alphabetic/phonics methods have been abandoned?

See above!

falling literacy standards
The data that I suggested weren’t available would enable us to make a valid comparison between the literacy levels of school-leavers (aged 13, say) at the beginning of the 20th century when alphabetic/phonics methods were widely used in the UK, and current levels for young people of the same age. The findings Maggie has cited are interesting, but don’t give us a benchmark for the literacy levels we should expect.

There is some post WW2 data in the Bullock report though it is held to be not totally reliable. However, it finds that ‘reading standards’ rose from 1948 to 1961 but then fell back slightly from 1961 to 1971. Make of that what you will!

national curriculum and standardised testing
The point I was trying to make was not about the impact of the NC and SATs on reading, but that the NC and SATs made poor readers more obvious. In the reading-ready era, some children not reading at 7 would have learned to read by the time they were 11, but that delay wouldn’t have appeared in national statistics.

As, indeed, it appeared to be doing in Bullock (see above).

reading for enjoyment
Children leaving school without functional literacy is certainly a cause for concern, and I agree that methods of teaching reading must be implicated. But technological changes since 1990 haven’t helped. The world of young people is not as text-based as it used to be, and not as text-based as the adult world. That issue needs to be addressed.

Which, as you might guess, I would partially ascribe to adoption of Whole Word, Whole Language & Mixed Methods. I have watched the ‘simplification’ of text over my lifetime in the cause of ‘including’ the semi-literate.

I think there’s a political element too, in the rejection of ‘elite’ language (aka ‘big words’). I shall have to dig out my copy of ‘The Uses of Literacy’ I think, to see what literacy expectations there were of the 50’s generation. Could be instructive.

What I do find interesting, and perhaps pertinent to the question of ‘dumbing down’ being discussed in other Twitter conversations, is that, although we don’t really know what percentage of the population were literate in the latter half of the 19th century and the early 20th century, popular texts and the media appear to have expected a far more complex vocabulary knowledge, and an ability to comprehend far more complex syntax, of those who could read, even of children. Compare, for example, Beatrix Potter with ORT.

Note:
Huey, Dewey & Louie are the names of Donald Duck’s three nephews.
There’s no Louie in this story yet.

Perhaps Walt was taught the rhetorical ‘rule of three’!

It’s sad that we don’t have a Louie (or a Lewie) to complete the triumvirate. They would trip so nicely off the tongue…