And here’s Maggie’s response to my comments, which are in italics.
On reflection, I think I could have signposted the key points I wanted to make more clearly in my post. My reasoning went like this:
1. Until the post-war period, reading methods in the UK were dominated by alphabetic/phonics approaches.
2. Despite this, a significant proportion of children didn’t learn to read properly.
3. Current concerns about literacy levels don’t have a clear benchmark – what literacy levels do we expect and why?
4. Although literacy levels have fallen in recent years, the contribution of ‘mixed methods’ to this fall is unclear; other factors are involved.
A few comments on Maggie’s post:
Huey and reading methods
My observation about the use of alphabetic and analytic phonics approaches in the early days of state education in England is based on a fair number of accounts I’ve either heard or read from people who were taught to read in the late 19th/early 20th century. Without exception, they have reported:
• learning the alphabet
• learning letter-sound correspondences
• sounding out unfamiliar words letter-sound by letter-sound
This accords with the account I proposed: that phonics methods persisted in the UK through the early decades of the 20th century. I’d also note, as I have on the RRF board, that my account was something of a gallop through the topic. It was bound to be broad-brush rather than detailed. Of course a variety of practices will have obtained at any period (as they do now), but I was trying to indicate what appeared to be the ‘dominant’ practice at any one time.
I’m well aware that the first-hand accounts I’ve come across don’t form a representative sample, but from what Maggie has distilled from Huey, the accounts don’t appear to be far off the mark for what was happening generally. I concede that sounding out unfamiliar words doesn’t qualify as ‘analytic phonics’, but it’s analytic something – analytic letter-sound correspondence, perhaps?
Modern definitions of ‘analytic’ phonics make it clear that children are taught whole words initially and the words are then ‘analysed’ for their phonic structure. This may not necessarily be at the level of the phoneme; analytic phonics may also include analysis at the syllable level and at ‘onset/rime’ level (the familiar ‘word families’). This practice would seem to be more allied to the Word method (recall that Huey said that phonics could be taught once children had learned to read) than to the ‘Alphabetic’ method. Though, to be honest, it is very difficult to work out from contemporary primers and accounts of instructing/learning reading just how the Alphabetic method was taught. When accounts speak of ‘learning letters’, are letter names being taught or sound values? When they talk of ‘spelling’ words, are they referring to actually writing words, or to saying letter names followed by the whole word (see ay tee – cat), or to orally sounding out and blending? Certainly reading primers such as ‘Reading Without Tears’, first published 183?*, are arranged in much the same way as a modern ‘decodable’ book.
However, if the Phonic method which Huey describes is anything like the method Rebecca Pollard outlines (‘Manual of Synthetic Reading and Spelling’, 1897), it is closely akin to the supposedly ‘new’ SP method in that it taught letter/sound correspondences, decoding and blending, from simple to complex, as did the method outlined by Nellie Dale (‘On the Teaching of English Reading’, 1898).
I cited Montessori as an example of the Europe-wide challenge posed by children who struggled at school; I wasn’t referring to her approach to teaching reading specifically. In her book she frequently mentions Itard and Séguin who worked with hearing-impaired children. She applies a number of their techniques, but doesn’t appear to agree with them about everything – she questions Séguin’s approach to writing, for example.
In which case I misunderstood your reason for citing her. I thought it was specifically in relation to teaching reading. Her sections on teaching reading and writing are very interesting. What is striking is that she believed in the ‘developmental’ model, agreeing with Huey’s contention that children should not be taught to read before they were at least 6. She describes how she tried very hard to resist younger children’s appeals to be taught to read and write but found that, after motor skills training with letter shapes, some of them were self-teaching anyway and delighted with their achievements!
I haven’t read Smith, but the fact that skilled readers use context and prediction to read the words on the page wasn’t his ‘proposal’. By the 1970s it was a well-documented feature of contextual priming in skilled readers, i.e. skilled adult readers with large spoken vocabularies. From what Maggie has said, the error Smith appears to have made is to assume that children could learn by mimicking the behaviour of experts – a mistake that litters the history of pedagogy.
Indeed, he was echoing much earlier theorists, such as Huey, in this belief and, of course, by the time he was writing many readers may have been using such strategies because of being taught by Word methods (I’m sticking to my hypothesis!). I can’t find that he has any evidence for his assertion and, as I pointed out, Stanovich and West disproved his theory.
Hinshelwood and Orton
Hinshelwood was a British ophthalmologist interested in reading difficulties caused by brain damage. Orton was an American doctor also interested in brain damage and its effect on reading. I can’t see how the work of either of them would have been affected by the use of Whole Word reading methods in US schools, although their work has frequently been referred to as an explanation for reading difficulties.
Orton’s interest famously extended beyond brain-damaged subjects to the study of non-brain-damaged subjects with ‘dyslexia’. At the time he was working, Word methods were predominant in US schools and he implicated these methods as contributing to his subjects’ problems. The Orton-Gillingham structured, systematic phonics programme was developed for helping these dyslexics. It appears to have been innovatory for its period and, believe it or not, from online contacts with US practitioners I understand that, because it is SSP, it is still fairly contentious in the US today! They express the same frustrations as do SP proponents. If only children were taught the OG way there wouldn’t be so much reading failure in the US!
I am not familiar with Hinshelwood but it’s clear that I shall have to look him up!
the rejection of the alphabetic principle
Maggie says my statement that the alphabetic principle and analytic phonics had been abandoned because they hadn’t been effective for all children ‘makes no sense at all’. If I’m wrong, why were these methods abandoned?
I still don’t think it makes any sense. For a start, you give no time scale. When did this abandonment take place? And you are conflating Alphabetic with Analytic which I don’t think is correct (see my earlier comment).
Another point is that you are crediting educationists and teachers with a degree of rationality which I don’t think is justified. The widespread acceptance of the Word method, which had no evidence to back it but a strong appeal to ‘emotion’ in the language used to denigrate Phonic methods, is a case in point. Boring, laborious, ‘drill & kill’, barren, mechanical, uncomprehending – the list is long (and very familiar). It is a technique known today as ‘framing’ (though I might acquit its original users of deliberate use of it). It is very easy to be persuaded by the language without really considering the validity of the method it purports to describe.
And, of course, there was the lure of modernity. Word methods were advocated by modern educationists as part of progressive educational methods (but let’s not get into an argument about ‘progressive’). I don’t know how much teachers believed that there was some sort of research base for progressive methods, but as Huey sets some store by research (pages and pages on eye movements, for example) and does have an evidence base for some of what he says, I would suggest that it would be taken on trust that it was all evidence based. I would also suggest that the discourse of ‘science’, ‘research’, ‘progressive’ would be enough to convince many without them delving too deeply into the evidence. Brain Gym, anybody?
In addition, though my suggestion that ‘official’ advice was followed has been questioned, it might be noted that, in respect of the post-WW2 UK, both the government committee of 1947 and the Bullock Report (1975) firmly endorsed a mixed-methods approach which started from Whole Word and taught phonics if necessary.
It is also interesting that Bullock notes that increasing numbers of children, particularly ‘working class’ children, were entering Junior school (Y2) unable to read. Might one ascribe this to developmentalist theory?
using a range of cues
The cues I listed are those identified in skilled adult readers in studies carried out predominantly in the post-war period. Maggie’s hypothesis is that the range of cues is an outcome of the way the participants in experiments (often college students) had been taught to read. It’s an interesting hypothesis; it would be great to test it.
I stand by it! I have worked with too many children who read exactly as taught by the Searchlights!
I thought I would revisit these ‘cues’, which are supposed to have offered sufficient exposure to auditory and visual patterns to develop automated, fast recognition. They are: ‘recognising words by their shape, using key letters, grammar, context and pictures’.
• recognising words by their shape – confounded at once by the fact that many words have the same shape: sack, sick, sock, suck; lack, lick, luck, lock; pock, pick, puck, pack.
• using key letters – would those be the ones that differentiate each word in the above list?
• grammar – well, I can see how you might ‘predict’ a particular grammatical word form (noun, verb, adjective etc.), but the specific word? By what repeated pattern would you develop automatic recognition of it?
• context – I think the same might apply as for grammar. You need a mechanism for recognising the actual word.
• pictures – hm. Very useful for words like oxygen, air, the, gritty, bang, etc.
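The word-shape objection above can be made concrete with a toy sketch (my own illustration, not from either post): if we classify lowercase letters coarsely as ascenders, descenders or x-height letters – the ‘envelope’ that shape-recognition is supposed to rely on – each group of four words in the list collapses to an identical outline, so shape alone cannot tell them apart.

```python
# Toy "word envelope" model: classify each letter as ascender (A),
# descender (D) or x-height (x), then build the word's coarse outline.
ASCENDERS = set("bdfhklt")
DESCENDERS = set("gjpqy")

def word_shape(word):
    """Return the coarse outline of a lowercase word, e.g. 'sack' -> 'xxxA'."""
    return "".join(
        "A" if ch in ASCENDERS else "D" if ch in DESCENDERS else "x"
        for ch in word
    )

words = ["sack", "sick", "sock", "suck",
         "lack", "lick", "luck", "lock",
         "pock", "pick", "puck", "pack"]

# Every word in each group of four produces the same outline,
# so "shape" offers no way to distinguish them.
for w in words:
    print(w, word_shape(w))
```

On this (deliberately crude) model, sack/sick/sock/suck all come out as `xxxA`, lack/lick/luck/lock as `AxxA`, and pock/pick/puck/pack as `DxxA` – which is exactly the ambiguity the list is meant to demonstrate.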
An alternative hypothesis is that the strategies used by skilled adult readers are an outcome of how brains work. Prior information primes neural networks and thus reduces response time, and frequent exposure to auditory and visual patterns such as spoken and written words results in automated, fast recognition.
In view of Stanovich & West’s findings I would be interested to see any studies which show that skilled adult readers did use the ‘cues’ you listed. (as above)
I know we have had discussions about the term ‘natural’, but ultimately reading is a taught skill. If readers use strategies which can be directly related to the strategies they were taught, I cannot see why they should be ascribed to untaught and unconscious exploitation of the brain’s capabilities. I could only accept this hypothesis in the case of self-taught readers. I would be surprised to find the generality of beginning readers developing such strategies spontaneously (i.e. undirected/untaught) when presented with text, though some outliers might. What would you do if presented with a page of unfamiliar script – Hebrew, Arabic, Thai, Chinese – and told to read it without any help whatsoever? And you are 5 years old.
For example, in chapter 2 of Stanovich’s book, West and Stanovich report fluent readers’ performance being facilitated by two automated processes: sentence context (essentially semantic priming) and word recognition.
I appreciate that, but this is described as a feature of fluent, skilled reading. To assume that beginning readers do this spontaneously might be to fall into the same trap as ‘assuming that children could learn by mimicking the behaviour of experts’.
According to chapter 3, fluent readers use phonological recoding if automated word recognition fails.
Isn’t that the whole point? Fluent readers didn’t use context, or other ‘cues’, to identify unfamiliar words; they used phonological recoding.
It is also moot that they use context to predict upcoming words (although I do understand about priming effects). There is also the possibility that rapid, automatic and unconscious decoding is the mechanism of automatic word recognition (Dehaene), possibly with context confirming that the word is correct. A reading sequence of ‘predicting’, then, presumably, checking for correctness of form and meaning (how? by decoding and blending?) seems like a strange use of processing when decoding gets the form of the word correct straight away and immediately activates meaning.
I wasn’t saying that the educators’ assessment of alphabetic/phonics methods was right, just that it was what they claimed. Again, if they didn’t think that, why would alphabetic/phonics methods have been abandoned?
falling literacy standards
The data that I suggested weren’t available would enable us to make a valid comparison between the literacy levels of school-leavers (aged 13, say) at the beginning of the 20th century when alphabetic/phonics methods were widely used in the UK, and current levels for young people of the same age. The findings Maggie has cited are interesting, but don’t give us a benchmark for the literacy levels we should expect.
There is some post WW2 data in the Bullock report though it is held to be not totally reliable. However, it finds that ‘reading standards’ rose from 1948 to 1961 but then fell back slightly from 1961 to 1971. Make of that what you will!
national curriculum and standardised testing
The point I was trying to make was not about the impact of the NC and SATs on reading, but that the NC and SATs made poor readers more obvious. In the reading-ready era, some children not reading at 7 would have learned to read by the time they were 11, but that delay wouldn’t have appeared in national statistics.
As, indeed, it appeared to be doing in Bullock (see above).
reading for enjoyment
Children leaving school without functional literacy is certainly a cause for concern, and I agree that methods of teaching reading must be implicated. But technological changes since 1990 haven’t helped. The world of young people is not as text-based as it used to be, and not as text-based as the adult world. That issue needs to be addressed.
Which, as you might guess, I would partially ascribe to adoption of Whole Word, Whole Language & Mixed Methods. I have watched the ‘simplification’ of text over my lifetime in the cause of ‘including’ the semi-literate.
I think there’s a political element too, in the rejection of ‘elite’ language (aka ‘big words’). I shall have to dig out my copy of ‘The Uses of Literacy’ I think, to see what literacy expectations there were of the 50’s generation. Could be instructive.
What I do find interesting, and perhaps pertinent to the question of ‘dumbing down’ being discussed in other Twitter conversations, is that, although we don’t really know what percentage of the population were literate in the latter half of the 19th century and the early 20th century, popular texts and the media appear to have expected a far more complex vocabulary knowledge, and an ability to comprehend far more complex syntax, of those who could read, even of children. Compare, for example, Beatrix Potter with ORT.
Huey, Dewey & Louie are the names of Donald Duck’s three nephews.
There’s no Louie in this story yet.
Perhaps Walt was taught the rhetorical ‘rule of three’!
It’s sad that we don’t have a Louie (or a Lewie) to complete the triumvirate. They would trip so nicely off the tongue.