the MUSEC briefings and Direct Instruction

Yesterday, I got involved in a discussion on Twitter about Direct Instruction (DI). The discussion was largely about what I had or hadn’t said about DI. Twitter isn’t the best medium for discussing anything remotely complex, but there’s something about DI that brings out the pedant in people, me included.

The discussion, if you can call it that, was triggered by a tweet about the most recent MUSEC briefing. The briefings, from Macquarie University Special Education Centre, are a great idea. A one-page round-up of the evidence relating to a particular mode of teaching or treatment used in special education is exactly the sort of resource I’d use often. So why the discussion about this one?

the MUSEC briefings

I’ve bumped into the briefings before. I read one a couple of years ago on the recommendation of a synthetic phonics advocate. It was briefing no.18, Explicit instruction for students with special learning needs. At the time, I wasn’t aware that ‘explicit instruction’ had any particular significance in education – other than denoting instruction that was explicit. And that could involve anything from a teacher walking round the room checking that students understood what they were doing, to ‘talk and chalk’, reading a book or computer-aided learning. The briefing left me feeling bemused. It was packed with implicit assumptions, and the references, presented online presumably for reasons of space, included one self-citation, a report that reached a different conclusion to the briefing, a 400-page book by John Hattie that doesn’t appear to reach the same conclusion either, and a paper by Kirschner, Sweller and Clark that doesn’t mention children with special educational needs. The references form a useful reading list for teachers, but hardly constitute robust evidence to support the briefing’s conclusions.

My curiosity piqued, I took a look at another briefing, no.33 on behavioural optometry. I chose it because the SP advocates I’d encountered tended to be sceptical about visual impairments being a causal factor in reading difficulties, and I wondered what evidence they were relying on. I knew a bit about visual problems because of my son’s experiences. The briefing repeatedly lumped together things that should have been kept distinct, and came to conclusions different from those of the evidence it cited. I think I was probably unlucky with these first two because some of the other briefings look fine. So what about the one on Direct Instruction, briefing no.39?

Direct Instruction and Project Follow Through

Direct Instruction (capitalized) is a scripted learning programme, now commercially available, developed by Siegfried Engelmann and Wesley Becker in the US in the 1960s. It performed outstandingly well in Project Follow Through (PFT).

The DI programme involved the scripted teaching of reading, arithmetic, and language to children between kindergarten and third grade. The PFT evaluation of DI showed significant gains in basic skills (word knowledge, spelling, language and math computation); in cognitive-conceptual skills (reading comprehension, math concepts, math problem solving); and in affect measures (co-operation, self-esteem, intellectual achievement, responsibility). A high school follow-up study by the sponsors of the DI programme showed that it was associated with positive long-term outcomes.

The Twitter discussion revolved around what I meant by ‘basic’ and ‘skills’. To clarify, as I understand it the DI programme itself involved teaching basic skills (reading, arithmetic, language) to quite young children (K-3). The evaluation assessed basic skills, cognitive-conceptual skills and affect measures. There is no indication in the evidence I’ve been able to access of how sophisticated the cognitive-conceptual skills or affect measures were. One would expect them to be typical of children in the K-3 age range. And we don’t know how long those outcomes persisted. The only evidence for long-term positive outcomes is from a study by the programme sponsors – not to be discounted, but not reliable enough to form the basis for a pedagogical method.

In other words, the PFT evaluation tells us that there were several robust positive outcomes from the DI programme. What it doesn’t tell us is whether the DI approach has the same robust outcomes if applied to other areas of the curriculum and/or with older children. Because the results of the evaluation are aggregated, it doesn’t tell us whether the DI programme benefitted all children or only some, or if it had any negative effects, or what the outcomes were for children with specific special educational needs or learning difficulties – the focus of MUSEC. Nor does it tell us anything about the use of direct instruction in general – what the briefing describes as a “generic overarching concept, with DI as a more specific exemplar”.

the evidence

The briefing refers to “a large body of research evidence stretching back over four decades testifying to the efficacy of explicit/direct instruction methods including the specific DI programs.” So what is the evidence?

The briefing itself refers only to the PFT evaluation of the DI programme. The references, available online, consist of:

• a summary of findings written by the authors of the DI programme, Becker & Engelmann,
• a book about DI – the first two authors were Engelmann’s students and worked on the original DI programme,
• an excerpt from the same book on a commercial site called education.com,
• an editorial from a journal called Effective School Practices, previously known as Direct Instruction News and published by the National Institute for Direct Instruction (Chairman S Engelmann),
• a paper about the different ways in which direct instruction is understood, published by the Center on Innovation and Improvement which is administered by the Academic Development Institute, one of whose partners is Little Planet Learning,
• the 400-page book referenced by briefing 18,
• the peer-reviewed paper also referenced by briefing 18.

The references, which I think most people would construe as evidence, include only one peer-reviewed paper. It cites research findings supporting the use of direct instruction in relation to particular types of material, but doesn’t mention children with special needs or learning difficulties. Another reference is a synthesis of peer-reviewed studies. All the other references involve organisations with a commercial interest in educational methods – not the sort of evidence I’d expect to see in a briefing published by a university.

My recommendation for the MUSEC briefings? Approach with caution.

not enough jam: select committee report on SEN legislation

Sad person that I am, I love reading Parliamentary Select Committee reports. Select Committees don’t always get it right, but they are an example of democracy at its most transparent. Evidence, written and verbal, is presented verbatim so anyone who cares to can see how the Committee has taken evidence into account in its recommendations – and anyone can learn from the expertise and insights of witnesses. And because government responses to Select Committee reports are also published, anyone can see how much notice the government has taken of the Select Committee – and therefore of the evidence presented. Just before Christmas, the UK’s House of Commons Education Select Committee produced a report on its pre-legislative scrutiny of the draft special educational needs legislation published in September this year. I want to comment on the report in the light of my previous post about upstream and downstream factors in the education system.

Evidence

The first thing that struck me about this report is that it is firmly grounded in the evidence submitted by individuals and organizations involved with special educational needs; almost all the recommendations are based on information from the frontline. The second was that it brings a systems perspective to the draft legislation. And the third (I have mixed feelings about this) is that I’m not the only Cassandra out there. The impression the report as a whole conveys is that although the government’s intention and direction of travel in reforming the SEN system is heartily welcomed, that welcome is accompanied by a long list of misgivings.

In this post, I want to list some of the key misgivings that emerged from the evidence presented to the Select Committee and then look at the upstream factors that might have prompted them.

Misgivings

Joined-up thinking:
• no statutory duty for health or care services to provide the support specified in the Education, Health and Care (EHC) plans
• questions over how EHC plans will fit in with adult Care and Support plans.

Assessments:
• doubts about the capacity within the system to carry out assessments – without enough people with sufficient expertise, young people will continue to need multiple assessments from different agencies as is currently the case
• a conflict of interest if assessment and service provision are carried out by the same parties.

Accountability:
• lack of clarity about who is accountable to whom for what and how that accountability can be enforced.

SEN Code of practice:
• to be revised, but not as a statutory document laid before Parliament.

Children and young people falling through the net:
• concern about children who have non-educational needs (e.g. pre-schoolers, children with disabilities but not SEN, young people in supervised work placements, apprenticeships)
• concern about children currently on School Action, School Action Plus or lower Statement funding bands – the SA and SA+ categories will disappear.

The Local Offer:
• no minimum standard required – concern that LAs will simply provide a service directory
• no minimum requirement regarding parent participation – a risk that parent participation will be tokenistic.

The task of government

As I see it, the primary task of government is to ensure the maintenance of an infrastructure that allows the community it serves to go about its lawful business without let or hindrance. That doesn’t mean government has to design the infrastructure – the evidence suggests that design is far better left to people with relevant expertise. But government does need to maintain an overview – to make sure the different parts of the infrastructure interact effectively, to legislate in order to resolve conflict and to ensure the community’s cash isn’t wasted. Government departments have different areas of responsibility and one of the tasks of the Prime Minister or his/her office should be to ensure that those departments interact effectively. This is a thankless and difficult task and conflict between government departments is unlikely ever to be eradicated, but someone, somewhere needs to have oversight of what’s going on in different departments to ensure that government policy is coherent – that legislation drawn up by one department isn’t going to conflict with legislation drawn up by another, or that budgets aren’t going to scupper policy. Unfortunately, in the case of the draft SEN legislation, this doesn’t appear to have happened.

The biggest reform in SEN legislation for 30 years is being introduced at the same time as the NHS is undergoing the biggest structural change in its history, the school leaving age is being raised to 18, school funding is changing to reflect the increasing autonomy of schools and public sector budgets are being cut year-on-year for the foreseeable future. The SEN legislation rests on several assumptions about the way other public sector services will be working. But no one actually knows how they’ll be working. Witness after witness drew the Committee’s attention to the large number of ‘unknowns’ in the proposed SEN equation.

Sub-system optimization

The SEN legislation is a perfect example of what’s known as sub-system optimization at the expense of whole system optimization. In other words, the proposed SEN sub-system on its own might be great; but the SEN sub-system doesn’t exist on its own, it interacts with several other systems, many of which are also undergoing change. Re-designing a service so that it works effectively is a challenging task and one that’s best undertaken by a team of people who have expertise in different aspects of the service, in consultation with a wide range of those working at the front line – including service users. The reason for this is not to ensure that all parties feel they have been consulted, but to avoid the unforeseen and unwanted outcomes of poorly designed legislation that often end up as part of the judiciary’s caseload. Large-scale or rapid structural changes should be undertaken only when absolutely necessary, otherwise there is a big risk of costly knock-on outcomes elsewhere. Over recent decades, the pace at which legislation is introduced seems to have increased. This is certainly true for special educational needs legislation.

The Warnock Committee responsible for the previous re-design of SEN provision was set up in 1974 and consisted of 27 members. Its terms of reference were as follows:

“To review educational provision in England, Scotland and Wales for children and young people handicapped by disabilities of body or mind, taking account of the medical aspects of their needs, together with arrangements to prepare them for entry into employment; to consider the most effective use of resources for these purposes; and to make recommendations.”

The Committee took nearly four years to report and legislation based on its recommendations wasn’t enacted until 1981. The recent equivalent was the Lamb Inquiry. Its Expert Advisers Group had six members (although it had a larger Reference Group). It was commissioned in 2008 in response to Select Committee reports critical of SEN provision published in 2006 and 2007, reported in 2009, and its recommendations have prompted legislation that has been drafted before pathfinder local authorities’ pilot studies are complete. Its terms of reference are very different from those of the Warnock Committee, focusing on parental confidence in the SEN system:

In formulating their advice, the Inquiry would:

• consider whether increasing parental confidence could be best achieved by:
– making the provision of educational psychology advice ‘arm’s length’ from local authorities;
– sharing best practice in developing good relationships between the authority and parents, through effective Parent Partnership Services and other local mechanisms;
– effective practice by schools and local authorities in meeting the needs of children at School Action Plus;
– developing the ‘team around the child’ approach in the school stages;
– other innovative proposals;
• commission and evaluate innovative projects, in the areas identified, that can demonstrate the impact on parental confidence of a particular approach;
• draw on the evidence of other work currently commissioned by the Department;
• take into account the evidence of the submissions to the two Select Committee Reports in 2006 and 2007.

In 1981, the changes resulting from the Warnock report would have been applied to a fairly flexible education system – it would have been up to individual schools or local authorities how implementation took place. A decade later, a compulsory national curriculum and standardized testing had completely transformed that educational landscape. Ironically, the same person – Margaret Thatcher – both introduced the SEN reforms and made the changes to the wider education system that undermined them. The constraints imposed on schools and local authorities by performance indicators have led to unforeseen and unwanted outcomes for children with SEN.

Unforeseen and unwanted outcomes

The recent Select Committee report draws attention, for example, to the disincentives in the education system for schools to educate children with special needs. The NASUWT cites the case of the flagship Mossbourne Academy in Hackney (founding principal Sir Michael Wilshaw, currently Chief Inspector of Schools) where parents have successfully challenged the school in relation to admission of pupils with SEN. My attempts to find a reference to ‘special educational needs’ on Mossbourne’s website met with failure – as they did on a number of websites for secondary schools in my local area. This might be because the search function on the websites doesn’t work – but frankly, I doubt that’s the cause.

In addition, giving schools increased autonomy and removing them from local authority control has resulted in a lack of clarity about who’s responsible for what and to whom. Edward Timpson, Parliamentary Under-Secretary of State for Education, assured the Committee that

“all schools will have a vested interest in ensuring that the services that they have available are part of the local offer. Parents will be able to hold them to account for whether they do or they do not” (para.138)

I suspect the Committee wasn’t assured, since this means that the only way for parents to ultimately hold schools to account will involve taking legal action against them – which many parents will be unable or unwilling to do.

In short, making sure that a suitable education is available to all children and that schools actually provide that education is no longer safeguarded in the design of the system – by, for example, ensuring that all education providers have ready access to relevant expertise and resources and that there’s a clear pathway of accountability that doesn’t require parents to resort to legal action. Instead, government appears to see its role as having good intentions.

In response to the Select Committee’s suggestion that the draft clauses in the legislation lacked substance, the Minister stated:

“I am confident—and it is borne out in many of the conversations I have already had with many of those who played a part in bringing it together—that it does illustrate, very clearly, the ambition of this Government and many other people to ensure that the system we move to is a vast improvement on the previous system” (para.13)

That might be perfectly true, but ‘ambition’ isn’t all that’s required to design and run an education system, health or care service. As I see it, over recent decades governments have become increasingly involved in the design of public sector services for political reasons, but are reluctant to take responsibility for flaws in the design of those systems – flaws that are unsurprising given the unavoidable lack of relevant expertise of government ministers and their special advisers.

Upstream factors

I said I’d look at upstream and downstream issues. Not surprisingly, the factors I flagged in my previous post – lack of expertise, insufficient resources and capacity, and inadequate needs analysis – cropped up in the evidence submitted to the Select Committee.

Expertise The NUT drew attention to the fact that schools were already reporting difficulties accessing specialist advice regarding children with School Action or School Action Plus support, implying that at least some teachers don’t currently have the expertise required to support children at these levels. Witnesses also asked for the legislation to require SENCOs to have appropriate training.

Resources and capacity The difficulties experienced in accessing specialist advice suggest some local authorities are already cutting back on support services. One headteacher had been told by her local authority that children currently with lower band Statement funding would not be eligible for EHC plans. Funding cuts across the public sector have significant implications for the viability of the SEN proposals.

Needs analysis The task of local authorities is, and always has been, to provide services that meet the needs of the local population. By now, LAs should have accumulated sufficient information about the needs of local children to have a reasonably accurate idea about what services those children need. But currently, many LAs prioritise the needs of children with severe difficulties, suggesting that services are based not on need, but on budgets. The NHS hasn’t been around for as long as local authorities, but 60 years is quite long enough to have developed a good understanding of what children’s needs are. But long waits for diagnoses, to see specialists or to get wheelchairs suggest that, again, children’s healthcare is based on budgetary considerations rather than needs.

Not enough jam

In a letter to the Education Select Committee, Sarah Teather, responsible for the Green Paper that initially set out the proposals for change to the SEN system, asked whether there was ‘a case for extending the scope of the integrated provision requirement to all children and young people, including those with SEN’ (para.73). The consensus amongst witnesses was that doing this would mean ‘spreading the jam too thinly’.

One can appreciate concerns about limited resources being diverted from those who need them most, but this response raises a couple of questions. The first is ‘Why are children categorized as those who need jam or those who don’t?’ Difficulties that require educational, health or social support are distributed across the population and vary during the lifetime of the individual – some children need more support than others and some might need support at some times but not at others. In other words, all children need access to the jam, even if they never need the jam itself. The second question is ‘Is there enough jam in the pot?’ If service design is based on the outcomes of a needs analysis, there should be. If service design is based on budgets, then assessments determine children’s eligibility for support, not what their needs are. And if there isn’t enough support to go round, there are likely to be children who need support but aren’t getting it.

The saying ‘children are our future’ might sound trite, but it’s still true. Child abuse by individuals has, rightly, received a great deal of attention in recent years. But public sector systems that withhold support from children who need it are also abusive, and need to be addressed as a matter of urgency. Treating children with special educational needs and disabilities as second-class citizens is a self-fulfilling prophecy.