Following the Script – Why Use a Commercial DI Program for Maths Education?

In yesterday’s post, I wrote about how I use both constructivist and direct instruction approaches to teaching in my school, and referred to this Dylan Wiliam quote:

‘… in education, “What works?” is rarely the right question, because everything works somewhere, and nothing works everywhere, which is why in education, the right question is, “Under what conditions does this work?” ‘

In today’s post, I’m going to explain a little more about a new Direct Instruction strategy I’m using at my school, and why I chose it to suit our particular conditions.

In this post, when I refer to Direct Instruction, I’m referring to that extreme end of the continuum: a commercially created program, where lessons are scripted so that all students are taught in the same way at the same time.

It’s an approach I’ve been wary about.

I’m wary of scripts for teachers – they can be de-professionalising, reducing our job to something that is almost robotic, programmable, unskilled. I’m wary of commercial programs that make big claims backed by research that is funded and conducted by the people who profit from selling the programs. I’m wary of anything that is a one-size-fits-all approach, and I’m wary of entirely teacher-centred approaches to teaching mathematics, especially those which don’t include hands-on work with manipulatives.

So why, in spite of those reservations, would I choose to use a commercially produced, highly scripted Direct Instruction program to teach mathematics at my school?

Simple: I have a group of 5 students in Years 5 and 6 who are working several years below grade expectations and who struggle with just about every aspect of numeracy. We’ve tried many interventions over the years, but none have had the impact we would like. It’s time to try something new. JEMM is a fast-paced intervention that takes just 15-20 minutes per day, so the students are still able to participate in their regular Stage 3 maths lessons. It is published by the Australian Council for Educational Research, and supported by a body of evidence which includes the PhD thesis of the program’s author, Dr Rhonda Farkota. For my school, it was a low-cost, low-risk strategy which, if it achieves what the publishers promise, will help these struggling students to make significant gains. If not, we lose very little.

Our first lesson went well. It felt odd following a script, but my students were highly engaged. The brisk pace held their attention, and for these students, that in itself was a great achievement. The lesson followed a pattern in which I would begin by demonstrating a concept on the board. This would be followed by a simple question requiring them to apply what had been demonstrated, which they would answer in their workbooks. Each question covered a different strand of mathematics, and this variety may also have helped keep their attention.

Following the instructional period, which was meant to take 15 minutes but in our case was closer to 10, we had 2 minutes to mark our work (it took less than one minute) and 3 minutes to ‘debug’, or go over problems.

I hadn’t expected to encounter any problems for ‘debugging’ in our first lesson. In fact, this was one of my concerns before starting the program. The instructions are very clear about starting all children at the first lesson, but the content appeared far below their ability. However, they surprised me with two errors.

The first was the question ‘How many digits are in the number 10?’ One of my students wrote ‘10’. When we ‘debugged’, it turned out that he did not understand the term ‘digit’, even though it had been explained in the demonstration part of the lesson. We spent some time reviewing that term, and by the end of our debugging session he was able to confidently tell me the number of digits in any number up to 5 digits long.

A second error was made when I asked the students to write how many squares were shaded in an array. The answer was ‘2’. One of my students wrote ‘10’ because he was thinking about how many squares the array had in total, rather than just the shaded squares. In this case we reflected on the importance of listening to and thinking about the whole question. He’d jumped out of the blocks too soon, answering after just the first few words: ‘How many squares…’

Both of these errors were important. They showed me that my students had some gaps in their mathematical language, and also had some issues in listening to and interpreting questions. We were able to start filling those gaps and addressing those issues in our first lesson. And, as the following 4 lessons cover exactly the same concepts, the daily, repeated practice should help students retain knowledge and develop better learning habits.

So it turned out that working to a script was not de-professionalising at all. I drew upon my professional expertise to recognise indicators of engagement and learning, as well as areas that my students struggled with and in which they need continued support. I used my professional judgement in deciding to use a script, and I’m using my professional judgement in deciding to continue with it, for the time being at least.

It’s impossible to evaluate a program after just one session, but the initial experience was very positive. And best of all, the students loved it. They are looking forward to lesson 2, on Monday.


Direct Instruction vs Student Driven Learning

If you’ve followed me for a while, you’ll know that I love to incorporate opportunities for student driven learning in my classroom and across my school. I promote and use strategies like Project Based Learning, SOLE and Genius Hour, and have led school change projects to introduce these strategies into classrooms across my school.

So it may come as a surprise that I also use Direct Instruction with my students. For many commentators, the two approaches are incompatible – apparently you can’t support constructivist approaches while also supporting DI.

But while the debate around these approaches becomes increasingly polarised, I have a school full of students with a range of needs for whom I’m responsible. I make my decisions based on what I believe will work for the students I have in my school right now.

As Dylan Wiliam said at ResearchEd in 2014, ‘… in education, “What works?” is rarely the right question, because everything works somewhere, and nothing works everywhere, which is why in education, the right question is, “Under what conditions does this work?” ‘

Formative Assessment (Part 4): Student Goal Setting

This is part four in my series on formative assessment, where I blog my journey through Dylan Wiliam’s book, Embedded Formative Assessment, and share my school’s change journey.

At the beginning of 2015, we asked our teachers to consider how they might introduce individual goal setting based on the continuum into their teaching programs. We’d already plotted our students on the continuums (read about that here) but it wasn’t enough merely to work out where students were ‘at’. We needed to work out the steps that would continue to move their learning forward. The continuums are excellent tools for this as they break skills down into a hierarchy, making the next steps very clear.

Two of our Year One teachers seized upon the idea of goal setting enthusiastically and decided to set up to three goals for each student. Examples include: ‘Use a capital letter at the beginning of a sentence’, or ‘Leave a space between each word’. They are simple, specific and easily observed. Each student has their learning goal displayed on their desk to remind them of what they are working towards. As the teacher moves amongst the students, they can refer to each child’s goal and provide immediate feedback as to whether they are achieving it. The process is motivating for our students and helps them to focus their efforts. As the goals are small and simple, students quickly experience success. Our teachers track these successes as they happen, and set new goals to continually move the learning forward.

In my own case, I’ve found the process has already achieved great success with my support students in Years 3-6. Once a week I work with a small group of Year 4 and 5 students who require additional support in literacy. The gap between their current ability and the desired level of proficiency is quite overwhelming. Setting specific goals helps us to focus, and makes learning seem much more achievable.

In our first session, I gave the students copies of the writing aspect of the continuum, re-written in child-friendly language. We used it as a checklist, working across the continuum, ticking off all the markers they felt they had achieved. When we arrived at Cluster 5, which included the ability to write 4 or 5 sentences about a topic of interest, my students stopped. They had assessed themselves honestly, as none of them were currently demonstrating that skill. They also identified a few other markers they needed to work on in that cluster, such as re-reading their work to ensure it makes sense.

For each student, I made a goal sheet with the 2-3 goals they had selected, written in child-friendly language. It included check boxes where they could tick each time they demonstrated they had met that criterion. They would bring their writing book to our weekly sessions with the goal sheet pasted inside the cover. If things were going well, we’d reflect on what helped them to meet their goal. If they were not achieving, we’d reflect on what changes might need to be made to help them succeed. While I provided some explicit teaching focused on their goals, my role started to become closer to that of a coach than a teacher.

My students loved having their explicit goals. They enjoyed having a visual reminder to help them focus. They loved showing their teacher their writing and receiving a tick each time they demonstrated they had met the required standard. They proudly showed me the evidence of success each week in their books. They engaged well with my lessons, worked hard, and within 3 weeks were already consistently demonstrating success. They are now working towards a new set of slightly more challenging goals.

In his book Open, David Price writes about the value of sharing our practice and work with others. By doing this, we continually add value and improve as a whole community. The great improvements I am already seeing at my school, in just five weeks, are a result of being part of an open, sharing community.

Last year, I took a team of teachers to visit Mount Pritchard East Public School in South Western Sydney. Much of what we are bringing into our school is inspired by the practices we saw there. Our first data wall, for example, was modelled on the data wall that they are using to track their students across the Literacy continuum, and some of the goal setting practices that I describe here also grew out of what we saw in place there.

Our child-friendly goals were also developed by other teachers. I found them on this amazing website, curated by Shellie Tancred, a teacher with the NSW Department of Education and Communities. The website itself is a collection of resources created by teams of teachers from many schools.

Other posts in this series:

Formative Assessment (Part 3): Data Walls for Collaboration and Dialogue

Formative Assessment (Part 2) and a case for differentiated instruction

Formative Assessment (Part 1) Introduction

Formative Assessment (Part 3): Data Walls for Collaboration and Dialogue

This is part three in my series on formative assessment, where I blog my journey through Dylan Wiliam’s book, Embedded Formative Assessment, and share my school’s change journey.

Some of the most helpful and transformative tools for primary educators have to be the NSW Literacy and Numeracy Continuums. They unpack areas of the English and Mathematics curriculum, breaking them into small, hierarchical steps which we can use to pinpoint students’ current level of skill and set goals that will move learning forward.

This year we’ve placed the continuums at the centre of our planning, implementing some high impact strategies that have already boosted the learning of our students and increased the effectiveness of our teaching. I’ll unpack these strategies in a series of posts. Today’s is about data walls.

I’ve heard some terrible stories about data walls: stories where they’ve been used to judge the effectiveness of teachers, evaluating them according to how rapidly their students progress. I’ve also heard of them being used in ways that publicly rank students, turning their education into a race, celebrating the winners and humiliating the losers.

Our data wall serves neither of those ends. Displayed in a private corridor that only staff can access, ours is a tool that we use to set learning goals for our students, identify students who may need extra support, allocate resources appropriately, track student progress and stimulate professional dialogue.

It plots every child in our school along the Literacy Continuum and provides us with an instant visual representation of where our students are at.


Along the top of our data wall, we placed the continuum clusters. These were colour-coded according to the year level we expect students to be working in that cluster. For example, the headings for clusters that Year 4 students are expected to be working at were in yellow. The Year 4 students’ names were also printed on yellow paper. At a glance we could see that most of our Year 4 students were working at their expected level. But we could also see outliers – we could immediately identify that some students were working not just ahead of or behind their expected level, but years ahead or behind.

Presenting the information visually immediately triggered professional dialogue.

One year group of teachers, viewing the spread across the grade, immediately identified that their judgement was not consistent. Some students with known difficulties appeared to be achieving at the same level as, or even ahead of, students in their grade whom we knew to be very high achieving. Upon seeing this, the teaching team started discussing what informed their judgements and questioning what achievement would look like at different points of the continuum. They re-examined the curriculum and their expectations.

Professional dialogue also sprang up between our support teachers and class teachers. Our learning support and EAL/D teachers noticed straight away that they did not agree with the placement of some students. They work with students in different contexts from the class teachers and therefore see different evidence of their learning. This led to productive discussions between the teachers who work with each student. More viewpoints and evidence were considered, some placements were altered, and the teachers walked away with a much deeper understanding of their students.

We quickly found information that influenced the direction of our teaching and learning programs. In one of our early grades, we noticed that while most of our students were tracking well against most reading indicators, such as phonics, comprehension, vocabulary and fluency, a significant group were below the expected level for phonemic awareness. Upon seeing this, the grade team decided to target phonemic awareness in their programs for the term to bring all students up to an improved level of proficiency.

We were also able to quickly prioritise resources and support. Our EAL/D funding was cut in half at the beginning of 2015, reducing our teacher allocation from 4 days to 2 days. We had to make changes to the way in which we supported these students. The data wall helped us to identify those EAL/D students with the highest need of support and to develop learning goals and strategies to move them forward.

One of the most significant things about the dialogue sparked by our data wall was that it was entirely teacher driven. Instead of the executive team having to MAKE the conversations happen, or CONVINCE the teachers that we needed to re-examine our practices around, for example, phonemic awareness, our teachers came to these conclusions themselves. The collaboration and dialogue were spontaneous and unplanned, and grew out of having data displayed in such a clear, visual manner. There was immediate buy-in.

Of course, our work with the continuums and our data wall didn’t stop with those initial conversations. Since then, we’ve used them to make some simple, but exciting changes to the way we teach. I’ll write about these, and how they fit with our formative assessment process in a future post.

Other Posts in this series:

Formative Assessment (Part 2) and a case for differentiated instruction

Formative Assessment (Part 1) Introduction