Marking – something needs to change!

A great blog post from Tom Bennett – thoughts to me please. Tim.

Saturday, 15 October 2016

It’s your time you’re wasting; why schools should stop drowning teachers in marking

Of course, it’s optional
One does not simply walk into Mordor, and one does not simply pop into IKEA for a packet of napkins and an Ottoman. The Scandinavian elves play a voodoo on your flimsy aspirations of frugality, and by the time you’re supping on a hot dog in the car park of Valhalla you’re dragging a caravan of Billy bookcases, tea candles, picture frames and a rug that doubles as a shoe tidy. And you forgot the Ottoman.

We’ve all done it; started out with one plan and ended up with another. That’s fine when Plan B is also something you want (cf: Professor Mickey Flanagan’s seminal  ‘Out/ OUT-out theory of organic incremental decision decay’ for details). But not if you put your hand in your pocket for a Swiss knife and pull out a Swiss roll. And not if you planned on teaching kids, but ended up doing something else that looked a bit like teaching, but wasn’t really.

I was reminded of this recently when I heard of a colleague’s experience in a struggling school in the Midlands. The school was staring down the barrel of Special Measures; its previous visit from MiniLearn saw its pockets picked of its Good rating, downgraded to RI. Alarm bells it no longer knew it possessed blew like Louis and the walls came tumbling down. Action Stations. Dust blew off the Burgundy book. Steam Engine Time. Something must be done was the whole of the law.

But what? Sadly, the answer was ‘triple marking’, because as we know, nothing animates and activates deep, deep learning like spending all day on one piece of work, endlessly batted between the teacher and the taught in a show trial of pedagogy, with as much measurable impact on progress as a fruit fly trying to push the Moon out of its orbit. And homework; reams and reams of it, marked to a metronome in a fool’s rubric. Never mind that this simple edict suddenly took up around a third of the teacher’s total – not free – time. That’s gross, not net. Imagine if I said to you that a third of your career would now be spent, not teaching, or having meaningful conversations with students, or reading up on your subject, but flicking, ticking and wondering when Morpheus was going to show up so you could scarf both pills.

At a previous school I taught humanities to 10 or 11 classes of approximately 25 kids apiece. So let’s say 250 pupils. Then they announced the expectation was weekly homework set, with marking. Even a speedy romp with a red pen would easily see that converted into 250 minutes per week – if all I did was turn the pages and make a mark to say ‘I was here.’ Anything more than that meant 5 minutes a book, or 1250 minutes. A sixth form essay with comments? Christ, you need a Tardis and a magic lamp to get that polished off.

Not waving, but marking

250 pupils, flick and tick – 250 minutes, or 4 hours 10 minutes
250 pupils, flick and an end comment – 500 minutes, or 8 hours 20 minutes
250 pupils, with substantive comments – 1250 minutes, or 20 hours 50 minutes
250 pupils, with substantive comments and spelling/grammar correction – haha, you’re kidding mate, who do you think I am, Ali Bongo?
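For anyone who wants to check the arithmetic against their own class sizes, here is a throwaway sketch – the pupil count and minutes-per-book figures are simply the ones from the post, not a claim about any particular school:

```python
def marking_minutes(pupils, mins_per_book):
    """Total weekly marking time, in minutes."""
    return pupils * mins_per_book

def as_hours(minutes):
    """Render minutes as 'H h M min' for readability."""
    return f"{minutes // 60} h {minutes % 60} min"

# The three scenarios from the table above, for 250 pupils
for label, mins in [("flick and tick", 1),
                    ("flick and an end comment", 2),
                    ("substantive comments", 5)]:
    total = marking_minutes(250, mins)
    print(f"{label}: {total} min = {as_hours(total)}")
```

Swap in your own numbers and the opportunity cost becomes very visible, very quickly.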

And I’ve seen teachers try to match this, because schools ask them to. Bye-bye weekend and every evening and your marbles.

All that time has to come from either you, or the students. Now the standard response from anyone foolish enough to demand this in the first place is ‘Set homework that doesn’t need much marking, or that can be marked by peers.’ And I would agree, which is why we now see rainbows of pen colours indicating ‘marked by a peer/ marked by myself/ marked by a unicorn with a lisp’ etc. Problem solved? No, problem shifted, because that kind of marking doesn’t really show progress, or the Holy Grail of book marking: progress as a result of teacher intervention. So you have no option but to triple, quadruple, octuple mark, or devise tortuous exercises where children fill out sheets designed to capture comments like ‘I now understand this activity because…’ and ‘I have achieved this by…’. Ghastly.

I have a simple attitude towards time management in an enclosed system: the investment has to be worth the dividend. If I’m asked to spend a third of my time on activity x then I expect that activity x should account for an equivalent third of their learning. In a school, opportunity cost is all; if we’re doing one thing, we’re prevented from doing another. And time, like land, is the one thing they aren’t making any more of. Triple marking simply doesn’t produce anything like a result that can match its cost. In fact, I’ll argue that most homework has the same problem, especially if it entails marking.

‘Just a couple more sets to mark lads!’

There are many other displacement activities we could do without: poem tasks when the subject isn’t poetry; art and design tasks when we’re studying religious food laws; colouring in; making volcanoes; puppet shows and role plays. I know many teachers are prepared to fight to their last breath defending these things, and they may at times have merit as pace-regulators or pauses between content. But too often they represent a disproportionate investment of time in a system where time is a treasure chest. And when workload is the lash, the goad and the rack of possibility, spending each second wisely is no longer a luxury.

These damnable chronophages are designed to make teachers prance on command for fear of a real or imagined Grendel. I once wrote that the best thing to do on the day of an Ofsted inspection was to get your Free School Meal kids to perform ‘Consider Yourself’ from Oliver!, with their target grades painted on flat caps. I didn’t know that in a few years reality would render my satire useless.

Mungo just pawn in great game of life

Just as teachers wind up- if their nerve isn’t strong or their hearts true and pure- teaching to the test rather than teaching brilliantly and letting the test discover it, schools can easily fall into a pit where the appearance of progress becomes more important than the progress itself. I see many, many schools where the directed activity of the teacher has nothing to do with actual learning, and everything to do with showboating. There’s a wonderful scene in Mel Brooks’s genre opus Blazing Saddles where the Sheriff and the Waco Kid animate a moribund citizenry of beleaguered settlers to stand up to a pack of desperadoes by building a fake town for them to plunder instead. I think this is how many schools approach an inspection; see our beautiful data and our books of interventions and can we interest you in a jelly baby? Look how we’ve grown since last we spoke!

Enough. Enough. Ofsted have been quite clear that they don’t require any particular scheme of marking, any preferred assessment regime, any particular liturgy of when, how often and how books are marked. There is no activity or strategy or teaching style beloved or scorned to which teachers should aspire. Wilshaw, the present Prospero of Ofsted, is quite clear on this. And yes, I understand why schools do this. In desperation, a rat will chew through its leg to escape a trap, and dogs will bark at cars. But that shouldn’t be policy. The inspection regime is partly responsible for this of course. But if we ever want to be seen as a profession and not an army of complainants, it’s time we took action at a level we can affect.

We’ve found so many lovely ways to fill our time that we’ve forgotten what we came to do. The tragedy is that sometimes we can forget there ever was anything else we did, and the tragedy squared is when kids start to think like that too.

20 ways to widen the ‘gap’ in your classroom!

A great post from Miss Cox – link to her site is below. I hope we don’t do any of these!

  1. Make homework optional
  2. Create resources for different levels/grades of students
  3. Only teach certain groups of students the tough stuff
  4. Take underachieving students out of one subject to catch up with other subjects
  5. Allow absence without any action
  6. Don’t make students catch up with work when absent
  7. Make judgments/decisions using student data/hearsay, before you’ve met them & seen what they can do
  8. Treat PP/LAC students differently (marking their books first won’t close a gap)
  9. Think that an SEN student cannot learn the same and in the same way as non-SEN (in the majority of cases)
  10. Don’t check students’ work regularly and hold them to account for incomplete/unsatisfactory standard of work/presentation
  11. Use marks/grades/levels on student work
  12. Talk about attainment instead of improvement
  13. Leave a piece of work unimproved by the student
  14. Tell them they’re weak/lesser/in a bottom set
  15. Assume they know how and what to learn
  16. Assume that if you’ve said something once, it’s enough
  17. Have discussions about groups of children instead of individuals
  18. Don’t follow through things you say you will do with students
  19. Don’t follow school systems with a student/s because they’re a ‘special case’
  20. Don’t ever contact home or involve them in the student’s learning.

A fantastic post from Tom Sherrington – this structure will help all of us, let me know what you think – Tim.

It’s a well-established idea that, to develop expertise in a particular skill or technique, you need to practise. The more you practise, the better you get.  As outlined by the excellent people at Deans for Impact in their Practice with Purpose document, it helps to identify a specific element of your teaching to practise on and then focus very deliberately on improving in that area.

Instead of flitting from one thing to another, dipping in and out, the suggestion is that teachers would do better to select one thing from all the options and try hard to keep at it until the practice feels more like a habit. This approach absolutely applies to numerous elements of behaviour management and most of the Silver Arrows I highlighted in this popular post.  However, for this post I wanted to focus on pedagogical elements of teaching.

Here are ten things you might want to try to practise – deliberately:

1. Developing routine knowledge recall procedures.  

It takes practice to establish this as a snappy, low-stakes routine, conducted in a disciplined fashion, at a frequency that really helps your students to retain the knowledge you’ve taught them.  You need to establish a pattern that you can stick to:

  • identify the specific knowledge elements that lend themselves to snappy tests – a knowledge organiser broken into sections that students can focus on.
  • a quizzing method that students are familiar with and can organise readily – are you going to read out the questions, prepare each test or use a ppt slide?
  • a quick method for self or peer checking of the answer – eg with answers on a visualiser or ppt slide.
  • a routine that returns to the same knowledge elements repeatedly so that the recall is strengthened; it needs not to take up too much time in any given lesson and happen often enough to become low stakes and habitual.

Develop the technique with multiple choice questions, sequencing of concepts/events and more sophisticated ‘which is a better answer’ style questions.

2.  All-student response: using mini-whiteboards really well. 

As I outline in this post – the No.1 bit of classroom kit is a set of mini-whiteboards. The trick is to use them really well. You need to drill the class to use them seriously, to do the ‘show me’ action simultaneously in a crisp, prompt manner and, crucially, you need to get students to hold up the boards long enough for you to engage with their responses. Who is stuck? Who has got it right? Are there any interesting variations/ideas? Use the opportunity to ask ‘why did you say that? how did you know that?’ – and so on. It takes practice to make this technique work but it’s so good when done well.

3. Questioning techniques: 

Each questioning technique takes practice, especially if you are in the default-mode habit of asking the whole class every question and taking answers from those with their hands up. Make a deliberate effort to try out and practise these methods:

  • Random selection: use an online name generator or lollisticks or some other means of selecting students at random. It’s a powerful effect. (Lollisticks need to be a no-nonsense practical tool, not a fussy gimmick – I’ve seen this done superbly well.)
  • Cold Calling: just check out technique 33 in Doug Lemov’s Teach Like a Champion 2.0. I prefer this when combined with wait time and the name selected after the question, eg “What is 7 cubed? … pause … John?” With “John, what is 7 cubed?”, only John has to think about it.
  • Probing:  routinely ask follow-up questions for every question you ask, two or three times.  Go deeper.  I’ve explored this in Great Lessons 1: Probing. 
  • Going Dialogic.  An extension of probing – you set up the expectation that one student might engage in an extended dialogue to probe ever more deeply into their understanding with the rest of the class as an audience.  It takes practice but works incredibly well. See Pedagogy Postcard 1.
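The random-selection idea in the first bullet is simple enough to sketch in code. Here is a minimal, illustrative lollistick-style picker (the class list and the `NamePicker` name are invented for the example – any online name generator does the same job):

```python
import random

class NamePicker:
    """Lollistick-style random selector: every pupil is picked once
    before anyone is picked twice, because sticks only go back in
    the cup when the cup is empty."""

    def __init__(self, names, seed=None):
        self.names = list(names)
        self.rng = random.Random(seed)
        self.pool = []          # the 'cup' of sticks still to be drawn

    def pick(self):
        if not self.pool:       # cup empty: refill and reshuffle
            self.pool = self.names[:]
            self.rng.shuffle(self.pool)
        return self.pool.pop()

# Hypothetical class list, purely for illustration
picker = NamePicker(["Aisha", "Ben", "Chloe", "Dev"], seed=1)
first_round = [picker.pick() for _ in range(4)]
assert sorted(first_round) == ["Aisha", "Ben", "Chloe", "Dev"]
```

The point of drawing without replacement is the same as the wait-time point above: every student has to stay ready to answer, because nobody is ‘safe’ after being picked early in the lesson.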

4.  Think Pair Share

A strategy I firmly believe is underused relative to its power.  It takes practice to make it a routine with the necessary behaviour management strategies.  It is fully explained in this post: The Washing Hands of Learning

5. Metacognition and modelling

Metacognition scored very highly in several ranked lists of effective teaching and learning strategies – eg Hattie’s visible learning effects or the EEF toolkit. In a nutshell, it is the process of teaching students how to solve problems and complete complex tasks by making the strategies and thought processes explicit by modelling them. For example, with non-verbal reasoning questions, you can show students how you go about solving them, narrating the process explicitly, including double-checking all the wrong answers. This is something they can then practise. It works for modelling writing too – you need to walk through the full details of how you construct sentences and paragraphs to convey what you want to say in the way you want it said. Doing this well takes practice – try it.

Look no further than John Tomsett’s posts on this, featuring some videos of modelling in action:  Modelling and meta-cognition – and this one too. 

6. Whole-class feedback instead of marking

Instead of slaving away late into the night with your red pen poised to ink up a massive set of exercise books, just read this brilliant post by Jo Facer: Giving feedback the ‘Michaela’ way.  Read through the books, make some notes and give whole-class feedback instead.  Do it over and over again and get good at doing it – practise. It’s a game changer.

7.  Critique-method feedback

Instead of merely nodding in jaded recognition at the Austin’s Butterfly video, why not actually use the critique method it describes and develop real expertise with it? There are lots of ideas and resources to support you – nicely compiled in this excellent post by Dave Fawcett, Creating a culture of critique. Let’s see your students developing the expectation that their work will be critiqued in a specific, supportive manner, allowing them to reach higher standards than they thought possible.

8.  Deliberate vocab development 

This links to the recall method above but here I’m thinking about a technique to cement vocabulary development specifically.  Very often new words are encountered in lessons and teachers might explain them at the time – only for them to be completely forgotten about and, consequently, not learned.  I suggest adopting a routine:

  • a region of a board is dedicated to new vocab;
  • new words are listed during the lesson, with awkward spellings explored explicitly;
  • new words are sounded out through choral repetition so that students all experience saying the words;
  • students are asked to put the words in a sentence orally, or in a place in their books for new words;
  • the lesson list forms the basis of a systematic recall test the following day/week/month – something students learn to expect, thus supporting their engagement with the words in the first place.

9. Embedded tiering:  Mild, Spicy, Hot or Challenge, Turbo-challenge

Instead of differentiation meaning providing different work, develop a collaborative planning approach where questions relating to any given topic are constructed with in-built tiering. I’ve seen this used superbly well at primary and secondary with labels such as bronze, silver, gold; mild, spicy and hot; or Core, Challenge, Turbo-Challenge.

This is not the same as setting artificially differentiated learning objectives – but it supports the organisation of a class where students progress at different rates, allowing everyone to find a suitable challenge level (seeking an optimal 80% success rate).  Practice is needed not only to devise really good tiered sets of questions that still offer enough repetition at each level – but also to manage the learning in the classroom when everyone has diverged from the initial instruction phase.

10. Third time for excellence: Draft, re-draft, publish.

Again, taking something from Austin’s Butterfly, try to create space in your curriculum planning to go the whole hog on redrafting so that students get to the third version: the third draft of a poem, story, essay or piece of writing in French; the third attempt at a painting; the third run-through of the performance, recitation or speech.  The first one might be ‘a great start’. After feedback, the second version is a big step forward, taking the feedback onboard.  But you will find that Version 3 is where you see Excellence emerging. This is where it gets exciting.   You can’t do it for every piece of work – so pick your moment – but when you can, go for the power of three.  You can get better at this – more streamlined; less bogged down in the individual feedback; less fussy about every detail of the first draft, focusing on specific elements over others.  Try it.

Let me know how you get on.

Teaching for distinction

A really interesting blog from Tom Sherrington about teaching for distinction:

Teaching for Distinction @OldhamCollege

The most exciting job I’ve had since starting out with Teacherhead Consulting has been working with Oldham College.  Principal Alun Francis approached me to explore whether there was scope in applying current thinking around teaching and learning, curriculum planning and my experience of the delivery of CPD in schools to the FE setting.   He was keen to move away from the one-off CPD day where the impact can be marginal.  He was also keen to explore the idea of a ‘powerful knowledge’ curriculum in FE, not least because so many technical and vocational courses are moving to include examined components.

Working with Alun and Rachel Irving, the Head of Teaching and Learning,  I spent a few days in Oldham talking to a range of members of staff – tutors, Heads of Faculty, members of the senior leadership team.  I observed a few lessons and got a feel for the context.  FE is radically different to school in some ways:  there are huge plumbing workshops, rows of painting and decorating booths, hair and beauty salons, design studios.  Oldham even has the front half of an aeroplane parked outside. Some students are engaged on 100% work-based learning programmes where they receive visits from assessors.  The Maths and English departments are dealing with over 1000 students all taking the same GCSE resit course.   As well as the scale, the language is different: it’s all about learners, tutors and ‘quals’.

But, the basic business of teaching students so that they succeed is the same. Students need structure, guidance, support, quality instruction, high expectations, feedback, chances to improve. They also need to acquire knowledge. Teaching for Distinction has a clear double meaning. We want students to reach Distinction in their BTECs and other qualifications; we also want tutors to teach with distinction, using evidence-informed wisdom about effective practice to design and deliver a successful curriculum for all learners.

We designed the programme around some core texts – Doug Lemov’s Teach Like a Champion, which the college was already using, and the excellent Didau/Rose Psychology book, which captures so much of the research evidence in an accessible format. We’ve also used some firm favourite resources for CPD such as the Austin’s Butterfly video, the Learning Scientists six strategies, the Rosenshine Principles of Instruction and the Tharby/Allison Making Every Lesson Count flow diagram.

 

 

I will report back on our progress, but here is an outline of what we’re doing. There are six teaching and learning modules:

The programme is designed so that it follows best practice, blending external input from me with regular CPD sessions every fortnight or so in between.  This will allow each faculty to design its own tailored programme so that the common learning is interpreted in the context of the needs of learners in specific technical disciplines.  So far we’ve planned up to the end of February 2018 but it will continue beyond that:

Part of the programme has been training for faculty leaders on running effective CPD sessions.  We’ve borrowed the structure from Dylan Wiliam’s ideas about teacher learning communities:

To give you a flavour of the content, here are some of the unit outlines and the headline course overview:

I’d like to say a huge thank you to Alun, Rachel, Nick, Roger and all the other members of the Oldham College team who have made me feel so welcome. The first sessions for faculty leaders went really well and I’m very excited about returning to deliver the first round of training for staff.

If you work in FE and would like to talk to me about working with you, please get in touch.

Here is the infographic produced for us by Oliver Caviglioli.

Three Assessment Butterflies – A review from Michaela

 

Winston Churchill once said that ‘success is stumbling from failure to failure without losing enthusiasm.’

Looking back now on assessment in our first year at Michaela, I can now see what I was blind to then: we stumbled and blundered. What mistakes did we make, and how did we stumble?

We spent hours marking. We spent ages inputting data. And we didn’t design assessments cumulatively.

  1. Marking

First mistake: we spent exorbitant amounts of time in the first year marking, in particular marking English and History essays and paragraphs. We wrote comments, we set targets, we tried individualised icons, we corrected misspellings, we corrected grammatical errors, we judged and scored written accuracy, we wrote and shared rubrics with pupils. We spent hours every week on this. Over the year, we must have spent hundreds of hours on it.

The hidden pitfall of marking is opportunity cost. Every hour that a teacher spends marking is an hour they can’t spend on renewable resourcing: resourcing that endures for years. Marking a book is useful for one pupil, once only; creating a knowledge organiser is useful for every pupil (and every teacher) that ever uses it again. Marking is a hornet; knowledge organisers are a butterfly. Hornets are high-effort, low-impact; butterflies are high-impact, low-effort. We had been blind to just how badly the hornet’s nest of marking was stinging us. So we cut marking altogether and now no longer mark at all.

  2. Data

Our second mistake: we spent far too much time in the first few years on data input. We typed in multiple scores for pupils that we didn’t use. Preoccupied by progress, we thought we needed as many numbers as we could get our hands on. But the simplistic equation of ‘more data, better progress’ didn’t hold up under scrutiny. Every teacher typed in multiple scores for each assessment, which were then collated so we could analyse the breakdowns. We were deluged in data, but thirsting for insight. There was far too much data to possibly act on. My muddled thinking left us mired in mediocrity, and we had invested hundreds of hours for little long-term impact.

What we realised is this: data must serve teachers, rather than teachers serving data. Our axiom now is that we must only collect data that we use. There’s no point in drowning in data, or killing ourselves to input data that we don’t use.

  3. Design

Our third mistake was this: we had forgotten about forgetting. We designed end-of-unit assessments that tested what pupils had only just learned, and then congratulated ourselves when they did well whilst it was very fresh in the memory. We had pupils write essays just after they had finished the unit. We coached them to superb performances – but they were performances that they would not be able to repeat on that text in English or that period of History even a few weeks later. Certainly, months later, they wouldn’t stand a chance. Just as if you asked me to retake my Physics GCSE tomorrow, I would flunk it badly, so just one year on, our pupils would flunk the exact assessment that they had aced one year earlier.

With hindsight, these three mistakes – on marking, data and design – helped us realise our two great blind spots in assessment: workload and memory. We didn’t design our assessments with pupils’ memory and teachers’ workload in mind.

We were creating unnecessary and unhelpful workload for teachers that prevented them from focusing on what matters most. Marking and data were meant to improve teaching and assessment, but teaching and assessment had ended up being inhibited by them.

We were forgetting just how much our pupils were forgetting. Forgetting is a huge problem amongst pupils and a huge blind spot in teaching. If pupils have forgotten the Shakespeare play they were studying last year, can they really be said to have learned it properly? What if they can’t remember the causes or course of the war they studied last year in history? Learning is for nothing if it’s all forgotten.

 

The Battle of the Bridge

Assessment is the bridge between teaching and learning. There’s always a teaching-learning gap. Just because we’ve taught it, it doesn’t mean pupils have learned it. The best teachers close the teaching-learning gap so that their pupils learn – and remember rather than forget – what they are being taught. We’ve found the idea of assessment as a bridge to be a useful analogy for curriculum and exam design. Once you see assessment as a bridge, you can begin to ask new questions that generate new insights: what principles in teaching are equivalent to the laws of physics that underpin the engineering and construction of the bridge? How can we design and create a bridge that is built to endure? How can we create an assessment model that bridges the teaching-learning gap?

We’ve found 3 assessment solutions that have exciting potential. Here are the reasons I’m excited about them:

  • They have absolutely no cost.
  • They are low-effort for staff to create.
  • They have high impact on pupils’ learning.
  • They are not tech-dependent at all.
  • They are based on decades of scientific research.
  • They can be immediately implemented by any teacher on Monday morning.
  • They have stood the test of time at Michaela over the last three years.

I anticipate we’ll still be using them in three, six and even ten years’ time, and beyond.

In short: no cost, low effort, high impact, research-based, long-term solutions.

 

Three of the most effective assessment tools we’ve found for closing the teaching-learning gap are daily recaps, weekly quizzes and knowledge exams.

Over 100 years of scientific research evidence suggests that the testing effect has a powerful impact on remembering and forgetting. If pupils are to remember and learn what we teach them in the subject curriculum, assessment must be cumulative and revisit curriculum content. The teaching-learning gap gets worse if pupils forget what they’ve learned. As cognitive science has shown, ‘if nothing has been retained in long-term memory, nothing has been learned’. Assessment, by ensuring pupils revisit what they’re learning, can help ensure they remember it.

Pupils forget very swiftly. We use daily recaps, weekly quizzes and biannual knowledge exams to boost pupils’ long-term memory retention and prevent forgetting.

 

  1. Daily recaps

Daily recaps are a butterfly: low-effort, high-impact. Departments create recap questions for every single lesson. Every single lesson starts with a recap. They are easy to resource. They consolidate pupils’ learning so they don’t forget. Every day pupils spend up to 20 minutes in each lesson applying what they’ve learned before. In English, for example, we spend those 20 minutes on grammar recaps, spelling recaps, vocabulary recaps, literature recaps (with questions on characters, themes, plots, devices and context). We do recaps on the unit they have been studying over the last few weeks. We do recaps on the previous unit and previous year’s units. This daily habit builds very strong retention and motivation: pupils feel motivated because they see how much they are remembering and how much more they are learning than ever before. All recaps are open questions, and weaker forms might be given clues. The recaps are always written; they are no-stakes, without any data being collected; they give instant feedback, as they are swiftly marked, corrected and improved by pupils themselves. We ask pupils afterwards: ‘Hands up who got 4 out of 5? Hands up who got 5 out of 5, 100%?’ Pupils achieving 100% feel successful and motivated to work hard to revise.

 

  2. Weekly Quizzes

Weekly quizzes are a butterfly: low-effort on workload, high-impact on learning. Departments create quiz questions for every week in the school year. Every week there is a quiz in every subject. They are easy to resource. They challenge and test pupils’ understanding. They are mastery tests, where most pupils should be able to achieve a strong result.

We have dramatically, decisively simplified how teachers score them. Instead of marking every single question laboriously, teachers simply sort them into piles. They make swift judgement calls about whether each pupil’s quiz is a pass, excellent, or fail. Each judgement is a simple scan of the pupil’s quiz paper and a decision as to which of the three piles it should be in. Accuracy isn’t perfect, but nor does it need to be: there are diminishing returns to perfecting accuracy.

The data is then inputted in 30 seconds into a beautifully simple tracker. Any pupil failing often is red-flagged, so teachers can focus in lessons on pupils who are struggling. And that is the only data point that our teachers have to keep in mind: which pupils are struggling most?
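A tracker that simple could live in any spreadsheet, but to make the idea concrete, here is an illustrative sketch – the pupil names and the flagging threshold are invented, not Michaela’s actual system:

```python
from collections import Counter

def red_flags(results, max_fails=2):
    """Given (pupil, judgement) pairs from the three-pile sort
    ('pass', 'excellent', 'fail'), return pupils whose fail count
    reaches the threshold - the one data point teachers keep in mind."""
    fails = Counter(pupil for pupil, judgement in results
                    if judgement == "fail")
    return sorted(pupil for pupil, n in fails.items() if n >= max_fails)

# A term's worth of quiz judgements for a hypothetical class
results = [("Asha", "pass"), ("Billy", "fail"), ("Asha", "excellent"),
           ("Billy", "fail"), ("Carla", "fail"), ("Carla", "pass")]
assert red_flags(results) == ["Billy"]
```

The design choice worth noticing is what the tracker throws away: individual question scores never get recorded, only the pile each paper landed in, which is why the input takes seconds rather than evenings.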

 

  3. Knowledge Exams

Knowledge exams are another butterfly – high-impact, low-effort. What I love about our knowledge exams is that they are cumulative, so that pupils revise and remember what they’ve learned. We have exam weeks twice yearly, in January and July (not half-termly). We set GCSE-style exams for depth, and we set knowledge exams to test a much fuller breadth of the knowledge pupils have learned. Knowledge exams are 35-question exams that take 60 minutes to complete. They are beautifully simple: they are organised onto 1 sheet of A4 paper, and they can be answered by pupils on one double-sided piece of A4. The breadth we can achieve with these exams is staggering. By Year 9, we have 3 knowledge exams in each of History, Religion, Science and English alone; these include 35 questions on what pupils learned in Year 7 and 35 questions on what pupils learned in Year 8, centred on those years’ knowledge organisers. Twice a year, pupils are challenged to revise and remember what they’ve learned over all the years they’ve spent in secondary school. This means they answer 12 knowledge exams – over 400 questions in total across 4 subjects. I am willing to bet that many of our teachers could not beat even our Year 7 pupils on these exams across all subjects! Imagine more than 24 sides of A4 packed with answers from every pupil in the school. The humble knowledge exam is a great catcher of knowledge.

As for marking them? We simply sort them into three piles: excellent, pass and fail. We don’t even record the marks. Teachers just note the names of pupils who failed multiple knowledge exams so we know who’s struggled.

Knowledge exams solve the breadth-depth tradeoff in exams. They give pupils maximum practice with minimum marking burden on teachers.

Simplicity must cut through assessment complexity. We should practise what we preach on cognitive overload for teachers as well as pupils. Assessment resources must be renewable, replicable, sustainable, scalable, enduring, long-term.

And the impact of recaps, quizzes and knowledge exams? We’ve had several (very weak) Y8 or Y9 pupils miss an entire term through unavoidable long-term illness, only to return fully remembering what they’d been taught the previous term and previous year. It’s an early indicator that the assessment strategy is bridging the teaching-learning gap and overcoming the savage forgetting curve. The real test of its impact will be GCSE results in 2019, A-level results in 2021 and university access and graduation beyond.

Blind, still

The two blind spots we’ve discovered – memory and workload – provide us with ways of interrogating our teaching and assessment practice:

  • How much are pupils remembering?
  • Where are they forgetting?
  • Where are teachers overloaded?

And I still think that we at Michaela can do more and find better ways of creating assessments with memory and workload in mind. I’m sure our pupils are not yet remembering as much as we’d like them to. I had a conversation with Jonny Porter, our Head of Humanities, just this week, about ramping up the previous-unit daily recaps we do. In this sense, even at Michaela we still feel blind on the blind spot of memory – pupils are still forgetting some of what we are teaching, and we want them to remember what they are learning for the very long term. Our ambition is that they will have learned what we’ve taught for years to come: for five, ten, twenty years.

Every day, teachers and pupils at Michaela see Churchill’s words on the wall: ‘success is never final; failure never fatal; it’s the courage that counts.’ It takes courage to radically simplify assessment – and courage to continually confront our workload and memory blind spots.

Knowledge-based scheme of work

The final piece (number 3; search the blog for the others) by Robert Peal about how to use summative assessment in a knowledge-based curriculum:

Planning a knowledge-based scheme of work. Part 3: Summative Assessment

Like many teachers, I have spent the last week marking end of year exams for Key Stage 3. Having put some thought into the design of these exams, I have – perhaps for the first time – found this to be an instructive and, dare I say it, enjoyable process.

For the sake of this blog post, I am going to focus on our Year 8 exam, covering Early Modern Britain and the Age of Encounters. All our KS3 assessments share a similar format, and you can view them here, with examples from Year 7, Year 8 and Year 9.

In the past, I have struggled to find a satisfactory format for end of year exams, falling back on the unimaginative (and unhelpful) practice of mirroring GCSE examinations. Reading Daisy Christodoulou’s Making Good Progress, and talking to colleagues at the Historical Association Conference in May, helped me narrow my focus. At WLFS, the construct we want to assess in KS3 history boils down to three outcomes (four in the case of Year 9). Do pupils have:

  1. an accurate chronological framework of the period studied?
  2. a broad knowledge of the period studied?
  3. the ability to construct well-evidenced historical arguments?
  4. the ability to comment on the purpose and usefulness of historical sources? (Year 9)

Our end of year exams now mirror those outcomes. At Year 7 and 8, the exam consists of three sections:

  • Section 1: Chronology test /5 marks.
  • Section 2: Multiple choice quiz /20 marks
  • Section 3: Essay /25 marks

Section 1: Chronology test

The chronology test for Year 8 involved linking 10 events with 10 dates. The events were chosen from a list of 25 dates included in the pupils’ revision guide, spanning from 1453 to 1721. We didn’t expect pupils to memorise all of the dates listed. But if they had a good understanding of the historical narrative, and knew some of the most important dates (such as 1588 and 1688), then they would – we hoped – be able to piece together the correct answer.

Pupils gained half a mark per correct answer. As a test item, the chronology test tended towards bifurcation: in all, 46% scored 5 out of 5, but a large percentage also clumped towards the bottom end. Next year, we need to do more to ensure a strong chronological understanding amongst all our pupils. Perhaps our pupils should memorise all 25 dates?

Section 2: Multiple choice quiz

This quizzing portion of the exam has been designed to assess the whole domain of the Year 8 curriculum, in a way that the essay question could not.

In Making Good Progress, Daisy recommends using multiple choice questions for formative assessment. Though a good idea in principle, I have found MCQs too time-consuming to create, and too cumbersome to mark, on an ongoing basis. However, for our summative end of year exam, the investment in creating MCQs was time well spent.

Once pupils completed their exams, our department entered all of the pupil answers into a question-level analysis spreadsheet (see here), so that we could see which questions pupils struggled with, and which questions pupils breezed through. Daisy suggests this is useful for highlighting pupil misconceptions, which it was. But I did wonder whether the varying success rates for different questions were more dependent on the design of the question than on the quality of pupil understanding.
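The spreadsheet itself isn't reproduced here, but the core of a question-level analysis is simple: a matrix of pupil answers against an answer key, reduced to a per-question success rate. A minimal sketch, with invented pupils and answers:

```python
# Hypothetical data: each pupil's chosen option per MCQ, plus the answer key.
answer_key = {1: "a", 2: "c", 3: "b", 4: "a"}

responses = {
    "Pupil 1": {1: "a", 2: "c", 3: "b", 4: "d"},
    "Pupil 2": {1: "a", 2: "b", 3: "b", 4: "a"},
    "Pupil 3": {1: "a", 2: "c", 3: "a", 4: "d"},
    "Pupil 4": {1: "b", 2: "c", 3: "b", 4: "d"},
}

def success_rates(key, responses):
    """Percentage of pupils answering each question correctly."""
    rates = {}
    for q, correct in key.items():
        n_right = sum(1 for answers in responses.values() if answers[q] == correct)
        rates[q] = round(100 * n_right / len(responses))
    return rates

rates = success_rates(answer_key, responses)
hardest = min(rates, key=rates.get)  # the question to scrutinise first
```

With this invented data, question 4 comes out hardest at 25%, which is exactly the kind of question where you then ask: poor pupil understanding, or close distractors?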

MCQ question level analysis

For example, this was the most challenging question for our pupils.

4. Which Catholic martyr did Henry VIII execute for refusing to give up his religion?
a. Thomas More
b. Thomas Wolsey
c. Thomas Cromwell
d. Thomas Cranmer
Success rate: 39%

The low success rate clearly has a lot to do with the proximity of the distractors: parents at the end of the fifteenth century really liked the name ‘Thomas’.

Question 4 tested an item of declarative knowledge, but the next most challenging question for our Year 8 pupils probed their understanding on a more conceptual level, in the way that Christodoulou argues MCQs are well equipped to do.

20. How was the power of Georgian Kings further limited by ‘Parliamentary government’?
a. The king was not allowed to be a Catholic
b. The king could only choose ministers who had the support of Parliament
c. The king could not start wars without Parliament’s permission
d. Parliament had the power to appoint and dismiss the king
Success rate: 44%

The question hinged on the word ‘further’, and required pupils to discriminate between the outcomes of the Glorious Revolution in 1688, and the outcomes of the development of Parliamentary Government under George I. At the other end of the scale, the question pupils found easiest did surprise me.

17. What title was Oliver Cromwell given to rule England in 1653?
a. Lord Protector
b. King
c. Prime Minister
d. Lord Chancellor
Success rate: 98%

I thought I was on to something, as many pupils had written about Cromwell as ‘King’ during the year. But by the time of the exam, not a single pupil chose that distractor. Three did choose ‘Lord Chancellor’. Again, the question with the second highest success rate was not one I thought particularly easy when writing it:

18. What did the Bill of Rights do?
a. secured the legal rights of Parliament and limited the monarch’s power
b. banned the monarchy, and established England as a Commonwealth
c. gave equal political rights to all people in England
d. united England and Scotland into a single Kingdom
Success rate: 94%

But our analysis shows this question was simply too easy, and the distractors too dissimilar. Perhaps the most helpful outcome of this question level analysis has been to home in on which questions worked well, and which did not – allowing us to refine the writing of the MCQs in years to come.

Section 3: Essay

Lastly, we set a mini-essay, with clear instructions that pupils were to write three paragraphs: two sides of an argument and a conclusion. With around half an hour to complete the essay, this seemed like a reasonable demand.

Throughout the year, our Year 8 pupils wrote five essays. They were on Henry VIII and the Reformation; the Age of Encounters; the Later Tudors; the English Civil War; and the late Stuarts/early Georgians. We chose two essay questions from these five units, based on the same enquiry as the earlier essay question. We were not interested in tripping up pupils with fiendishly difficult questions. Rather, we wanted straightforward essay questions that gave pupils the best chance of marshalling their knowledge to support a reasoned historical argument. The two questions pupils had the choice of answering were:

  1. ‘The invention of the Printing Press was the most important event that took place in Early Modern Europe.’ To what extent do you agree with this statement?
  2. ‘Charles I only had himself to blame for his execution in 1649’. To what extent do you agree with this statement?

To mark the essays, individual teachers grouped them according to whether they were A* to E in quality. We then met as a department, saw how consistent the judgements were, made some adjustments, and assigned a numerical mark out of 25 to each script. It was a low tech version of comparative judgement, which seemed to work pretty well.

The correlation between pupil outcomes in the multiple choice questions and pupil outcomes in the essay was 0.7. Most helpfully, this highlights for our department those pupils who understand what we study, but still struggle with written work.

Correlation

Next year, we will use a selection of this year’s essays as exemplification material for each grade band, replacing the need for a mark scheme.

So that you can see how WLFS pupils are getting on with a knowledge-based curriculum, here are the Year 8 exemplification essays we will use. Each grade band contains three exemplar essays. Having been written under timed conditions, during exam week, on an unseen question, the essays did dip in quality compared with those pupils wrote throughout the year. However, I was still pleased with the way in which pupils were able to organise their knowledge into convincing historical arguments.

There is still much to work on (particularly on the explicit teaching of different lines of argument – more to follow), but I am happy that our KS3 curriculum is now equipping pupils with a deep well of powerful knowledge to inform their historical thinking.

A star-grade exemplars

A-grade exemplars

B-grade exemplars

C-grade exemplars

HoDs – how to improve teaching in your departments…

A useful overview by Improving Teaching:

Improving teaching and learning: ideas for heads of department

There’s a good case to be made that better teaching and learning is best achieved by departments.  Some things can only be solved at a whole-school level, such as behaviour; others, like lesson planning, can perhaps best be addressed by individual teachers.  But it is the department which influences teaching and learning most (Aubrey-Hopkins and James, 2002); it is departments which become the focus for improvement as a school improves (Chapman, 2004).  Teachers of different subjects think and interact in different ways (Grossman, Wineburg and Woolworth, 2001; Spillane, 2005): the shared practice of their discipline makes departments distinct “communities of practice (Harris, 2001; Wenger, 2000, p.229).”  Professional learning communities, collegial bodies improving teaching and learning, are usually found in departments (McLaughlin and Talbert, 2001).  So how can departments go about improving teaching and learning?

How: the department as professional learning community

Collective responses to the fundamental challenges facing teachers – What to teach?  How best to teach it? – are more powerful than individual ones.  It is in the department where the requisite expertise can be shared (Aubrey-Hopkins and James, 2002):

“First, it is assumed that knowledge is situated in the day-to-day lived experiences of teachers and best understood through critical reflection with others who share the same experience (Buysse, Sparkman, & Wesley, 2003). Second, it is assumed that actively engaging teachers in [professional learning communities] will increase their professional knowledge and enhance student learning (Vescio, Ross and Adams, 2007).”

Departments in which students learn more tend towards collegiality, relational trust, teacher learning, shared decision making and a culture of collaboration in which practice is ‘deprivatised’ (Bubb and Earley, 2004; Bryk and Schneider, 2002; Vescio, Ross and Adams, 2007).  Perhaps it is unsurprising, therefore, that:

“The use of professional learning communities as a means to improve teaching practice and student achievement is a move that educators support and value, as indicated by teachers’ perceptions of impact (Vescio, Ross and Adams, 2007).”

Harnessing teachers’ collective knowledge and experience should improve student learning in the present and help teachers improve in the longer-term: but what is the focus for this collegiality to be?

Professional learning towards what?

Collegial communities are only useful if we know what we want (Wiliam, 2007).  Departments can work for and against change (Brown et al, 2000; McLaughlin and Talbert, 2001), thus while a head of department may need to develop collegiality (Harris, 2004), they also need to focus on core goals (Spillane, 2005) and maintain coherence (Sergiovanni, 2005).  Dylan Wiliam has argued that:

“Leaders who are serious about improving the outcomes for students in their schools have to develop the use of formative assessment, both retrospectively, as a way of ensuring that students do not fall behind, and also prospectively, as a way of increasing the pedagogical skills of teachers in the school (2016: p.126).”

A review of effective professional learning communities found one feature stood out:

“A persistent focus on student learning and achievement by the teachers in the learning communities.  All eight studies documented that the collaborative efforts of teachers were focused on meeting the learning needs of their students (Vescio, Ross and Adams, 2007).”

Conversely:

“In the communities where teachers worked together but did not engage in structured work that was highly focused around student learning, similar gains were not evident (Vescio, Ross and Adams, 2007).”

A study of one such professional learning community found that its power lay in the use of assessment to connect “the instructional choices that teachers make and the learning outcomes of students.”  This “helped teachers reflect on their instructional approaches and gain insight into the levels of understanding of their students”, and led to changes in their teaching – as identified by external observers – and small, but statistically significant, improvements in student learning (Supovitz, 2013).  Another study contrasted teachers meeting to discuss teaching and meeting to discuss student work: teachers discussed teaching at length, but this left little chance for critical discussion or insight; when discussing student work, teachers:

“Were constantly monitoring the extent to which there were connections between students’ overarching, long-term learning goals, the materials used to assess these goals, and students’ related performance (Popp and Goldman, 2016).”

Putting this into practice

How best to achieve this depends on a department’s staff, its existing resources (particularly curriculum) and the time available.  Three approaches stand out:

Collaborating over what to teach

Fundamental questions can be addressed more productively by subject teams.  Individual teachers have a range of subject knowledge for teaching and of experiences; the challenge is in creating a structure in which to share this productively.  A department can be asked to create a collective resource for a unit by debating and agreeing:
• The critical knowledge to learn
• Common student misconceptions
• Useful images, sources and representations to convey key points
• Links to be made to other topics (revision and foreshadowing future learning)
• Effective ways to sequence learning
This allows every teacher to share their knowledge and experience, but avoids creating a straitjacket, since teachers can use the resulting resource flexibly, to suit them and their classes.  (For more on these ideas as a basis for unit planning, and for a template, visit this post).

Collaborating over how to improve teaching

The aim is not to force teachers to teach identically, but to catalyse reflection by individual teachers and the sharing of effective approaches.  There are two stages:

  1. Creating a common measure
    Collective reflection requires something in common, which every teacher can reflect upon.  The obvious answer would seem to be exam reviews, but for the reasons advanced by Daisy Christodoulou (2017), this is unlikely to be particularly helpful, because a summative assessment (or mock) tells you very little about where the gaps in students’ knowledge actually lie: a student may struggle with a specific question for a dozen reasons.  Instead I’d suggest exit tickets and multiple-choice questions (more below).
  2. Collective reflection
    With these shared tools we can examine the variation in student responses and try to explain how it came about:  How did each teacher explain the topic?  How did they allocate time differently?  What metaphors did they use when students became stuck?  By beginning with a question (why did some students answer this question well, and others poorly?), teachers can “move beyond merely sharing what happened in lessons to critical reflection on the teaching-learning process (Popp and Goldman, 2016).”  Contrasting their approaches with those of their colleagues should allow teachers to reflect more carefully and more productively, leaving them open to adopting productive ideas willingly.

Two techniques lend themselves to this in particular:

Collective review of an agreed piece of student work.

Teachers can design half a dozen common exit tickets for a unit, then compare what students learned (or, if not exit tickets, any agreed piece of student work).  Collective analysis of exit tickets should lead to fruitful discussions, focusing on what each teacher did differently and the effect this had on student learning.  Question prompts might include:

  • Where did most students struggle?
  • What did most students manage well?
  • How do student answers differ between classes?

This helps teachers focus upon “the substance represented by the data” and hence “reflect on… instructional approaches and gain insight into the levels of understanding of their students” rather than diverting them “to acquire new analytic skills to make sense of the data (Supovitz, 2013).”

Create, use and analyse multiple-choice questions

Collectively developed multiple-choice questions have a range of functions.  Designing good multiple-choice questions is time-consuming and relies on good knowledge of student misconceptions and how students might interpret the questions. Collectively designing half a dozen multiple-choice questions spreads the work involved and allows teachers to share their knowledge of misconceptions.  These questions can be used as hinge questions within the lesson, or as exit tickets, or at any other stage in the learning process (Millar and Hames, 2003, show a range of ways teachers have used multiple-choice questions effectively).  Reflection afterwards could lead to similar discussions to those conducted with exit tickets, and could also allow the revision and extension of the questions, creating a growing collection.
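One way to make the growing collection cumulative is to store each question with its distractors tagged by the misconception they probe, so the shared bank can be filtered when planning a hinge question. The structure and fields below are an assumption, sketched around the Cromwell question quoted earlier, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class MCQ:
    """One collectively written multiple-choice question.

    `distractors` maps each wrong option to the misconception it probes,
    so post-hoc analysis can say *why* pupils chose it.
    """
    stem: str
    correct: str
    distractors: dict   # wrong answer -> misconception probed
    topic: str
    uses: list = field(default_factory=list)  # "hinge", "exit ticket", ...

bank = [
    MCQ(
        stem="What title was Oliver Cromwell given to rule England in 1653?",
        correct="Lord Protector",
        distractors={
            "King": "Cromwell's rule conflated with monarchy",
            "Lord Chancellor": "confusion with Tudor court offices",
        },
        topic="English Civil War",
    ),
]

# Filter the shared bank by topic when planning a lesson's hinge question:
civil_war_qs = [q for q in bank if q.topic == "English Civil War"]
```

Tagging distractors this way means the question-level analysis of pupil answers reads back as a list of misconceptions, not just percentages.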

Making it work

This approach to improving teaching and learning in departments fulfils many of the requirements proposed by the literature on effective professional development (for example: Cordingley et al., 2015; Desimone, 2009; Timperley, 2008), including active learning, a collegial approach and a focus on subject knowledge.  The remaining proposed requirements include ensuring that the approach:
• Is sustained
• Surfaces existing beliefs
• Has (and keeps) leadership support.

If you’re doing something like this now, I’d be fascinated to read or hear more.

What to read next?

References

Aubrey-Hopkins, Judith and James, Chris (2002) ‘Improving Practice in Subject Departments: the experience of secondary school subject leaders in Wales’, School Leadership & Management, 22: 3, 305-320

Brown, Marie, Rutherford, Desmond and Boyle, Bill (2000) ‘Leadership for School Improvement: The Role of the Head of Department in UK Secondary Schools’, School Effectiveness and School Improvement, 11: 2, 237-258

Bryk, A. and Schneider, B. (2002). Trust in schools. New York: Russell Sage Foundation.

Bubb, S. and Earley, P. (2004), ‘Why is managing change not easy?’ Managing Teacher Workload: Workload and Wellbeing, London: PCP/Sage

Chapman, Christopher (2004) ‘Leadership for Improvement in Urban and Challenging Contexts’, London Review of Education, 2: 2, 95-108

Christodoulou, D. (2017) Making Good Progress: The Future of Assessment for Learning. Oxford, OUP.

Cordingley, P., Higgins, S., Greany, T., Buckler, N., Coles-Jordan, D., Crisp, B., Saunders, L., Coe, R. (2015) Developing Great Teaching: Lessons from the international reviews into effective professional development. Teacher Development Trust.

Desimone, L. (2009) Improving Impact Studies of Teachers’ Professional Development: Toward better Conceptualizations and Measures. Education Researcher 38(3) 181-199

Grossman, P., Wineburg, S., & Woolworth, S. (2001). Toward a Theory of Teacher Community. The Teachers College Record, 103, 942-1012.

Harris, Alma (2001) ‘Department Improvement and School Improvement: A missing link?’, British Educational Research Journal, 27: 4, 477-486

Harris, Alma (2004) ‘Distributed Leadership and School Improvement : Leading or Misleading?’ Educational Management Administration & Leadership, 32: 11-26

Popp, J. and Goldman, S. (2016). Knowledge building in teacher professional learning communities: Focus of meeting matters. Teaching and Teacher Education, 59, pp.347-359.

McLaughlin, M. and Talbert, J. (2001). Professional communities and the work of high school teaching. Chicago: University of Chicago Press.

Millar, R., Hames, V. (2003) Using Diagnostic Assessment to Enhance Teaching and Learning: A Study of the Impact of Research-informed Teaching Materials on Science Teachers’ Practices. Evidence-based Practice in Science Education (EPSE) Research Network.

Sergiovanni, Thomas (2003), ‘A Cognitive Approach to Leadership’, in Brent Davies and John West-Burnham (ed.), Handbook of Educational Leadership and Management, Pearson: London, Ch. 2

Spillane, J. P. (2005) Primary school leadership practice: how the subject matters, School Leadership & Management, 25: 4, 383-397

Supovitz, J. (2013) The Linking Study: An Experiment to Strengthen Teachers’ Engagement With Data on Teaching and Learning. CPRE Working Papers.

Timperley, H. (2008). Teacher professional learning and development. Educational Practices (18). International Academy of Education.

Vescio, V., Ross, D., Adams, A. (2007) A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education 24. 80–91.

Wenger, Etienne (2000), ‘Communities of Practice and Social Learning Systems’, Organization, 7: 225-247

Wiliam, D. (2007) Content then process: teacher learning communities in the service of formative assessment. In D. B. Reeves (Ed.), Ahead of the curve: the power of assessment to transform teaching and learning (pp. 183-204). Bloomington, IN: Solution Tree.

Wiliam, D. (2016) Leadership for Teacher Learning: Creating a Culture Where All Teachers Improve So That All Students Succeed. Learning Sciences International.
