A really interesting blog from Tom Sherrington about teaching for distinction:
Teaching for Distinction @OldhamCollege
The most exciting job I’ve had since starting out with Teacherhead Consulting has been working with Oldham College. Principal Alun Francis approached me to explore whether there was scope in applying current thinking around teaching and learning, curriculum planning and my experience of the delivery of CPD in schools to the FE setting. He was keen to move away from the one-off CPD day where the impact can be marginal. He was also keen to explore the idea of a ‘powerful knowledge’ curriculum in FE, not least because so many technical and vocational courses are moving to include examined components.
Working with Alun and Rachel Irving, the Head of Teaching and Learning, I spent a few days in Oldham talking to a range of members of staff – tutors, Heads of Faculty, members of the senior leadership team. I observed a few lessons and got a feel for the context. FE is radically different to school in some ways: there are huge plumbing workshops, rows of painting and decorating booths, hair and beauty salons, design studios. Oldham even has the front half of an aeroplane parked outside. Some students are engaged on 100% work-based learning programmes where they receive visits from assessors. The Maths and English departments are dealing with over 1000 students all taking the same GCSE resit course. As well as the scale, the language is different: it’s all about learners, tutors and ‘quals’.
But, the basic business of teaching students so that they succeed is the same. Students need structure, guidance, support, quality instruction, high expectations, feedback, chances to improve. They also need to acquire knowledge. Teaching for Distinction has a clear double meaning. We want students to reach Distinction in their BTECs and other qualifications; we also want tutors to teach with distinction, using evidence-informed wisdom about effective practice to design and deliver a successful curriculum for all learners.
We designed the programme around some core texts – Doug Lemov’s Teach Like a Champion, which the college was already using, and the excellent Didau/Rose Psychology book, which captures so much of the research evidence in an accessible format. We’ve also used some firm favourite resources for CPD such as the Austin’s Butterfly video, the Learning Scientists six strategies, the Rosenshine Principles of Instruction and the Tharby/Allison Making Every Lesson Count flow diagram.
I will report back on our progress, but here is an outline of what we’re doing. There are six teaching and learning modules:
The programme is designed so that it follows best practice, blending external input from me with regular CPD sessions every fortnight or so in between. This will allow each faculty to design its own tailored programme so that the common learning is interpreted in the context of the needs of learners in specific technical disciplines. So far we’ve planned up to the end of February 2018 but it will continue beyond that:
Part of the programme has been training for faculty leaders on running effective CPD sessions. We’ve borrowed the structure from Dylan Wiliam’s ideas about teacher learning communities:
To give you a flavour of the content, here are some of the unit outlines and the headline course overview:

Thanks
I’d like to say a huge thank you to Alun, Rachel, Nick, Roger and all the other members of the Oldham College team who have made me feel so welcome. The first sessions for faculty leaders went really well and I’m very excited about returning to deliver the first round of training for staff.
If you work in FE and would like to talk to me about working with you, please get in touch.
Here is the infographic produced for us by Oliver Caviglioli. Modules 1-3 infographic v2
The final piece (number 3, but search the blog for the others) about how to use summative assessment in a knowledge-based curriculum, by Robert Peal:
Planning a knowledge-based scheme of work. Part 3: Summative Assessment
Like many teachers, I have spent the last week marking end of year exams for Key Stage 3. Having put some thought into the design of these exams, I have – perhaps for the first time – found this to be an instructive and, dare I say it, enjoyable process.
For the sake of this blog post, I am going to focus on our Year 8 exam, covering Early Modern Britain and the Age of Encounters. All our KS3 assessments share a similar format, and you can view them here, with examples from Year 7, Year 8 and Year 9.
In the past, I have struggled to find a satisfactory format for end of year exams, falling back on the unimaginative (and unhelpful) practice of mirroring GCSE examinations. Reading Daisy Christodoulou’s Making Good Progress, and talking to colleagues at the Historical Association Conference in May, helped me narrow my focus. At WLFS, the construct we want to assess in KS3 history boils down to three outcomes (four in the case of Year 9). Do pupils have:
- an accurate chronological framework of the period studied?
- a broad knowledge of the period studied?
- the ability to construct well-evidenced historical arguments?
- the ability to comment on the purpose and usefulness of historical sources? (Year 9)
Our end of year exams now mirror those outcomes. At Year 7 and 8, the exam consists of three sections:
- Section 1: Chronology test /5 marks.
- Section 2: Multiple choice quiz /20 marks
- Section 3: Essay /25 marks
Section 1: Chronology test
The chronology test for Year 8 involved linking 10 events with 10 dates. The events were chosen from a list of 25 dates included in the pupils’ revision guide, spanning from 1453 to 1721. We didn’t expect pupils to memorise all of the dates listed. But if they had a good understanding of the historical narrative, and knew some of the most important dates (such as 1588 and 1688), then they would – we hoped – be able to piece together the correct answer.
Pupils gained half a mark per correct answer. As a test item, the chronology test tended towards bifurcation: in all, 46% scored 5 out of 5, but a large percentage clumped towards the bottom end. Next year, we need to do more to ensure a strong chronological understanding amongst all our pupils. Perhaps our pupils should memorise all 25 dates?
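The half-mark scoring is simple enough to sketch in a few lines. The key and pupil response below are invented examples for illustration, not the real test items (the actual test matched 10 events to 10 dates):

```python
# A sketch of the half-mark scoring, assuming answers are stored as
# event -> date mappings. The key and pupil response here are invented
# illustrations, not the real test items.
def score_chronology(answers, key, mark_per_item=0.5):
    """Award half a mark for each event correctly matched to its date."""
    return sum(mark_per_item for event, date in answers.items()
               if key.get(event) == date)

key = {"Fall of Constantinople": 1453, "Spanish Armada": 1588,
       "Glorious Revolution": 1688, "Act of Union": 1707}
pupil = {"Fall of Constantinople": 1453, "Spanish Armada": 1588,
         "Glorious Revolution": 1688, "Act of Union": 1721}

print(score_chronology(pupil, key))  # 1.5: three of the four pairs are correct
```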
Section 2: Multiple choice quiz
This quizzing portion of the exam has been designed to assess the whole domain of the Year 8 curriculum, in a way that the essay question could not.
In Making Good Progress, Daisy recommends using multiple choice questions for formative assessment. Though a good idea in principle, I have found MCQs too time-consuming to create, and too cumbersome to mark, on an ongoing basis. However, for our summative end of year exam, the investment in creating the MCQs was time well spent.
Once pupils completed their exams, our department entered all of the pupil answers into a question-level analysis spreadsheet (see here), so that we could see which questions pupils struggled with, and which questions pupils breezed through. Daisy suggests this is useful for highlighting pupil misconceptions, which it was. But I did wonder whether the varying success rates for different questions were more dependent on the design of the question than on the quality of pupil understanding.
For example, this was the most challenging question for our pupils.
4. Which Catholic martyr did Henry VIII execute for refusing to give up his religion?
a. Thomas More
b. Thomas Wolsey
c. Thomas Cromwell
d. Thomas Cranmer
Success rate: 39%
The low success rate clearly has a lot to do with the proximity of the distractors: parents at the end of the fifteenth century really liked the name ‘Thomas’.
Question 4 tested an item of declarative knowledge, but the next most challenging question for our Year 8 pupils probed their understanding on a more conceptual level, in the way that Christodoulou argues MCQs are well equipped to do.
20. How was the power of Georgian Kings further limited by ‘Parliamentary government’?
a. The king was not allowed to be a Catholic
b. The king could only choose ministers who had the support of Parliament
c. The king could not start wars without Parliament’s permission
d. Parliament had the power to appoint and dismiss the king
Success rate: 44%
The question hinged on the word ‘further’, and required pupils to discriminate between the outcomes of the Glorious Revolution in 1688, and the outcomes of the development of Parliamentary Government under George I. At the other end of the scale, the question pupils found easiest did surprise me.
17. What title was Oliver Cromwell given to rule England in 1653?
a. Lord Protector
b. King
c. Prime Minister
d. Lord Chancellor
Success rate: 98%
I thought I was on to something, as many pupils had written about Cromwell as ‘King’ during the year. But by the time of the exam, not a single pupil chose that distractor. Three did choose ‘Lord Chancellor’. Again, the question with the second highest success rate was not one I thought particularly easy when writing it:
18. What did the Bill of Rights do?
a. secured the legal rights of Parliament and limited the monarch’s power
b. banned the monarchy, and established England as a Commonwealth
c. gave equal political rights to all people in England
d. united England and Scotland into a single Kingdom
Success rate: 94%
But our analysis shows this question was simply too easy, and the distractors too dissimilar. Perhaps the most helpful outcome of this question level analysis has been to home in on which questions worked well, and which did not – allowing us to refine the writing of the MCQs in years to come.
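A question-level analysis of this kind can be reproduced in a few lines of code; here is a minimal sketch, in which the question names, answer key and pupil responses are hypothetical stand-ins for the department’s spreadsheet:

```python
# Minimal question-level analysis: one dict per pupil, one entry per
# question. Question names and responses below are hypothetical.
def success_rates(responses, answer_key):
    """Return the fraction of pupils answering each question correctly."""
    rates = {}
    for question, correct in answer_key.items():
        answered = [pupil[question] for pupil in responses]
        rates[question] = sum(a == correct for a in answered) / len(answered)
    return rates

answer_key = {"Q4": "a", "Q17": "a", "Q18": "a"}
responses = [
    {"Q4": "a", "Q17": "a", "Q18": "a"},
    {"Q4": "d", "Q17": "a", "Q18": "a"},
    {"Q4": "b", "Q17": "a", "Q18": "b"},
]

rates = success_rates(responses, answer_key)
# Sorting questions by rate flags both suspect distractors (very low
# rates) and over-easy items (rates near 1.0) for revision.
```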
Section 3: Essay
Lastly, we set a mini-essay, with clear instructions that pupils were to write three paragraphs: two sides of an argument and a conclusion. With around half an hour to complete the essay, this seemed like a reasonable demand.
Throughout the year, our Year 8 pupils wrote five essays. They were on Henry VIII and the Reformation; The Age of Encounters; the Later Tudors; the English Civil War; and the late Stuarts/early Georgians. We chose two essay questions from these five units, based on the same enquiry as the earlier essay question. We were not interested in tripping up pupils with fiendishly difficult questions. Rather, we wanted straightforward essay questions that gave pupils the best chance of marshalling their knowledge to support a reasoned historical argument. The two questions pupils had the choice of answering were:
- ‘The invention of the Printing Press was the most important event that took place in Early Modern Europe.’ To what extent do you agree with this statement?
- ‘Charles I only had himself to blame for his execution in 1649’. To what extent do you agree with this statement?
To mark the essays, individual teachers grouped them according to whether they were A* to E in quality. We then met as a department, saw how consistent the judgements were, made some adjustments, and assigned a numerical mark out of 25 to each script. It was a low tech version of comparative judgement, which seemed to work pretty well.
The correlation between pupil outcomes in the multiple choice questions and pupil outcomes in the essay was 0.7. Most helpfully, this highlights for our department those pupils who understand what we study, but still struggle with written work.
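For anyone wanting to run the same check on their own marks, a Pearson correlation over the two score columns is all that is needed. The paired scores below are invented for illustration:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired scores: MCQ out of 20, essay out of 25.
mcq = [18, 15, 12, 9, 16, 7]
essay = [22, 19, 14, 10, 17, 12]

r = pearson(mcq, essay)
# A high r means the essay broadly tracks the MCQ; pupils far below the
# trend line are the ones who know the content but struggle with writing.
```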
Next year, we will use a selection of this year’s essays as exemplification material for each grade band, replacing the need for a mark scheme.
So that you can see how WLFS pupils are getting on with a knowledge-based curriculum, here are the Year 8 exemplification essays we will use. Each grade band contains three exemplar essays. Having been written under timed conditions, during exam week, on an unknown question, the quality did take a dip compared with the essays pupils have written throughout the year. However, I was still pleased with the way in which pupils were able to organise their knowledge into convincing historical arguments.
There is still much to work on (particularly on the explicit teaching of different lines of argument – more to follow), but I am happy that our KS3 curriculum is now equipping pupils with a deep well of powerful knowledge to inform their historical thinking.
A useful overview by Improving Teaching:
Improving teaching and learning: ideas for heads of department
There’s a good case to be made that better teaching and learning is best achieved by departments. Some things can only be solved at a whole-school level, such as behaviour; others, like lesson planning, can perhaps best be addressed by individual teachers. But it is the department which influences teaching and learning most (Aubrey-Hopkins and James, 2002); it is departments which become the focus for improvement as a school improves (Chapman, 2004). Teachers of different subjects think and interact in different ways (Grossman, Wineburg and Woolworth, 2001; Spillane, 2005): the shared practice of their discipline makes departments distinct “communities of practice” (Harris, 2001; Wenger, 2000, p.229). Professional learning communities, collegial bodies improving teaching and learning, are usually found in departments (McLaughlin and Talbert, 2001). So how can departments go about improving teaching and learning?
How: the department as professional learning community
Collective responses to the fundamental challenges facing teachers – What to teach? How best to teach it? – are more powerful. It is in the department where the requisite expertise can be shared (Aubrey-Hopkins and James, 2002):
“First, it is assumed that knowledge is situated in the day-to-day lived experiences of teachers and best understood through critical reflection with others who share the same experience (Buysse, Sparkman, & Wesley, 2003). Second, it is assumed that actively engaging teachers in [professional learning communities] will increase their professional knowledge and enhance student learning (Vescio, Ross and Adams, 2007).”
Departments in which students learn more tend towards collegiality, relational trust, teacher learning, shared decision making and a culture of collaboration in which practice is ‘deprivatised’ (Bubb and Earley, 2004; Bryk and Schneider, 2002; Vescio, Ross and Adams, 2007). Perhaps it is unsurprising therefore that:
“The use of professional learning communities as a means to improve teaching practice and student achievement is a move that educators support and value, as indicated by teachers’ perceptions of impact (Vescio, Ross and Adams, 2007).”
Harnessing teachers’ collective knowledge and experience should improve student learning in the present and help teachers improve in the longer term: but what should the focus of this collegiality be?
Professional learning towards what?
Collegial communities are only useful if we know what we want (Wiliam, 2007). Departments can work for and against change (Brown et al, 2000; McLaughlin and Talbert, 2001), thus while a head of department may need to develop collegiality (Harris, 2004), they also need to focus on core goals (Spillane, 2005) and maintain coherence (Sergiovanni, 2005). Dylan Wiliam has argued that:
“Leaders who are serious about improving the outcomes for students in their schools have to develop the use of formative assessment, both retrospectively, as a way of ensuring that students do not fall behind, and also prospectively, as a way of increasing the pedagogical skills of teachers in the school (2016: p.126).”
A review of effective professional learning communities found one feature stood out:
“A persistent focus on student learning and achievement by the teachers in the learning communities. All eight studies documented that the collaborative efforts of teachers were focused on meeting the learning needs of their students (Vescio, Ross and Adams, 2007).”
“In the communities where teachers worked together but did not engage in structured work that was highly focused around student learning, similar gains were not evident (Vescio, Ross and Adams, 2007).”
A study of one such professional learning community found that its power lay in the use of assessment to connect “the instructional choices that teachers make and the learning outcomes of students.” This “helped teachers reflect on their instructional approaches and gain insight into the levels of understanding of their students”, and led to changes in their teaching – as identified by external observers – and small, but statistically significant, improvements in student learning (Supovitz, 2013). Another study contrasted teachers meeting to discuss teaching and meeting to discuss student work: teachers discussed teaching at length, but this left little chance for critical discussion or insight; when discussing student work, teachers:
“were constantly monitoring the extent to which there were connections between students’ overarching, long-term learning goals, the materials used to assess these goals, and students’ related performance (Popp and Goldman, 2016).”
Putting this into practice
How best to achieve this depends on a department’s staff, its existing resources (particularly curriculum) and the time available. Three approaches stand out:
Collaborating over what to teach
Fundamental questions can be addressed more productively by subject teams. Individual teachers bring a range of subject knowledge for teaching and of experience; the challenge is in creating a structure in which to share this productively. A department can be asked to create a collective resource for a unit by debating and agreeing:
• The critical knowledge to learn
• Common student misconceptions
• Useful images, sources and representations to convey key points
• Links to be made to other topics (revision and foreshadowing future learning)
• Effective ways to sequence learning
This allows every teacher to share their knowledge and experience, but avoids creating a straitjacket, since teachers can use the resulting resource flexibly, to suit them and their classes. (For more on these ideas as a basis for unit planning, and for a template, visit this post).
Collaborating over how to improve teaching
The aim is not to force teachers to teach identically, but to catalyse reflection by individual teachers and the sharing of effective approaches. There are two stages:
- Creating a common measure
Collective reflection requires something in common, which every teacher can reflect upon. The obvious answer would seem to be exam reviews, but for the reasons advanced by Daisy Christodoulou (2017), this is unlikely to be particularly helpful, because a summative assessment (or mock) tells you very little about where the gaps in students’ knowledge actually lie: a student may struggle with a specific question for a dozen reasons. Instead I’d suggest exit tickets and multiple-choice questions (more below).
- Collective reflection
With these shared tools we can examine the variation in student responses and try to explain how it has come about: How did each teacher explain the topic? How did they allocate time differently? What metaphors did they use when students became stuck? By beginning with a question (why did some students answer this question well, and others poorly?), teachers can “move beyond merely sharing what happened in lessons to critical reflection on the teaching-learning process (Popp and Goldman, 2016).” Teachers’ ability to contrast their approaches with those of their colleagues should allow them to reflect more carefully and more productively, leaving them open to adopting productive ideas willingly.
Two techniques lend themselves to this in particular:
Collective review of an agreed piece of student work.
Teachers can design half a dozen common exit tickets for a unit, then compare what students learned (or, if not exit tickets, any agreed piece of student work). Collective analysis of exit tickets should lead to fruitful discussions, focusing on what each teacher did differently and the effect this had on student learning. Question prompts might include:
- Where did most students struggle?
- What did most students manage well?
- How do student answers differ between classes?
This helps teachers focus upon “the substance represented by the data” and hence “reflect on… instructional approaches and gain insight into the levels of understanding of their students” rather than diverting them “to acquire new analytic skills to make sense of the data (Supovitz, 2013).”
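The aggregation behind those prompts can be sketched very simply: pool every class’s results on the common exit ticket and compute a per-class, per-question success rate. The class names, items and results below are hypothetical:

```python
# Sketch: pooling a common exit ticket across classes so the department
# can compare classes question by question. All names and results here
# are hypothetical illustrations.
from collections import defaultdict

def by_class(results):
    """results: (class_name, question, correct?) triples.
    Returns {(class_name, question): fraction correct}."""
    totals = defaultdict(lambda: [0, 0])  # (class, q) -> [correct, attempts]
    for cls, question, ok in results:
        totals[(cls, question)][0] += ok
        totals[(cls, question)][1] += 1
    return {key: correct / n for key, (correct, n) in totals.items()}

results = [
    ("8A", "Q1", True), ("8A", "Q1", False), ("8A", "Q2", True),
    ("8B", "Q1", True), ("8B", "Q1", True), ("8B", "Q2", False),
]

rates = by_class(results)
# A gap such as rates[("8A", "Q1")] != rates[("8B", "Q1")] is the cue
# to ask what each teacher did differently on that topic.
```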
Create, use and analyse multiple-choice questions
Collectively developed multiple-choice questions have a range of functions. Designing good multiple-choice questions is time-consuming and relies on good knowledge of student misconceptions and how students might interpret the questions. Collectively designing half a dozen multiple-choice questions spreads the work involved and allows teachers to share their knowledge of misconceptions. These questions can be used as hinge questions within the lesson, or as exit tickets, or at any other stage in the learning process (Millar and Hames, 2003, show a range of ways teachers have used multiple-choice questions effectively). Reflection afterwards could lead to similar discussions to those conducted with exit tickets, and could also allow the revision and extension of the questions, creating a growing collection.
Making it work
This approach to improving teaching and learning in departments fulfils many of the requirements proposed by the literature on effective professional development (for example: Cordingley et al., 2015; Desimone, 2009; Timperley, 2008), including active learning, a collegial approach and a focus on subject knowledge. The remaining proposed requirements include ensuring that the approach:
• Surfaces existing beliefs
• Has (and keeps) leadership support.
If you’re doing something like this now, I’d be fascinated to read or hear more.
What to read next?
- What makes effective professional development for a school?
- A classroom teacher’s guide to formative assessment
- Using exit tickets to assess and plan: the tuning fork of teaching
Aubrey-Hopkins, Judith and James, Chris (2002) ‘Improving Practice in Subject Departments: the experience of secondary school subject leaders in Wales’, School Leadership & Management, 22: 3, 305-320
Brown, Marie, Rutherford, Desmond and Boyle, Bill (2000) ‘Leadership for School Improvement: The Role of the Head of Department in UK Secondary Schools’, School Effectiveness and School Improvement, 11: 2, 237-258
Bryk, A. and Schneider, B. (2002). Trust in schools. New York: Russell Sage Foundation.
Bubb, S. and Earley, P. (2004), ‘Why is managing change not easy?’ Managing Teacher Workload: Workload and Wellbeing, London: PCP/Sage
Chapman, Christopher (2004) ‘Leadership for Improvement in Urban and Challenging Contexts’, London Review of Education, 2: 2, 95-108
Cordingley, P., Higgins, S., Greany, T., Buckler, N., Coles-Jordan, D., Crisp, B., Saunders, L., Coe, R. (2015) Developing Great Teaching: Lessons from the international reviews into effective professional development. Teacher Development Trust.
Harris, Alma (2001) ‘Department Improvement and School Improvement: A missing link?’, British Educational Research Journal, 27: 4, 477-486
Harris, Alma (2004) ‘Distributed Leadership and School Improvement: Leading or Misleading?’ Educational Management Administration & Leadership, 32: 11-26
McLaughlin, M. and Talbert, J. (2001). Professional communities and the work of high school teaching. Chicago: University of Chicago Press.
Millar, R., Hames, V. (2003) Using Diagnostic Assessment to Enhance Teaching and Learning: A Study of the Impact of Research-informed Teaching Materials on Science Teachers’ Practices. Evidence-based Practice in Science Education (EPSE) Research Network.
Popp, J. and Goldman, S. (2016). Knowledge building in teacher professional learning communities: Focus of meeting matters. Teaching and Teacher Education, 59, pp.347-359.
Sergiovanni, Thomas (2003), ‘A Cognitive Approach to Leadership’, in Brent Davies and John West-Burnham (ed.), Handbook of Educational Leadership and Management, Pearson: London, Ch. 2
Spillane, J. P. (2005) Primary school leadership practice: how the subject matters, School Leadership & Management, 25: 4, 383-397
Wenger, Etienne (2000), ‘Communities of Practice and Social Learning Systems’, Organization, 7: 225-247
Wiliam, D. (2007) Content then process: teacher learning communities in the service of formative assessment. In D. B. Reeves (Ed.), Ahead of the curve: the power of assessment to transform teaching and learning (pp. 183-204). Bloomington, IN: Solution Tree.