De-stressing testing: low stakes, modelled and purposeful

Observing education from the outside, especially the narratives around SATs and GCSEs, can lead new and early-career teachers to want to reject or reduce the amount of testing students have to endure. This was my first reaction upon entering teaching: I wanted to scrap assessment altogether and teach students in a holistic way where they didn’t feel the pressure or stress of constant testing.

What I have since come to find is that the stress around testing is not intrinsically linked to the idea of assessment but rather to the culture we build around testing in our classrooms and the quality of the teaching that prepares students to master assessment. In Making Good Progress? Daisy Christodoulou writes that formative assessment “doesn’t just help measure understanding; it helps develop understanding”. That is a statement it took me a while to come round to as I developed my own pedagogy, but it’s one I can now see playing out in my classroom.

In my view we can reduce student stress and anxiety around assessment in three major ways: regular low-stakes, high-success testing; modelling to increase mastery and independence; and making assessments purposeful through feedback.

Regular low-stakes, high-success testing

[Image: one of my Year 9 mid-year throwaway tests]

Whether it’s on a whiteboard, a scrap of paper, in the back of books or on disposable test sheets, I think that introducing regular, low-stakes, multiple-choice testing has been one of the biggest revolutions in my classroom. In Making Every Lesson Count, Shaun Allison and Andy Tharby describe how we as teachers have a tendency to ruminate on the macro over the micro, getting students to think about entire exam papers rather than focusing on finessing small, detailed skills. A multiple-choice test can really zero in on those micro areas of knowledge and quickly dismiss misconceptions.

It is often more useful to first spend time on the small details that contribute cumulatively to an answer. If not, students can embed huge misunderstandings into their thinking that then become difficult to reverse.

So whilst I do think there is a need to move to the macro (in the form of understanding concepts in RE), I also think that a focus on the micro is not only an effective way of practising application of the macro but also vital to avoid embedding wrong information. To make these tests worthwhile they need to be low stakes and they need to have a high success rate.

Rosenshine, in Principles of Instruction, says we should aim for an 80% success rate, which is something to bear in mind when writing the tests. Making them low stakes is easy: just go around with a bin or the whiteboard erasers after the test; if students are proud of their work they can take a photo or keep their paper. You might worry that students won’t see progress if they can’t see their previous scores, but if you are aiming to keep the success rate at the same level you won’t be able to make direct comparisons anyway.

Modelling to increase mastery and independence

One of the six main principles in Making Every Lesson Count is modelling. In some schools modelling is a deeply embedded practice: I think of a school I trained in in Oxford where modelling was second nature to teachers and students across every department, and of another school I trained in where I was told off for modelling answers because it was ‘spoon-feeding’ the students. In my own practice I fall more into the former category, seeing modelling not as spoon-feeding but as essential to building mastery.

To my mind nothing is more liable to induce huge amounts of assessment-related stress than running through the structure of a GCSE paper with Year 9 and then expecting them to complete one with no modelling. Allison and Tharby provide a really helpful guide to the process of moving from modelling to independence. It’s a process I’ve followed with my mid- and low-ability Year 9s this year, and I can tell you: it works.

[Image: the process from modelling to independence, adapted from Making Every Lesson Count]

Modelling doesn’t just mean giving students a perfect 12/12 mark answer and saying ‘do that’. There is a whole range of methods to try. There’s Live Modelling, where you narrate your own thought process and explain your mistakes and successes as you write an answer in front of the class. There’s sharing the good work of other students, there’s providing work with deliberate errors for students to correct and, a little further down the line, there’s co-construction, where students work together with the teacher to create a model.

All of this naturally leads to a reduction of student stress and anxiety around assessment (both summative and formative) because of the undeniable fact that modelling aids independence and independence builds confidence and mastery.

Making assessments purposeful through feedback

This really goes without saying but I would be remiss if I didn’t say it: assessment has to be followed up on, and that follow-up has to be meaningful and purposeful. Stress will occur if students don’t understand why they’ve failed to master a skill, what errors they’ve made, or how to avoid them in the future. We, as a profession, need to move away from seeing assessment as a way of generating “progress data”; even the government is openly admitting that the use of data across teaching is deeply flawed. As Daisy Christodoulou says: “not only are absolute judgements based on rubrics unreliable, but the existence of the rubric has damaging consequences for teaching”.

Feedback is an area I am giving a lot of thought to at the moment. I want to improve my own practice here and am doing a lot of reading in order to do that, so I won’t try to espouse one method over another. I will say this: when students say they don’t understand where they’ve gone wrong, or feel they get the same feedback every time with no idea how to improve, it’s tempting to think they aren’t taking your feedback on board, but we have to be honest with ourselves and admit that it might be our feedback that is not having the desired effect. A lack of good feedback impedes progress, and a lack of progress creates disillusionment and stress around assessments.

Further Reading


Making Data Work: not as boring as it sounds

Making Data Work: perhaps not the title of a report you would run to the photocopier to pick up and read whilst on duty. That said, this report, published by the Workload Advisory Group and accepted in full by the Department for Education, is a searing indictment of current practice around data in schools and should have a huge impact on the way assessment and performance management are done in schools across the UK.

Sadly, the government’s lack of desire to put any of the report’s recommendations into law, and the fact that its findings have been aimed more at school leaders than at middle leaders and classroom teachers, have left it to the unions and individual teachers to disseminate the findings and make sure policy within schools is informed by them.

Before you set your next performance management targets, and before you review your assessment policy, you really need to make yourself familiar with where our profession stands on the use and misuse of data. I hope this overview of the three most important sections for classroom teachers can assist in getting the word out.


Performance Management

When I attended an NASUWT briefing on this report, the part which naturally took up most of the discussion was performance management and, especially, the use of data targets in judging teacher performance. It is refreshing to be able to quote part of this report, written in bold text, which I think every appraiser and HoD needs to be aware of:

pay progression should never be dependent on quantitative assessment metrics, such as test outcomes

This is such a crucial finding of the report that NASUWT have produced a poster which contains little else but this section of text. There is nothing here that was not already common knowledge amongst the profession, but seeing it in print, in a government document, is a huge step forward for tackling issues of workload and data misuse.

Current practice in using pupil attainment data in teacher performance management systems is often poor. Research demonstrates that using quantitative metrics to judge teacher performance is difficult since few of the practices that we can straightforwardly codify and measure are highly correlated with teacher quality.

Those ‘quantitative metrics’ do not just mean test scores but also graded lesson observations and scores following book scrutiny. You would be well within your rights to print out pages 17 and 18 of this report and take them with you to your next PM meeting. In fact, I would say that if you don’t, you are making pay (and career) progression less likely for yourself. The NASUWT has discouraged its members from setting data-driven targets for a while, but now there is a report, accepted by the DfE and Ofsted, that backs them up on this.

Predicted Grades

“There is not currently any evidence that setting students challenging attainment targets is motivational for them”

We should try, as far as possible, to make predicted grades just that: predictions of the grades our students will achieve. There is no evidence that ‘aspirational’ target grades act as a motivational force for either students or teachers, and they can become counterproductive, increasing stress for staff and students alike.

“Flight paths for pupils with similar starting points are not valid”

You may not be shocked to hear that those perfectly straight, colour-coded lines used to chart the progress a student makes from KS2 through to GCSE results are not accurate reflections of the teaching and learning process. Within my subject there can be big variances from one assessment to the next in terms of difficulty of material, availability of notes and writing frames for students, and time since the material was taught, not to mention the day-to-day variations in student behaviour, understanding and mood.

The report gives the example of Linton Village College, a secondary school in Cambridgeshire. They scrapped whole-school data drops and marking frequency requirements and set out to pursue a “meaningful and manageable” policy where subject teams have assessment deadlines that sit within curricula and subject leaders have greater freedom and autonomy over assessment and data within their departments.

Advice to Schools

This part of the report is the clearest in setting out what schools should do to mitigate the poor use of data. The recommendations should ensure that the use of data has a clear purpose relevant to the audience for which it is collected, that there is precision about how data is collected and what can be inferred from it, and that the amount of data and the frequency of its collection are proportionate.

Here are the key recommendations which you need to consider whether you are an NQT or a headteacher.

School and trust leaders should:

  • minimise or eliminate the number of pieces of information teachers are expected to compile
  • understand the quality and purpose of assessments being used in their school

School and trust leaders should not:

  • have more than two or three attainment data collection points a year, which should be used to inform clear actions
  • make pay progression for teachers dependent on quantitative assessment metrics, such as test outcomes

Unfortunately, though the report was commissioned by the DfE and both Ofsted and the DfE have ‘accepted its findings’, there is no move towards implementing any of its recommendations through statutory instruments (as far as I can see). Instead it is up to the unions and individual teachers within schools to disseminate these nuggets and use the report as a tool to reduce workload, stress and attrition from the profession. I suspect many will be put off from even skimming it due to its dreary name and a perceived disinterest in ‘data’, but I have found it a very useful tool to have in my arsenal when discussing something which is now one of the top three concerns of teachers working in this profession.

Making Data Work – the report in full – https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/754349/Workload_Advisory_Group-report.pdf

Joint Letter to School Leaders on the Report – https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/753668/Reducing_Teacher_Workload_-_Letter_to_School_Leaders.pdf

NASUWT Briefing on the report – https://www.nasuwt.org.uk/uploads/assets/uploaded/719c0ffa-4ee1-4cf0-9afeebcce8cd6f35.pdf