No More Marking: a more reliable approach to writing assessment

How Assembly works with No More Marking to gather reliable writing assessment data

There has been much concern about the reliability and validity of teacher assessments of written English. While the DfE publishes the teacher assessment frameworks for 2017/18, the education community continues to debate their validity and their use as a progress measure. You can follow the discussion on Twitter and elsewhere.

In order to include writing teacher assessments in progress measures at the end of KS2, the teacher assessment (TA) bands have to be converted to a nominal score. The following nominal scaled scores have been set for the writing TA bands at KS2.

You can see that these scores don't correspond with the WTS, EXS and GDS equivalents for Reading and Maths, and there is no scale underpinning them, so all subtlety within the bands is lost. A pupil could be working at the very top end of EXS, yet be given the same nominal score as a pupil who only just reached that standard. This inevitably puts pressure on teachers to assign grades at a higher level, given the high stakes involved. In addition, concerns have been expressed about the reliability of moderation: Ofsted have said that they will treat the 2017 writing TAs 'with caution' as they do not consider them to be sufficiently reliable.

What is No More Marking and comparative judgement?

No More Marking is an Assembly partner whose powerful “comparative judgement” technology provides a more reliable approach to writing assessment. We have written a 'Partner in Focus' blog about how No More Marking and Assembly work together.

Comparative judgement works on the principle that we are better at making judgements between two things than we are at making holistic judgements of a single item out of context. We can tell whether one thing is hotter or colder than another without being able to identify the exact temperature of either.

No More Marking have an excellent demonstration on their website that explains comparative judgement, and they have developed a way of achieving reliable and valid scaled scores from English writing tests. This means pupils get a numerical score rather than a best-fit band, which eliminates the ambiguity around pupils who fall just inside or just outside a band.
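For readers who like to see the mechanics, here is a minimal sketch of how a set of pairwise judgements can be turned into a quality estimate for each script, using a simple Bradley-Terry style iteration. The scripts, judgements and fitting routine are purely illustrative assumptions for the example; they are not No More Marking's actual data or algorithm.

```python
from collections import defaultdict

# Pairwise judgements: each tuple records which of two scripts a judge
# preferred. The scripts and outcomes here are purely illustrative.
judgements = [
    ("script_a", "script_b"),
    ("script_a", "script_c"),
    ("script_b", "script_c"),
    ("script_c", "script_d"),
    ("script_d", "script_a"),
    ("script_b", "script_d"),
]

scripts = sorted({s for pair in judgements for s in pair})

# Count each script's wins and how often each pair was compared.
wins = defaultdict(int)
pair_counts = defaultdict(int)
for winner, loser in judgements:
    wins[winner] += 1
    pair_counts[frozenset((winner, loser))] += 1

# Iterative Bradley-Terry fit: strength_i = W_i / sum_j( n_ij / (s_i + s_j) )
strength = {s: 1.0 for s in scripts}
for _ in range(200):
    updated = {}
    for i in scripts:
        denom = sum(
            pair_counts[frozenset((i, j))] / (strength[i] + strength[j])
            for j in scripts
            if j != i and pair_counts[frozenset((i, j))] > 0
        )
        updated[i] = wins[i] / denom
    # Normalise so the strengths keep a fixed overall size between iterations.
    total = sum(updated.values())
    strength = {s: v * len(scripts) / total for s, v in updated.items()}

# Higher strength = judged to be better writing, relative to the other scripts.
for s in sorted(scripts, key=strength.get, reverse=True):
    print(s, round(strength[s], 3))
```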

The scale is age-independent, so scores should increase as the pupil gets older and their writing improves. The scale runs from 200 to 800 and is maintained across all year groups. The WTS, EXS and GDS thresholds are superimposed on this scale to maintain familiarity and consistency across reading, writing and maths analysis. This article explains their powerful reports and the theory behind them.
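As a rough illustration of how model estimates might be reported on such a scale, the sketch below linearly maps raw ability estimates onto a 200 to 800 range and overlays band labels. The anchor points and threshold values are placeholder assumptions, not No More Marking's published scaling constants.

```python
# Illustrative ability estimates from a comparative judgement model
# (for example, log strengths from a Bradley-Terry fit like the one above).
raw_measures = {"pupil_1": -1.2, "pupil_2": 0.1, "pupil_3": 0.9, "pupil_4": 2.3}

# Assumed linear mapping onto a 200-800 reporting scale. These anchor points
# are placeholders, not No More Marking's actual constants.
RAW_LO, RAW_HI = -3.0, 3.0
SCALE_LO, SCALE_HI = 200, 800

def to_scaled_score(raw: float) -> int:
    clipped = max(RAW_LO, min(RAW_HI, raw))
    fraction = (clipped - RAW_LO) / (RAW_HI - RAW_LO)
    return round(SCALE_LO + fraction * (SCALE_HI - SCALE_LO))

# Hypothetical thresholds superimposed on the scale, mirroring the
# WTS / EXS / GDS bands used in the rest of the analysis.
THRESHOLDS = [(580, "GDS"), (480, "EXS"), (380, "WTS")]

def band(score: int) -> str:
    for cutoff, label in THRESHOLDS:
        if score >= cutoff:
            return label
    return "Below WTS"

for pupil, raw in raw_measures.items():
    score = to_scaled_score(raw)
    print(pupil, score, band(score))
```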

What does this look like in Assembly Analytics?

Assembly’s partnership with No More Marking means our automated integration can also extract writing assessment data from their system for all schools in your MAT and display the aggregated results on an Assembly Analytics dashboard, alongside your other school performance data. We apply the same Expected+ and High Score boundaries as No More Marking, giving you a clear view of the relative performance of all year groups from Year 1 to Year 6. Clicking on a summary measure gives you the breakdown of results across your schools and year groups.
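To illustrate the kind of aggregation involved, here is a small sketch using hypothetical pupil-level data. The column names, school names and boundary values are assumptions made for the example, not Assembly's actual schema or the real boundaries.

```python
import pandas as pd

# Hypothetical extract of writing scaled scores across a MAT. The column
# names and boundary values are illustrative, not Assembly's actual schema.
writing = pd.DataFrame({
    "school":     ["Oak", "Oak", "Oak", "Elm", "Elm", "Elm"],
    "year_group": [3, 3, 6, 3, 6, 6],
    "score":      [455, 512, 601, 470, 590, 640],
})

EXPECTED, HIGH = 480, 580  # placeholder Expected+ / High Score boundaries

writing["expected_plus"] = writing["score"] >= EXPECTED
writing["high_score"] = writing["score"] >= HIGH

# Percentage of pupils reaching each boundary, by school and year group.
breakdown = (
    writing.groupby(["school", "year_group"])[["expected_plus", "high_score"]]
    .mean()
    .mul(100)
    .round(1)
)
print(breakdown)
```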

We also combine the Writing scores with your Reading and Mathematics results to create a set of “RWM” summary cards showing Expected+ and High Score across your MAT.
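As a sketch of the combined measure, the example below assumes a pupil counts towards RWM Expected+ only when they reach the boundary in all three subjects; the data and column names are again hypothetical.

```python
import pandas as pd

# Hypothetical pupil-level flags for the three subjects. The column names
# and the "meet the standard in all three" rule are illustrative assumptions.
rwm = pd.DataFrame({
    "pupil": ["p1", "p2", "p3", "p4"],
    "reading_expected": [True, True, False, True],
    "writing_expected": [True, False, True, True],
    "maths_expected":   [True, True, True, True],
})

# A pupil counts towards the combined RWM measure only if they reach the
# boundary in reading, writing and maths.
rwm["rwm_expected"] = (
    rwm["reading_expected"] & rwm["writing_expected"] & rwm["maths_expected"]
)

print(f"RWM Expected+: {rwm['rwm_expected'].mean():.0%}")
```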

By using comparative judgements from No More Marking and standardised assessments from RS Assessment from Hodder Education, you can establish a system of reliable and valid attainment and progress measures for all year groups, across all your schools, up to three times per year. This allows you to explore areas of strength and weakness and to target your school improvement strategies effectively. And because we automate the data aggregation process, partner data appears in your Assembly Analytics dashboard with no manual work for you, reducing teacher workload!