Guest blog by Laura Bailey, Head of Moderation and Assessment at Pobble.

Accurate and consistent writing assessment isn’t about ticking boxes. It’s about understanding pupils as writers: their choices, voice, and development over time. Yet across many schools, confidence in writing judgements varies. Too often, inconsistency arises because judgements aren’t built on a shared, robust evidence base, allowing unconscious bias to creep in.
The answer? Moderation. Not a token end-of-year exercise that only a quarter of schools experience externally in Year 6, but a purposeful, embedded part of a school’s assessment culture.
Moderation as professional development
Rather than seeing moderation as an accountability exercise, we should recognise it as one of the most valuable forms of professional development available. When done well, it builds shared understanding, challenges assumptions, and deepens collective expertise in writing assessment.
So, assessment leads, start by asking: Do all my staff truly understand how to interpret the writing assessment criteria we use?
If the answer is no, that’s your starting point. Begin by using neutral evidence collections to help staff interpret the criteria accurately. Then, create opportunities for whole-school discussion and agree on what independent skill application really looks like in practice. This shared understanding must reach every year group, not just those with statutory frameworks.
Structure, facilitation, and culture
Effective moderation doesn’t just happen. It relies on structure, skilled facilitation, and a culture that values deep professional discussion. Yet too often, teachers describe rushed or surface-level sessions where writing isn’t even read in full. But let’s be clear: it is simply not possible to make accurate judgements about writing skill application without first experiencing the writing, in full, as a reader. Only by hearing or reading a piece as it was intended can we genuinely evaluate its effectiveness, coherence, and impact. This is also why pupil conferencing with the writer is so critical; it allows us to explore their authorial choices, intentions, and reflections, deepening our understanding of both the process and the product.
Trained facilitators keep discussions constructive and evidence-based. When moderation leads to teachers adjusting judgements, that's not a problem; it's professional growth.
And central to this process is removing unconscious bias. This starts with being evidence-first in moderation: looking closely at what the writing shows before discussing wider context. When we mark a test, we don't begin by listing a pupil's personal challenges; we start by reviewing their learning. Writing assessment should follow the same principle: discuss the writing first, then the child holistically.
Another way to reduce bias is to stay focused on skill application and demonstration of standards. Too often, a "target judgement" (the level we expect or hope a pupil will reach) clouds objectivity. Fiercely pursued targets bias writing assessments time and time again. By anchoring moderation in real examples of skill application, we guard against assumptions and ensure our discussions remain grounded in evidence, not expectation.
Ultimately, without shared understanding, are we truly assessing writing skills or simply interpreting them through our own individual lenses?
Stop the autumn term frustration
Every autumn, the same frustration echoes around staffrooms: "Well, I don't know how the child was awarded that judgement last term." It's disheartening and entirely avoidable. End-of-year transition is incomplete without cross-year-group moderation.
If Year 4 teachers aren't engaging in evidence-based discussions to celebrate and review end-of-Year-3 assessments, and if Year 2 teachers aren't reflecting with Year 1 on how provision was adapted to meet learners' needs (sharing writing samples as part of that dialogue), then professional continuity is lost. Transition should never just be about handing over data; it should be about reading writing together, discussing independence, and aligning expectations.
Isn’t it time we stopped questioning the data and started strengthening the professional conversations that create it?
From moderation to curriculum impact
Once moderation ends, take time to reflect on pupils outside the sample. Do your discussions prompt you to revisit other judgements? This reflection is a critical act that further strengthens accuracy across the cohort and builds professional trust in your data.
Use what you’ve learned to refine, not always change, your curriculum. If pupils demonstrate independence, stamina, and connection with writing, identify what enables that. Share and scale those strengths. Moderation should identify patterns and CPD needs.
Writing assessment does, and will, improve through practice, discussion, and reflection.
Ultimately, no data analysis, however detailed, can drive improvement unless every member of staff trusts the accuracy of the judgements it's built on. Without shared accuracy and professional confidence, the story it tells about our pupils simply isn't complete.
If this post resonates with you, you might like to join Laura for a webinar she’s leading in January – register here.
