It’s been confirmed that pupils will get their GCSE results tomorrow. In each subject they’ll receive the higher of the grade their teacher set for them (their centre assessment grade) and the grade that Ofqual’s moderation process would have awarded them.
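That "best of both" rule is simple to state: a pupil's final grade in each subject is the higher of the two numbers. A minimal sketch with hypothetical pupils, using the numeric 9–1 GCSE scale:

```python
# Final 2020 GCSE grade: the higher of the centre assessment grade (CAG)
# and the grade Ofqual's moderation process would have awarded.
# Hypothetical pupils, graded on the numeric 9-1 scale.
pupils = [
    {"name": "A", "cag": 6, "moderated": 5},   # CAG wins
    {"name": "B", "cag": 4, "moderated": 4},   # no difference
    {"name": "C", "cag": 7, "moderated": 8},   # moderation can, occasionally, be higher
]

for p in pupils:
    p["final"] = max(p["cag"], p["moderated"])

print([p["final"] for p in pupils])  # [6, 4, 8]
```

Note that under this rule no pupil can end up below their centre assessment grade, which is why moderation can only boost results.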

We can get an idea of some of the things that are likely to crop up by returning to data gathered by FFT in June on the grades that schools were proposing for their pupils.[1]

It’s worth saying here that we don’t know whether the final grades that schools submitted to the exam boards were the same as those they submitted to us, but figures published by Ofqual suggest the two are broadly comparable.

We also know that moderated grades will boost some pupils’ results – but the impact of that is much harder to call.

So what are we likely to see in this year’s results?

Variation by subject

In the centre assessment grades that we collected there was quite considerable variation by subject.

In maths, 78% of pupils were given a grade 4 or above as a centre assessment grade, compared to 73% achieving that in last year’s exams. In English language, the figures were 81% and 71%, respectively.

And the increases were greater in other subjects: in computing, for example, 79% of pupils received a grade 4 or above as a centre assessment grade, versus 63% in last year’s exams.

Use the chart below to explore the difference that we saw between 2019 and 2020 grades.

When we looked into it, we concluded that subjects that had historically been graded the most harshly were those that were seeing the biggest increases under centre assessment.

KS4 Early Results Service

This year’s FFT Early Results Service will be more important than ever as the DfE and Ofsted will not be providing schools with any analysis of performance at KS4.


Disadvantaged pupils

The likely impact of this year’s arrangements on particular groups of pupils has rightly received attention, including from the Education Select Committee.

We’ve been thinking about the likely effect on disadvantaged pupils. It could well be that attainment gaps actually narrow as a result of using centre assessment grades, though we stress that this is a hypothesis, not a fact.

Let us take you through our thinking process.

We’ll start with the fact that lower-attaining schools seem to have submitted the most generous grades, on average.

Using the data we collected, the chart below shows the change in schools’ average point score between last year and this year, plotted against schools’ 2019 average point score.
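The two quantities being plotted can be computed directly from per-school grade data. A minimal sketch, assuming hypothetical school names and taking a school's average point score (APS) to be simply the mean of its pupils' numeric 9–1 grades:

```python
# Hypothetical per-school grade lists; APS here is just the mean of
# numeric 9-1 grades (the real FFT point-score calculation may differ).
grades_2019 = {"S1": [4, 5, 3], "S2": [7, 8, 6]}
grades_2020 = {"S1": [5, 6, 5], "S2": [7, 8, 7]}

def aps(grades):
    """Average point score: mean of the numeric grades."""
    return sum(grades) / len(grades)

# Each point: (2019 APS, change in APS) - the x and y of the chart.
points = [(aps(grades_2019[s]), aps(grades_2020[s]) - aps(grades_2019[s]))
          for s in grades_2019]
```

In this toy data the lower-attaining school, S1, shows the larger year-on-year rise, which is the pattern described below.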

Those with lower average point scores in 2019 – towards the left side of the chart – typically proposed a more generous mix of grades (relative to their performance in 2019) than schools with higher attainment last year.

As we noted here, some of that will be regression to the mean: in a typical year, some schools with poor results bounce back the following year. But we think some of it also comes down to these schools being more optimistic in their grading than others.

Given that we know that schools serving deprived areas tend to have lower attainment, we wondered whether this change in schools’ results relative to one another would mean that the disadvantage gap would close this year.

The chart below compares schools’ free school meals eligibility rates to the change in their results between 2019 and 2020 that the data we collected would imply.

What we see is a weak relationship between the two things: schools serving more disadvantaged intakes submitted only fractionally more optimistic grades on average.

But the disadvantage gap doesn’t just come down to the gap between schools – it also reflects the gap within schools. And Ofqual itself says that it identified “leniency” in centre assessment grades around the grade 4 boundary (see Annex M of its technical manual). In a typical year, a disproportionate number of disadvantaged pupils’ results sit just below grade 4, so this leniency may have worked in disadvantaged pupils’ favour.

Ofqual is likely to put some figures out on this in the coming days, but it will only be later in the year that the pupil-level data required to do a thorough analysis will become available.

Independent schools

One of the biggest areas of controversy in last week’s A-Level results was the improvement that independent schools saw relative to other schools. At grades A*-A, there was an increase of close to five percentage points in independent schools’ results – more than double the increase seen in much of the state sector.

The data we collected in June covered state schools only, and as such it’s hard to say definitively how independent schools’ GCSE centre assessment grades will have compared to those of state schools.

But we think that a certain amount of the discrepancy at A-Level came down to independent schools’ smaller cohort sizes, meaning their centre assessment grades were used in preference to their moderated grades under Ofqual’s approach.

That distinction in how grades are awarded no longer applies at GCSE (nor at A-Level, once results are reissued), so it’s possible that we’ll see less of a discrepancy between the improvements in results recorded by independent schools and by state schools.

Our analysis

Normally, when results are handed out to GCSE, A-Level and AS-Level students we’re quick to look at the national data that’s published alongside, to give our take on things and to add the latest figures to our results microsite.

National, subject-by-subject figures won’t be published tomorrow though, given the speed at which the process has been altered. But you can be assured that we’ll blog if there’s anything else that catches our eye tomorrow, and will be back with full analysis of this year’s results when the national figures are made available.

Want to stay up-to-date with the latest research from FFT Education Datalab? Sign up to Datalab’s mailing list to get notifications about new blogposts, or to receive the team’s half-termly newsletter.

Notes

1. Our analysis in this post is based on data from around 1,900 schools that submitted their results to FFT ahead of the deadline to submit them to the exam boards. Comparisons between 2019 and 2020 grades are based on the same schools: we matched a school’s results in a given subject in 2019 to its results in 2020. In much of what we discuss, we’ve focussed on schools with more than 25 entries in a given subject, and on subjects with at least 100 such schools.
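Those sample restrictions amount to two simple filters. A minimal sketch with synthetic school–subject counts (the real FFT dataset, and its field names, will of course differ):

```python
from collections import Counter

# Synthetic records: (school_id, subject, entries_in_2020).
records = (
    [(f"S{i}", "maths", 40) for i in range(120)]       # 120 large maths cohorts
    + [(f"T{i}", "computing", 40) for i in range(50)]  # only 50 large computing cohorts
    + [("X1", "maths", 10)]                            # too few entries - dropped
)

# First filter: keep school-subject cells with more than 25 entries.
big_cells = [r for r in records if r[2] > 25]

# Second filter: keep subjects with at least 100 such schools.
schools_per_subject = Counter(subject for _, subject, _ in big_cells)
kept = [r for r in big_cells if schools_per_subject[r[1]] >= 100]

print(sorted(set(subject for _, subject, _ in kept)))  # ['maths']
```

Here computing is excluded because fewer than 100 schools clear the 25-entry threshold, while the small maths cohort is excluded at the first step.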