For the academic year 2017/18, the SRA has disclosed, the proportion of students passing the Legal Practice Course ranged from 29% to 100% across 25 postgraduate institutions.
According to the regulator, it is ‘unclear what the reasons are for such a wide disparity in performance’. How exasperating. Aspiring solicitors who often amass huge debts to enter the profession were (and are) surely entitled to information that would enable them to make an informed choice of training provider. One might have thought the SRA would have troubled itself to find out.
However, it is not quite as simple as that. The SRA notes that its job is not to regulate law school courses. This falls to the Office for Students, which has the same data but intervenes only if it has ‘concerns’. Whether the OfS has ever intervened is unclear.
Instead, all LPCs must have external examiners in place, and courses submit annual reports to the SRA. ‘Where issues are flagged up, we can and do follow them up,’ the regulator insists.
It wasn’t always thus. Law schools were once visited and rated on a scale of ‘excellent’ to ‘unsatisfactory’, but this system was discontinued about the time the SRA came into being over a decade ago. Progress, eh?
Still, we are where we are. Couldn’t the SRA put the names of the institutions to the percentages? Apparently not. This ‘could create pressures on providers which might impact standards’.
Well, er, yes. That would be the point. A league table might have done wonders for standards.
As for the institutions that scored 29% and 100%, two questions spring to mind. Should the former be accredited at all? Is the latter really so good – or could its marking do with a little scrutiny? No one is telling.
Still, at least the new super-exam should help. Since the SQE is its own creation, the SRA can – and says it will – disclose the pass rates of each institution.
This will be a most welcome and overdue boost to transparency. A centralised assessment has its critics, but this is one of the pluses.