The latest calamity to befall everyone’s favourite regulator of barristers?
Bar Professional Training Course (BPTC) students are blaming the BSB for screwing up the centrally set criminal and civil litigation and ethics exams they sat earlier this month. The wannabe barristers describe the exams as “disastrous” and “unfair” – and are worried their future careers could be scuppered as a result.
Here are the students’ complaints…
1. The exams tested knowledge that “cannot reasonably be expected from a student” – because it wasn’t in the syllabus.
2. The questions were so poorly worded it made it “difficult to navigate the papers”.
3. The exams “went well beyond the ‘what should a practitioner know by heart’ standard, into abstruse and irrelevant points”.
The response of the BSB, which is responsible for appointing the members of the central examination board that set the exams, was essentially that they’d look into it and, if necessary, adjust the marks.
OK, so a moment of sympathy for the BSB. It must be a pain dealing with smart arse wannabe barristers.
But in view of the BSB’s recent bad run, which goes beyond the much maligned advocacy research, you’d be inclined to give the students the benefit of the doubt.
The correspondence between the students and the BSB is published in full below.
16th April 2012
Bar Standards Board 289-293 High Holborn London WC1V 7HZ
Re: Complaint about the centrally set exams (“the Exams”)
We are writing this letter of complaint as a collective response to the Exams held on the 11th and 12th of April 2012. We understand that this is the first year in which the Central Examination Board (“CEB”), under the auspices of the Bar Standards Board (“BSB”), has set the Exams. These are now regarded by a large number of BPTC students from different providers as being unfair. This applies both to the setting and the content of the papers.
You will be aware that all students were notified at the start of the Civil and Criminal Litigation Exams of a change to the marking rubric for the Short Answer Questions (“SAQ”). Rather than indicating how many sub-points were required to gain full marks, as in all the mock papers provided by the CEB/BSB prior to the paper, candidates were expected to use their “common sense” to answer questions fully. This is clearly unfair. Firstly, it cannot be good practice to announce a fundamental change to the rubric in the exam hall just before the start. It was distressing to many students who had prepared to answer a paper formatted according to the guidance and examples provided.
Further, the change will have a direct impact on the marks awarded to candidates and has the potential to greatly affect an overall grade. Common sense would usually dictate that to gain the full 3 marks for a question, that question could reasonably be fully answered in 3 short answers. If that is not the case, and six short answers are required with a half-mark value each, that should be clearly indicated. To penalise candidates for not appreciating that more points were needed to answer a question than the marks allocated to it, when no candidate would have had reason to practise this particular skill in advance of the Exams, is wrong.
More importantly, there were real problems with the content of the Exams. The BSB’s stated purpose is to “ensure that students intending to become barristers acquire the skills, knowledge of procedure and evidence, attitudes and competence to prepare” for the rigours of Pupillage.
The content of the Civil Litigation paper, in particular, went far beyond a thorough and precise knowledge of the syllabus provided to us. The mock papers reflected the syllabus and indicated the standard of specialised knowledge expected. The real paper bore no resemblance to them in this respect. If candidates had been allowed to take the paper away after the Exams, we would be able to conduct a more forensic analysis. As the matter stands, the most memorable irregularities are:
MCQ 31, on ‘full and frank disclosure on interim injunctions’, referred to matters ‘in the application’s actual knowledge’. This issue does not appear in the syllabus. Applications do not have knowledge, although applicants do, and given that MCQs often turn on fine verbal distinctions it is not good enough to say this was ‘an obvious typo’. Even a candidate with the White Book open during an exam would have had to weigh the wording of CPR Part 25 against the leading authority of Siporex Trade SA v Comdel Commodities Ltd 2 Lloyd’s Rep 428.
MCQ 40, on serving a ‘summary of evidence’ that a witness could give, is another example: this topic does not appear on the BSB syllabus we were given.
SAQ 4 was particularly abstruse: this question on default judgment went far beyond testing knowledge of the full procedural elements of variation/setting aside. Default judgment is on the syllabus, but it seems unjust to go to that level of detail.
The Criminal paper had fewer problems. However, we feel it is inappropriate to penalise students for lacking a specific piece of knowledge by testing for that knowledge over a number of questions. For example, a minor aspect of the relationship between s.76 and s.78 of PACE 1984 determined the correct answer for a number of questions over the MCQ/SAQs. Similarly, there was more than one question on the permissible majority verdicts on juries. The Criminal paper, therefore, accorded disproportionate weight to these matters.
The Ethics exam, although it took place earlier, on the 26th March, again focused on an abstract academic and theoretical understanding of ethical problems, rather than testing a student’s knowledge and application of the Code of Conduct and the documents listed on the syllabus. It bore no relationship or similarity to the mock exams provided by the CEB, and simply was not a paper that we had prepared for. As such, many students considered it unreasonable in content.
As a result, these Exams are perceived to have been unfair because they:
(a) tested knowledge that cannot reasonably be expected from a student or Pupil barrister;
(b) in some cases tested knowledge that was not on the syllabus at all;
(c) contained poorly worded questions that made the papers even more difficult to navigate, which in turn affected the time available for the other questions. In most cases students were left with too little time to answer the SAQs properly.
These Exams went well beyond the ‘what should a practitioner know by heart’ standard, into abstruse and irrelevant points.
The consequences of these disastrous Exams are far-reaching. For all of us, it marks the culmination of a year of hard work and preparation. A fail grade seriously jeopardises the careers of those of us expecting to be Called this summer and begin Pupillage in October. It will also adversely affect the chances of those seeking to obtain Pupillage. These are professional examinations, and they should have been set to a professional standard. They were not, and it seems that only the students will suffer as a result.
The pass marks for both Exams must be scaled down, and we therefore propose a more lenient approach to the marking of the scripts. The strict condition of passing each individual part of the Exams (SAQs and MCQs) at the high threshold of 60% should be replaced with a combined pass requirement, as with the Alternative Dispute Resolution paper.
We therefore ask the CEB and the BSB to act on this complaint immediately, to prevent conscientious students from suffering the consequences of what we strongly believe to be an unjust examination process.
The undersigned (please find the signature sheet attached to this letter)
RESPONSE FROM THE BSB
“The BSB has formal procedures for obtaining feedback from Provider institutions regarding content of the centrally set assessments, including reference to issues such as syllabus coverage, level of detail required, and whether there are any concerns relating to whether or not the questions are fit for purpose, or whether the solutions are appropriate. The external examiners attached to each Provider will also be reporting on the assessments at the institutions for which they are responsible.
Evidence gathered through this feedback process can be used by the central examiners in determining whether, in the event that there is a statistically significant deviation in the level of candidate performance that might have been expected, there is a case for cohort marks being adjusted (usually through scaling of marks). The Central Examination Board will also have access to statistical data on student performance, and expert statistical analysis to assist in the interpretation of this data.
Students are, of course, free to make whatever comments they wish about the assessments to Provider institutions. It is for the Provider institutions to reflect on whether any such comments raise substantive points of principle that they feel need to be included in the formal feedback.”