Research Article

Quiz design as assessment in non-STEM undergraduate law courses

Received 23 May 2023, Accepted 05 Apr 2024, Published online: 08 May 2024

ABSTRACT

In 2020, the author introduced an assessment at the University of Auckland, Faculty of Law in which students design a multiple-choice question (MCQ) quiz on a course reading. The assessment was inspired by the PeerWise tool (primarily used and studied in STEM subjects) and informed by contributing student pedagogy. The assessment was offered in three courses, each time requiring less workload (questions to write) relative to course points, and students were surveyed in each instance. This article surveys the literature on quiz design as assessment, describes how the assessment works, presents and discusses the students’ responses to it, and recommends how other law teachers could use the assessment. The survey findings include that: the assessment increased student interest in the content of their assigned reading; the assessment increased student engagement with the course materials generally; the assessment was more enjoyable than working on a typical research essay or problem question; the time required was similar to or less than that required for a typical law assessment; and the assessment was valuable for their learning. The study endorses a five-MCQ quiz for 10% on a reading, comprising multiple sources or a varied source, in a course requiring 150 hours of learning.

Introduction

In 2020, the author introduced a new assessment at the University of Auckland, Faculty of Law in which students design a quiz on a course reading. For each quiz, students are asked to: write multiple-choice questions (MCQs) with one correct answer and three distractors; explain why option (a) is correct and why the other options are incorrect; note the reading pinpoints that a student would need to read to solve the MCQ and quote the relevant sentences; and briefly explain why the MCQ is a good question. The assessment was offered in three courses in different iterations and students were surveyed in each instance.Footnote1 The assessment was intended to benefit students by facilitating deeper engagement with a course reading, increasing variety in the forms of assessment in the law degree and encouraging students to learn from each other. The assessment also had a collateral purpose in generating quizzes that can be linked to the relevant reading in future iterations of the course and provided to students for revision purposes.

The article is in four substantive parts. First, it reviews the relevant general literature on quiz design as assessment and considers its relevance in the law school context.Footnote2 Second, it explains how the quiz design assessment works. Third, it sets out the study methodology, including the changes that differentiate the three iterations. Fourth, it presents and discusses the results.

Literature review

Some assessments are more effective than others at actively engaging students in the learning process.Footnote3 A common form of assessment, the MCQ quiz, has often been criticised for encouraging surface learning only rather than deep learning.Footnote4 This has prompted some educators to adopt assessments requiring students to author MCQs – an activity which presents an opportunity for an enriched learning experience and a greater cognitive challenge.Footnote5

While the MCQ quiz is a useful summative assessment tool for measuring learning attainment, the ideal form of assessment also promotes learning – a concept referred to as “assessment for learning”.Footnote6 MCQ authoring assesses for learning because it requires students to undertake deep learning – for example, when evaluating the key aspects of the reading that should be tested, articulating clear, challenging and fair MCQs, considering each MCQ’s relevance to the learning outcomes,Footnote7 crafting correct answers and plausible distractors, and writing explanations.Footnote8 In doing so, students process, organise, integrate and reconstruct knowledge – skills which encourage higher-order thinking.Footnote9 As such, assessment for learning transcends rote learning and parroting: students must apply what they have learned in novel circumstances, and think critically and creatively in doing so. MCQ authoring can also improve students’ ability to identify when they have a flawed understanding of course content, more effectively than if they were to answer MCQs only.Footnote10 The MCQ authoring process tests a student’s ability to clearly communicate their knowledge to others.Footnote11 Often, difficulty authoring a MCQ will be due to gaps in comprehension, and students will need to revisit the content to enhance their understanding.

MCQ authoring favours active rather than passive learning. The benefits of active learning can be explained through constructivist learning theories, which suggest that students learn best when actively constructing their own meaning rather than acquiring it passively from instructors.Footnote12 By involving students in assessment design, teachers are encouraging co-creation – a constructivist learning process that enables learning through the active construction of knowledge while facilitating deep and active collaboration between students and teachers.Footnote13 More common forms of active assessment include self-assessment and peer-assessment, which have been more widely researched than quiz design.Footnote14

Recently, a new approach called contributing student pedagogy has emerged, blending social constructivist pedagogy and community-based learning.Footnote15 It “encourages students to contribute to the learning of others”Footnote16 and establishes an awareness among students that their contributions are used to facilitate the learning of others.Footnote17 Students are assessed on the suitability of their contributions for that purpose. Students are also encouraged to “value the contributions of others”.Footnote18 Designing content for others motivates students to dedicate the required time and effort to produce good quality work while enhancing their own learning.Footnote19 In courses that struggle with student engagement, approaches that require students to be more conscious of the quality of their work may motivate students to understand the content more deeply. Contributing student pedagogy may help to improve law student engagement with course readings at a time when many law teachers have difficulty ensuring students do their readings. Further benefits include generating learning resources from student MCQs that later become materials for revision.Footnote20 Delegating the generation of learning materials to students can save staff a substantial amount of time and cost, while also indicating what topics students are confident and engaged with (by the topics they have chosen to write their quizzes on).Footnote21 Ultimately, involving students in assessment development “put[s] the educational process in focus”, empowering students with greater control over their learning.Footnote22

Several studies have investigated MCQ authoring as assessment in a number of disciplines. The methodologies used vary. Students are often asked to respond to surveys. Sometimes their responses are benchmarked against the quality of the MCQs authored or student academic performance. Most of the studies have been conducted in courses in STEM subject areas.Footnote23 It is unclear whether non-STEM subjects, such as law, would produce similar results. Thus, MCQ authorship in non-STEM subjects needs to be studied further.

MCQ authorship has been studied most extensively through PeerWise, particularly in STEM subjects.Footnote24 PeerWise is an online tool that allows students to author MCQs, answer MCQs authored by their peers, explain their answers, and provide ratings for and feedback about others’ MCQs.Footnote25 The PeerWise repository is empty at the beginning of the course.Footnote26 Once MCQs are submitted, students can answer MCQs as a form of self-assessment.Footnote27 Upon answering a MCQ, a student can immediately view the author’s explanation, as well as the results for that MCQ so far, to gauge how they performed relative to their peers.Footnote28 Students can write comments on a MCQ, which might be used to discuss, clarify or correct aspects of the MCQ.Footnote29 Students may also use the rating feature to evaluate the quality of the MCQ and helpfulness of the explanation.Footnote30

PeerWise promotes active engagement with course concepts. For example, it prompts students to: revise course concepts;Footnote31 contemplate misconceptions about course concepts;Footnote32 consider course concepts in relation to the course learning outcomes;Footnote33 and evaluate the quality of MCQs, requiring sound understanding of course concepts.Footnote34 It facilitates collaborative and interactive learning between students, including by generating useful MCQs and explanations for revision.Footnote35 It allows students to track their performance: the tool automatically generates and provides each student with a dynamic PeerWise score, which is composed of individual scores for MCQ authoring, answering and rating.Footnote36 Further, it allows instructors to moderate the content of MCQs.Footnote37

One study of second-year biology students (54 respondents) found that 78% agreed that authoring MCQs improved their familiarity with the lecture materials, and 77% agreed that the exercise made them think and reflect on course concepts.Footnote38 In a similar study of second-year biochemistry students (107 respondents) who used PeerWise, 70% reported that PeerWise helped their learning, agreeing that developing an original MCQ on a topic reinforced what they knew about the topic and improved their understanding of the material.Footnote39 In contrast, some studies have reported a negative attitude towards MCQ authoring, with one study of fourth-year medical students (106 respondents) finding that only 24% agreed that it improved their learning experience, and 31% agreed that it was beneficial to their learning – though the authors acknowledged that these findings were contrary to most of the literature on MCQ authoring.Footnote40

In a recent study of second-year medical students (18 respondents) asked to author MCQs, students generally found that writing the answer options was challenging yet rewarding, requiring students to make subtle distinctions between the correct answer and the distractors.Footnote41 However, students generally expressed concern about the content knowledge required to write MCQs and the time required for the assessment, which the authors suggested might limit the assessment’s value as a learning exercise.Footnote42 Also, students generally agreed that receiving training on authoring MCQs benefitted them and changed their approach to answering MCQs.Footnote43

A study of fourth-year medical students (174 respondents) invited to author MCQs on PeerWise as an optional activity found that only 32% authored a MCQ.Footnote44 Students’ open-ended responses indicate that 18.4% lacked confidence in their ability to author good quality MCQs and 23.5% lacked the time.Footnote45 Other studies have identified similar issues. One study of first-year medical students (384 respondents) found that 26% agreed that MCQ authoring, which targeted higher-order thinking, was more challenging and time-consuming than anticipated.Footnote46 In a study of three cohorts of second-year biology students (226, 193 and 237 students, respectively; number of respondents unclear) who were asked to author MCQs on PeerWise, 92% agreed that PeerWise activities improved their understanding of the course by “a lot”. However, the following year only 55% agreed.Footnote47 The negative responses largely concerned the peer-assessment MCQ rating feature on PeerWise, which could be used to generate “peer-dependent” PeerWise scores,Footnote48 although again some students believed that the effort expended for the assessment was not adequately reflected in the marks available (4% of course assessment in the first and second years and 5% of course assessment in the third year).Footnote49

Several studies have investigated the effect of MCQ authoring on student marks. A study of science students (854 participants across five courses) found a “modest but positive relationship” between student activity on PeerWise and exam performance, even with prior ability factored in,Footnote50 and students with lower or intermediate ability levels were found to have benefitted the most.Footnote51 However, it was noted that it is unclear which aspects of PeerWise led to these benefits – whether MCQ authoring, answering or discussing others’ MCQs, or providing ratings and peer feedback on MCQs.Footnote52

Other studies have reached similar conclusions. A study of second-year biomedical science students (107 respondents) found a significant correlation between students’ PeerWise scores and overall course marks.Footnote53 Similarly, a study of second-year biology students across three different cohorts (226, 193 and 237 respondents) found a significant correlation between students’ PeerWise scores and overall coursework and exam marks.Footnote54 However, there was no statistically significant correlation between the quality of a student’s MCQs and their overall performance in the course.Footnote55 The authors suggested that any improved academic performance following engagement on PeerWise should not be solely attributed to students authoring high-quality MCQs.Footnote56 Rather, improved academic performance seemed to result from the wider set of activities offered by PeerWise – for example, authoring, answering, discussing and providing feedback on MCQs.Footnote57 While these findings indicate clear learning benefits from engaging with PeerWise over ordinary “drill-and-practice” MCQ answering,Footnote58 they do not indicate the benefits of MCQ authoring alone. Since PeerWise is a tool that also provides other functions which facilitate peer-to-peer discussion and collaborative learning, the results of PeerWise-related studies should be taken not as demonstrating the benefits of MCQ authoring in isolation, but rather as indicating the benefits of requiring or inviting students to use the tool.

One study suggested that it was not authoring and answering MCQs but overall engagement with PeerWise that led to improved exam performance. In that study, only the student’s activity on others’ MCQs and the number of days on which the student was active correlated with exam performance.Footnote59 However, these results should not be taken to suggest that MCQ authoring alone outside the PeerWise tool yields no benefits for student performance. The potential benefits of MCQ authoring on learning are well documented. Further research is needed to determine the impact of MCQ authoring specifically on student performance.

Some studies have focused on the quality of the MCQs authored by students. The student-generated MCQs should be correct because they will be available to other students to use for their learning. The correctness of MCQs is also of interest to teachers because it helps to inform them whether students are engaging with the activity appropriately and effectively.Footnote60

Other studies have considered whether students are engaging in a “deep approach” to learning. Such studies have acknowledged that it is difficult to measure the depth of learning other than by relying on students’ own evaluations of their learning. It is also unclear what the quality of students’ MCQs indicates about their cognitive engagement and depth of learning.Footnote61 One study has produced evidence that MCQ authoring targets higher cognitive levels of thinking.Footnote62 However, another study has observed that over 50% of student-authored MCQs “included at least one rote memorisation sub-part”, testing recall rather than application.Footnote63 This might indicate that it took more work for many students to generate cognitively demanding MCQs testing the application of knowledge than MCQs testing simple memory recall. In one study, while 91.2% of student-authored MCQs were correct, most MCQs were classified as targeting the question-taker’s lower-order thinking only, when benchmarked against Bloom’s taxonomy.Footnote64 Nonetheless, the authors were confident that the MCQ-authoring process, the creation of distractors and explanation of answers were tasks that necessarily involved deep engagement with the course material.Footnote65 Even authoring MCQs that require only low levels of cognitive engagement to construct has been shown to, at minimum, foster student engagement by making students familiarise themselves with the relevant course materials.Footnote66

The literature on student quiz design identifies numerous benefits to student learning, such as encouraging deeper thinking and stronger engagement with the course materials. Overall, student attitudes about MCQ authoring as assessment are largely positive, although some commonly identified challenges emerge, such as difficulty and workload issues. Of course, the design choices for the assessment – including the time allocated, the number of MCQs required, the quantity and difficulty of the course material, whether the assessment is compulsory, optional for extra credit or optional for no credit, and the workload and pressure in the overall course – are each likely to affect student responses, which may limit the usefulness of comparing studies.

Quiz design assessment

The quiz design assessment requires students to design a MCQ quiz based on a course reading. Students select a reading on which to author a quiz. The reading may be one source, a section (page range) of a source or multiple short sources. The total number of words for each reading option is approximately the same. Students are provided with a list of readings and an online selection form, which goes live at an advertised date and time; students select their reading on a first-in, first-served basis.

Students read and comprehend their chosen reading and then begin authoring their MCQ quiz. Students are provided with a template on which to author their MCQ quiz. The template ensures that each student’s MCQ quiz is formatted the same, which helps the marker focus on the substance of the MCQs rather than any formatting, and makes the process of inputting several MCQ quizzes into a learning management system’s quiz tool, such as Canvas quizzes, more efficient. The template asks students for their student ID number and the reading for their quiz. The template then provides shells for each of the student’s MCQs; a sketch of the fields a shell collects is set out below.
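The following is a minimal sketch of the information each shell collects, represented as a Python dictionary purely for illustration. It is based on the description of the assessment in this article; the field names are illustrative labels and are not the template’s actual wording.

# Illustrative only: a Python representation of the fields a MCQ "shell"
# collects, based on the assessment description. Field names are assumed
# labels, not the template's actual wording.
mcq_shell = {
    "stem": "",                  # the question itself
    "options": {
        "a": "",                 # by the assessment's convention, option (a) is the correct answer
        "b": "",                 # distractor
        "c": "",                 # distractor
        "d": "",                 # distractor
    },
    "explanations": {
        "a": "",                 # why option (a) is correct
        "b": "",                 # why option (b) is incorrect
        "c": "",                 # why option (c) is incorrect
        "d": "",                 # why option (d) is incorrect
    },
    "reading_pinpoints": "",     # the pinpoints a quiz-taker would need to read to solve the MCQ
    "quoted_sentences": "",      # the relevant sentences quoted from the reading
    "why_good_question": "",     # brief explanation of why the MCQ is a good question
}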

Students are also provided with an instructions document. Students are asked to assume that a quiz-taker would be presented with the MCQs in the order the student presents them. A learning management system’s quiz tool will require the designation of an option as the correct answer. When inputting MCQs, the author has found that the process is quicker and less error-prone if each student’s correct answer is consistently the same option (either the first or the last). Therefore, students are instructed to make option (a) the correct answer and options (b), (c) and (d) the distractors. Students are told that the learning management system on which other students could take the quiz would scramble the option order. Because the options could be presented to a quiz-taker in any order, students are instructed to ensure the four options make sense in any order.Footnote67 Students are also asked to place options like “All of the options” and “None of the options” in square brackets, which can help to enhance clarity, such as for MCQs requiring students to complete a proposition.Footnote68
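As a minimal sketch (illustrative only, and not part of the assessment materials or any particular learning management system’s API), the following Python snippet shows how an LMS-style delivery might scramble a template MCQ’s options while tracking which presented option corresponds to the keyed answer. This is why fixing option (a) as correct in the template does not reveal the answer to quiz-takers.

import random

# Illustrative only: scramble the options of a template MCQ (where option (a)
# is always the keyed correct answer) and track where the correct answer lands.
options = {
    "a": "The keyed correct answer",
    "b": "Distractor one",
    "c": "Distractor two",
    "d": "Distractor three",
}

def deliver(options):
    """Return the options in a random presentation order and the presented
    letter that corresponds to the template's option (a)."""
    items = list(options.items())
    random.shuffle(items)
    presented = {}
    correct_letter = None
    for presented_letter, (template_letter, text) in zip("abcd", items):
        presented[presented_letter] = text
        if template_letter == "a":
            correct_letter = presented_letter
    return presented, correct_letter

presented, key = deliver(options)
print(presented)
print("Correct answer appears as option:", key)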

Students are asked to explain why option (a) is correct and the other options are incorrect. This prompts students to deeply engage with the reading, ensure their MCQ is accurate and logical and think critically about their MCQ.Footnote69 Students are also asked to note the reading pinpoints that a student would need to read to solve the MCQ, and quote the relevant sentences. This prompts students to closely familiarise themselves with the reading – especially to check that no distractors are inadvertently supported by the reading. The information also helps the marker to ascertain whether the MCQ is accurate and logical, and, therefore, suitable as a MCQ to be shared with quiz-takers.Footnote70 Students are further asked to briefly explain why the MCQ is a good question. This requires students to reflect on why they selected that aspect of the reading for a MCQ, why they authored the MCQ that way, why they provided those options, what the MCQ contributes to their MCQ quiz, and how the MCQ complements the quiz-taker’s learning in the rest of the MCQ quiz.Footnote71

Students are provided with two examples of MCQ quizzes authored by students for a trial of the assessment prior to the study period.Footnote72 Students are also provided with a marking rubric. The rubric includes criteria for learning, as well as criteria to facilitate the marking and inputting processes.

Finally, students are told that the anonymised quizzes may be made available for other students to take in that course and future iterations of the course. Students can indicate on their quiz, or let the lecturer know at any time up to one month after the course, if they do not wish for their quiz to be provided to others.Footnote73 The assessment does not have a peer rating component. This sidesteps possible student concerns about the peer-dependence of the PeerWise rating feature and PeerWise score: such concerns have been noted by science studentsFootnote74 and could be exacerbated in law schools,Footnote75 which are known to be particularly competitive.Footnote76

Methodology

The author used the quiz design assessment in three undergraduate LLB degreeFootnote77 elective coursesFootnote78 for which he was the course director and lecturer. In each course, the assessment was worth 10% of the course assessment, but the number of MCQs to write and the course points differed. The assessment was mandatory, meaning that students who did not attempt it forfeited 10% of their marks for the course.

In each course, the reading options were approximately the same number of pages.Footnote79 The number of pages for each option was about 2.5 times the number of MCQs the students were asked to author.Footnote80 The options were categorised on three dimensions. First, the options were one of two source counts: single; or multiple. Second, the options were one of three source compositions: complete; part; or composite complete and part.Footnote81 Third, the options were one of three source types: primary sources (for example, legislation or cases); commentary (for example, scholarship, legislation notes or case notes); or mixed primary and commentary. Thus, there were 13 option categories:Footnote82 (1) single-source complete-composition primary-type;Footnote83 (2) single-source complete-composition commentary-type;Footnote84 (3) single-source part-composition primary-type;Footnote85 (4) single-source part-composition commentary-type;Footnote86 (5) multiple-source complete-composition primary-type;Footnote87 (6) multiple-source complete-composition commentary-type;Footnote88 (7) multiple-source complete-composition mixed-type;Footnote89 (8) multiple-source part-composition primary-type;Footnote90 (9) multiple-source part-composition commentary-type;Footnote91 (10) multiple-source part-composition mixed-type;Footnote92 (11) multiple-source composite-composition primary-type;Footnote93 (12) multiple-source composite-composition commentary-type;Footnote94 and (13) multiple-source composite-composition mixed-type.Footnote95 Students could choose an option that suited their interests.
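Why 13 categories rather than the full 2 × 3 × 3 = 18 combinations? A single source cannot form a composite of complete and part sources, and a single source cannot be of mixed primary-and-commentary type. The following Python sketch (an illustration added for clarity, not part of the study) enumerates the valid combinations under those two constraints and reproduces the 13 categories in the order listed above.

from itertools import product

# Illustrative only: enumerate the valid reading-option categories, assuming
# two constraints implied by the list above: a single source cannot have a
# "composite" composition, and a single source cannot be of "mixed" type.
source_counts = ["single", "multiple"]
compositions = ["complete", "part", "composite"]
source_types = ["primary", "commentary", "mixed"]

def is_valid(count, composition, source_type):
    if count == "single" and composition == "composite":
        return False
    if count == "single" and source_type == "mixed":
        return False
    return True

categories = [c for c in product(source_counts, compositions, source_types) if is_valid(*c)]
for number, (count, composition, source_type) in enumerate(categories, start=1):
    print(f"({number}) {count}-source {composition}-composition {source_type}-type")
print(len(categories))  # 13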

The first course, which will be referred to as C1(1.00), was Contemporary Issues in Land Law in semester 1, 2021. Students in C1(1.00) were asked to design a 10-MCQ quiz on a course-related reading option for 10% of the assessment for the course.Footnote96 This represents 10 MCQs for a 10-point course (requiring 100 hours of learning), which is a 1.0 ratio of workload (MCQs to write) to course points. The second course, which will be referred to as C2(0.67), was Māori Land Law in semester 2, 2021. Students in C2(0.67) were asked to design a 10-MCQ quiz on a course-related reading option for 10% of the assessment for the course.Footnote97 This represents 10 MCQs for a 15-point course (requiring 150 hours of learning), which is a 0.67 ratio of workload (MCQs to write) to course points. The third course, which will be referred to as C3(0.33), was Cultural Property and Indigenous Intellectual Property in semester 2, 2022. Students in C3(0.33) were asked to design a five-MCQ quiz on a course-related reading option for 10% of the assessment for the course.Footnote98 This represents five MCQs for a 15-point course (requiring 150 hours of learning), which is a 0.33 ratio of workload (MCQs to write) to course points.
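For clarity, the workload-to-course-points ratio used in the course labels is simply the number of MCQs to write divided by the course points. A small illustrative calculation (the course names are as given above) is set out below.

# Illustrative only: the workload-to-course-points ratio for each iteration.
courses = {
    "C1": {"mcqs": 10, "points": 10},   # Contemporary Issues in Land Law, semester 1, 2021
    "C2": {"mcqs": 10, "points": 15},   # Māori Land Law, semester 2, 2021
    "C3": {"mcqs": 5, "points": 15},    # Cultural Property and Indigenous Intellectual Property, semester 2, 2022
}
for name, c in courses.items():
    ratio = c["mcqs"] / c["points"]
    print(f"{name}({ratio:.2f}): {c['mcqs']} MCQs / {c['points']} points")
# Output labels: C1(1.00), C2(0.67), C3(0.33)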

In summary, the assessment was the same in each iteration, except there was a reduction in the number of MCQs to write relative to the points value of the course. The reduction responded to student feedback in a pre-study trial of the assessmentFootnote99 that the assessment required a lot of work relative to the assessment weighting and course points.Footnote100 The author acknowledges there is increasing commodification in higher educationFootnote101 and advises a cautious (not strict) adherence to workload-to-course points ratios when designing courses.

Fifteen out of 36 students (41.7%) participated in C1(1.00). Twenty-four out of 39 students (61.5%) participated in C2(0.67). Thirty-six out of 61 students (59.0%) participated in C3(0.33).

In each course, students were invited to complete a survey on the assessment.Footnote102 The survey went live after the assessment was submitted and closed before a mark was received for it. The surveys included five-point Likert scales (strongly agree, agree, neither agree nor disagree, disagree, and strongly disagree), open-ended boxes for students to elaborate, and other open-ended questions. The students’ responses to each open-ended question were coded. The coding process involved, for each question, creating codes by identifying themes in the students’ responses, and using those codes to label and organise the students’ responses.

Results and discussion

The article now presents and discusses the results. When results are provided in brackets, they are provided in the order C1(1.00), C2(0.67) and C3(0.33) unless otherwise noted. The article uses “generally agree” to mean either strongly agree or agree and “generally disagree” to mean either disagree or strongly disagree.

Interest, engagement, challenge and enjoyment

Students generally agreed (86.7%, 95.8% and 97.2%) that authoring a MCQ quiz increased their interest in the content of the reading for which they authored a MCQ quiz, with the percentage increasing as the workload-to-course points ratio decreased. Many students strongly agreed (66.7%, 75.0% and 66.7%), and most students generally agreed (93.3%, 100.0% and 97.2%), that authoring a MCQ quiz helped them to engage more deeply with the reading for which they authored a MCQ quiz. The coded open-ended responses indicate that some students considered the assessment to require cognitive engagement and deeper learning.Footnote103 The assessment is designed to prompt cognitive engagement and deeper learningFootnote104 by requiring students to apply their knowledge of the readings through processing, integrating and reconstructing their understanding.Footnote105 The coded open-ended responses further indicate that many students were conscious of this cognitive engagement and deeper learning. Furthermore, students generally agreed (80.0%, 62.5% and 75.0%) that the assessment would likely increase their engagement with the course materials generally. However, the lower percentages for this question suggest that the assessment increased student engagement with the particular reading for which they authored a MCQ quiz more than with the course materials generally.

Students generally agreed (86.7%, 95.8% and 72.2%) that the assessment was challenging, a common comment in previous studies.Footnote106 Students perceived the assessment to be more challenging for the two courses requiring a 10-MCQ quiz and less challenging for the course requiring a five-MCQ quiz.

Students generally agreed that the assessment was more enjoyable than working on a typical research essay (66.7%,Footnote107 91.7%, 94.4%) or a typical problem question (73.3%, 79.2%, 88.9%). The coded open-ended responses indicate that not needing to do independent research to complete the assessment contributed to some students’ enjoyment. A student in C1(1.00) who responded that the assessment was less enjoyable than working on a typical research essay or a typical problem question explained that MCQ quiz authoring took too long in comparison. C1(1.00) had the highest workload-to-course points ratio; no students in C2(0.67) or C3(0.33) suggested that the assessment was less enjoyable than the alternatives because it took too long in comparison.

Time and work

In previous studies, students had commonly noted that the MCQ-authoring assessment took a lot of time and work to complete.Footnote108 This study sought to compare students’ perceptions of the time it took to complete the assessment with the time it takes to complete typical law assessments. The self-reported time it took students to complete the assessment did not uniformly decrease as the workload-to-course points ratio decreased. However, the self-reported time it took students to complete the assessment did decrease as the number of MCQs decreased from 10 MCQs for C1(1.00) and C2(0.67) to five MCQs for C3(0.33). A 10-question quiz took about 6–15 hours for a clear majority (69.2% and 68.4%) of students and a five-question quiz took about 1–5 hours for a clear majority (74.3%) of students (Table 1).Footnote109

Table 1. Student responses to “How many hours did it take you to complete the assessment?”.

There were a couple of outliers. In C2(0.67), a student considered their reading difficult to comprehend, which made the assessment more complex for them; the student reported that the assessment took them 48–60 hours. Also, in C2(0.67), a student felt that generating quality distractors was challenging; the student reported that the assessment took them 60 hours. However, this student also believed that this was the usual amount of time they might spend on a 10% assignment for a course with that points value. The outliers are consistent with concerns about workload raised by some students in prior studies. Overall, the responses on workload indicate that each student’s attitudes about the assessment will be influenced by the reading on which the student authored their MCQ quiz.

Students were asked how the time spent on this assessment compared with the time they would usually spend on a law assessment worth 10%.Footnote110 The percentage of students who considered the assessment took more time than a typical law assessment decreased as the workload-to-course points ratio decreased (50.0%, 33.3% and 12.0%), and the percentage of students who considered the assessment took less time than a typical law assessment increased as the workload-to-course points ratio decreased (30.0%, 33.3% and 72.0%). As mentioned, some studies have reported student concerns about the time and work required to complete a quiz design assessment. In this study, students were divided on whether a 10-question quiz takes more time, the same time or less time than a typical law assessment, and a clear majority of students considered that a five-question quiz takes less time than a typical law assessment.

Students generally agreed (66.7%, 87.5% and 97.2%) that the number of MCQs they needed to write was fair given the assessment weighting and course points value. As expected, the percentage steadily increased as the workload-to-course points ratio decreased. In C1(1.00), one student believed that 10 MCQs were too many for the 10% assessment in a 10-point course. Conversely, in C3(0.33), one student believed that five MCQs were too few for the assessment weighting in a 15-point course. The coded open-ended responses indicate that, while some students in C1(1.00) and C2(0.67) believed that there should have been fewer MCQs or the assessment should have been weighted more, no students suggested this in C3(0.33), perhaps indicating that requiring five MCQs in an assessment weighted 10% in a 15-point course (requiring 150 hours of learning) is a lower workload limit for the assessment.

Directions and guidance

Students were asked to reflect on the assessment design. Many students strongly agreed (73.3%, 87.5% and 72.2%), and most students generally agreed (100.0%, 95.8% and 97.2%), that the assessment instructions were clear. Many students strongly agreed (73.3%, 79.2% and 63.9%), and most students generally agreed (93.3%, 100% and 91.7%), that the marking rubric was clear; no student generally disagreed.

Many students strongly agreed (66.7%, 75.0% and 61.1%), and most students generally agreed (86.7%, 95.8% and 88.9%), that the provision of the two example quizzes authored by previous students was helpful. Previous studies have shown that students find guidance on MCQ authoring beneficial.Footnote111 Teachers should consider the type of readings used as examples. The coded open-ended responses indicate that students who wrote their MCQ quiz on a case reading found the examples – which were on an article and book chapter reading, respectively – generally less helpful.

Learning and skills development

Many students strongly agreed (53.3%, 50.0% and 72.2%), and most students generally agreed (100.0%, 100.0% and 88.9%), that the assessment gave them a good opportunity to demonstrate an understanding of their reading; no student generally disagreed. This result is consistent with the literature, which indicates that authoring a MCQ quiz requires students to understand the relevant content before they can author a quality MCQ and exposes a lack of understanding.Footnote112

Similarly, students generally agreed (93.3%, 91.7% and 86.1%) that the assessment improved their critical reading skills. However, one student who disagreed in C2(0.67) explained that the nature of the assessment means the MCQs need to align with the perspective of the reading’s author, such that the assessment lacked scope for subjective critical analysis.

Students were asked what other skills the assessment helped them to demonstrate or develop. The coded open-ended responses indicate that many students mentioned comprehension and understanding. In C1(1.00), in which readings tended to comprise multiple short sources, around half of the students mentioned comprehension and understanding; and in C2(0.67) and C3(0.33), in which readings tended to be one source or a section of one source, around a quarter of students mentioned comprehension and understanding. Students may have felt that multiple-source options encouraged greater comprehension and understanding than single-source options, even when the teacher ensures the number of pages for each option is approximately the same.Footnote113 In response, the author has increased the ratio of multiple-source options to single-source options in more recent iterations of the quiz design assessment.

The coded open-ended responses indicate that some students believed the assessment helped them to consider the perspectives of teachers and learners. Such a response demonstrates that MCQ authoring supports co-creation as students are collaborating with the teacher to design assessments for other students to take in that course and future iterations of the course.Footnote114 It is not only other students who benefit from co-creation. The open-ended responses suggest that co-creation prompted some students to critically reflect on the purpose of the quiz design assessment, which could open the door to several insights about what skills are important for a law graduate, such as close reading, knowledge distillation and collaboration.

Students were also asked how the assessment could be improved in future to help students demonstrate or develop skills. The coded open-ended responses indicate some students believed more guidance could be provided, with one student suggesting a walkthrough guide.Footnote115 Some students in C2(0.67) and C3(0.33), for which the reading was one source or a section of one source, noted the lack of substance in their reading and suggested that multiple sources be included in each reading. Further, one student in C2(0.67) suggested that the template could be more flexible to allow for the inclusion of diagrams or images, which PeerWise permits. Before permitting flexibility, teachers should check whether their learning management system’s quiz tool allows for diagrams or images.

Influence on behaviours

Students were asked how the assessment influenced how they would approach other readings in the future. Students generally agreed (80.0%, 62.5% and 75.0%) that the assessment made them more likely to closely read other course readings in the course. The coded open-ended responses indicate that many students would seek a deeper comprehension and understanding, and only a few students indicated that the quiz design assessment would not influence their approach to readings in the future. Students also generally agreed (86.7%, 83.3% and 94.4%) that taking other students’ quizzes would help their understanding of the course readings.

Students generally agreed (80.0%, 79.2% and 75.0%) that the assessment made them want to take other students’ quizzes. However, one student in each of C1(1.00) and C3(0.33) strongly disagreed.

Students were asked whether they were actively mindful while authoring their MCQ quiz that the quiz would be shared with other students. Many students (75.0%, 64.7% and 78.3%) reported being actively mindful that their quiz would be shared with other students, whereas a smaller but sizable percentage (25.0%, 35.3% and 21.7%) reported that they were not.Footnote116 The assessment incorporates contributing student pedagogy and so is intended to encourage students to consider other students’ perspectives. The coded open-ended responses indicate that designing content for other students motivated some students to produce quality work.Footnote117

However, despite being told that the anonymised quizzes may be made available for other students to take in that course and future iterations of the course, some students did not register that their quiz might be shared. Some interesting inferences can be made from these students’ open-ended responses. First, they did not put in as much effort as they might have had they realised the quiz would be shared. This indicates that a sense of community responsibility motivates some students’ work on their MCQ quizzes. Second, they assumed that other students had likewise not registered that the MCQ quizzes might be shared and so had not put in as much effort, such that other students’ quizzes would not be of good quality. Thus, they were unsure whether taking other students’ quizzes would be worthwhile.Footnote118

Ultimately, students generally agreed (80.0%, 70.8% and 75.0%) that the assessment being one of the assessments for the course increased their desire to take the course.

Flexibility

The coded open-ended responses indicate what the students liked and disliked about the assessment. Some students liked that the assessment provided greater flexibility than other forms of assessment and encouraged creativity.Footnote119 Also, some students liked that the assessment was novel and provided a change from essay and problem answeringFootnote120 – the signature pedagogical practices of the law school.Footnote121 However, the novelty of the assessment, including a lack of prior experience with the form of assessment, presented a challenge for some students.

Students were asked whether there should be more variety in the forms of assessment in the LLB. Many students strongly agreed (73.3%, 62.5% and 88.9%), and most students generally agreed (93.3%, 95.8% and 97.2%), that there should be more variety in the forms of assessment in the LLB.

Some students also liked that the assessment was a lower stakes assessment at 10% of the course assessment, contending that smaller, frequent assessments were more effective than longer, less frequent assessments. However, some students in C1(1.00) and C2(0.67) believed the assessment should have been worth more than 10% of the course assessment.Footnote122 Interestingly, when the number of MCQs was reduced to five in C3(0.33), no students believed that the assessment should have been worth more than 10% of the course assessment, indicating an appropriate balance between workload and course credit.

Students generally agreed (93.3%, 95.8% and 94.4%) that they liked it when an assessment allowed them more choice or flexibility in approaching it. This reflects themes in the literature that MCQ authoring is considered novel and innovative for many students.Footnote123 Students generally agreed (86.7%, 95.8% and 91.7%) that the quiz design assessment allowed them more choice about what subject to write their assessment on than for a typical assessment at law school. However, some students considered the scope of the assessment too narrow or limiting. The teacher must ensure that the reading options allow students to engage with the course materials deeply.

Ultimately, even if the coded open-ended responses indicate some constructive criticism, students generally agreed (93.3%, 95.8% and 100.0%) that the assessment was valuable for their learning; no student generally disagreed. The percentage was similar for each course, suggesting that the perceived value of the assessment was not affected by the workload-to-course points ratio.

Conclusion

Overall, students responded positively to the assessment. First, on the interest, engagement, challenge and enjoyment generated by the assessment, students generally agreed that it increased their interest in the content of the reading for which they authored a MCQ quiz; increased their engagement with the course materials generally; was challenging; and was more enjoyable than working on a typical research essay or a typical problem question. Second, on the time and work involved, students generally agreed that the time required was on a par with or less than that required for a typical law assessment; and the number of MCQs they needed to write was fair given the assessment weighting and course points value. Third, on the directions and guidance provided, students generally agreed that the assessment instructions and marking rubric were clear; and the provision of the two example quizzes authored by previous students was helpful. Fourth, on how the assessment helped them to learn and develop, students generally agreed that the assessment gave them a good opportunity to demonstrate an understanding of their reading; and the assessment improved their critical reading skills. Fifth, on how the assessment influenced their behaviours, students generally agreed that the assessment made them more likely to closely read other course readings in the course; the assessment made them want to take other students’ quizzes; and the inclusion of the assessment in the course increased their desire to take the course. Finally, on the flexibility of the assessment, students generally agreed that they had more choice in what questions or topics to write their quiz design assessment on than for a typical assessment at law school. Ultimately, students generally agreed that the assessment was valuable for their learning. Accordingly, the study endorses the use of quiz design as assessment in non-STEM disciplines like law.

Teachers looking to trial quiz design as an assessment in their courses should consider how many sources and pages comprise each reading, how many MCQs are required and how much the assessment is worth. This study endorses a five-MCQ quiz for 10% on a reading, comprising multiple sources or a varied source, in a course requiring 150 hours of learning. Teachers should also note that the novelty of the assessment, while appreciated by many students, presents a challenge for some students. Teachers should front-foot this by providing clear instructions, highlighting that the students’ quizzes may be shared with other students, and sharing exemplar MCQ quizzes for each type of source on which students may write a MCQ quiz.

Acknowledgements

The author thanks Oriel Kelly for her advice and Jason Coates, Nick Stewart, Sian Vaughan-Jones, Arela Jiang and Lauren Millington for their research assistance. The author also thanks the students who participated in the study. The results are rounded to one decimal place. Any errors remain my own.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes

1 The three courses followed a trial in semester 2, 2020, which was before the author decided to study the assessment formally. That trial helped the author to refine the assessment before it was studied formally in the three courses in 2021–2022.

2 In doing so, it introduces PeerWise, an online tool, which is the closest comparator for the quiz design assessment: “Information for Instructors” (PeerWise) <https://peerwise.cs.auckland.ac.nz/docs/instructors> accessed 1 December 2023.

3 In the law context, students have suggested that tests, legal essays and memoranda are less engaging than authentic assessments like submissions to an entity, such as a tribunal or parliamentary select committee. See Dianne Thurab-Nkhosi, Gwendoline Williams and Maria Mason-Roberts, “Achieving Confidence in Competencies through Authentic Assessment” (2018) 37 Journal of Management Development 652, 653–54; and Toni Collins, “Authentic Assessment – The Right Choice for Students Studying Law?” (2022) 32 Legal Education Review 1, 8, 10, 13. See also Caroline Hart and others, “The Real Deal: Using Authentic Assessment to Promote Student Engagement in the First and Second Years of a Regional Law Program” (2011) 21 Legal Education Review 97, 116.

4 Christina Donnelly, “The Use of Case Based Multiple Choice Questions for Assessing Large Group Teaching: Implications on Student’s Learning” (2014) 3(1) Irish Journal of Academic Practice 1, 5.

5 Wendy McKenzie and John Roodenburg, “Using PeerWise to Develop a Contributing Student Pedagogy for Postgraduate Psychology” (2017) 33(1) Australasian Journal of Educational Technology 32, 33.

6 Stephen W Draper, “Catalytic Assessment: Understanding How MCQs and EVS Can Foster Deep Learning” (2009) 40 British Journal of Educational Technology 285, 285.

7 Paul Denny, Andrew Luxton-Reilly and John Hamer, “The PeerWise System of Student Contributed Assessment Questions” (paper presented to Tenth Australasian Computing Education Conference, Wollongong, NSW, Australia, January 2008) 69.

8 See Draper (n 6) 289. See also Steven Bottomley and Paul Denny, “A Participatory Learning Approach to Biochemistry Using Student Authored and Evaluated Multiple-Choice Questions” (2011) 39 Biochemistry and Molecular Biology Education 352, 357.

9 Rebecca Grainger and others, “Medical Students Create Multiple-Choice Questions for Learning in Pathology Education: A Pilot Study” (2018) 18 BMC Medical Education, Article 201, 1–2.

10 ibid 2.

11 Andrew Luxton-Reilly and Paul Denny, “Constructive Evaluation: A Pedagogy of Student-Contributed Assessment” (2010) 20 Computer Science Education 145, 151.

12 Josh B Kurtz and others, “Creating Assessments as an Active Learning Strategy: What Are Students’ Perceptions? A Mixed Methods Study” (2019) 24(1) Medical Education Online, Article 1630239.

13 Elaine Doyle and Patrick Buckley, “The Impact of Co-Creation: An Analysis of the Effectiveness of Student Authored Multiple Choice Questions on Achievement of Learning Outcomes” (2022) 30 Interactive Learning Environments 1726, 1727.

14 Tracey Papinczak and others, “Using Student-Generated Questions for Student-Centred Assessment” (2012) 37 Assessment & Evaluation in Higher Education 439, 440.

15 Katrina Falkner and Nickolas JG Falkner, “Supporting and Structuring ‘Contributing Student Pedagogy’ in Computer Science Curricula” (2012) 22 Computer Science Education 413, 414.

16 John Hamer and others, “Contributing Student Pedagogy” (2008) 40 ACM SIGCSE Bulletin 194, 195.

17 Falkner and Falkner (n 15) 414. See also Luxton-Reilly and Denny (n 11) 148–49.

18 Hamer and others (n 16) 195.

19 Falkner and Falkner (n 15) 416.

20 Denny, Luxton-Reilly and Hamer (n 7) 71.

21 ibid 72.

22 ibid 71.

23 STEM stands for science, technology, engineering and mathematics.

24 See generally “List of Publications Relating to PeerWise” (PeerWise) <https://peerwise.cs.auckland.ac.nz/docs/publications> accessed 1 December 2023.

25 Paul Denny and others, “PeerWise: Students Sharing Their Multiple Choice Questions” (paper presented to Fourth International Workshop on Computing Education Research, Sydney, NSW, Australia, September 2008) 1.

26 “Information for Instructors” (n 2).

27 ibid.

28 ibid.

29 “PeerWise – Collaborative Student Learning” (YouTube, 4 August 2010) <www.youtube.com/watch?v=j1tN006KEWo> accessed 1 December 2023.

30 “A Guide for Students” (PeerWise) <https://peerwise.cs.auckland.ac.nz/docs/students> accessed 1 December 2023.

31 “Information for Instructors” (n 2).

32 “A Guide for Students” (n 30).

33 “Information for Instructors” (n 2).

34 ibid.

35 “PeerWise – Collaborative Student Learning” (n 29). See “An Introduction in 90 Seconds” (YouTube, 24 July 2014) <www.youtube.com/watch?v=x3WEJw26nkE> accessed 1 December 2023.

36 Students are advised they can improve their score by: authoring relevant, high-quality MCQs with well-considered alternatives and clear explanations; answering MCQs thoughtfully; rating MCQs fairly and leaving constructive feedback; and using PeerWise early, as the score increases over time based on the student’s contribution history. “A Guide for Students” (n 30). See Paul Denny, “Scoring: for Fun and Extra Credit!” (PeerWise) <https://peerwise.cs.auckland.ac.nz/docs/community/scoring_for_fun_and_extra_credit> accessed 1 December 2023. See also “Community Resources” (PeerWise) <https://peerwise.cs.auckland.ac.nz/docs/community/resources> accessed 1 December 2023.

37 “PeerWise – Collaborative Student Learning” (n 29).

38 Foong May Yeong, Cheen Fei Chin and Aik Ling Tan, “Use of a Competency Framework to Explore the Benefits of Student-Generated Multiple-Choice Questions (MCQs) on Student Engagement” (2020) 15 Pedagogies: An International Journal 83, 98.

39 Bottomley and Denny (n 8) 356.

40 Grainger and others (n 9) 4–6.

41 Kurtz and others (n 12) 6.

42 ibid 7.

43 ibid 7.

44 Clare Guilding and others, “Answering Questions in a Co-Created Formative Exam Question Bank Improves Summative Exam Performance, While Students Perceive Benefits from Answering, Authoring, and Peer Discussion: A Mixed Methods Analysis of PeerWise” (2021) 9(4) Pharmacology Research & Perspectives, Article e00833, 8.

45 ibid 7.

46 Papinczak and others (n 14) 444.

47 HA McQueen and others, “PeerWise Provides Significant Academic Benefits to Biological Science Students Across Diverse Learning Tasks, But with Minimal Instructor Intervention” (2014) 42 Biochemistry and Molecular Biology Education 371, 377.

48 The researchers in that study note that negative responses were “almost always connected to the peer-dependent element” of the rating feature; and that concerns about peer dependence could be exacerbated in competitive class environments, indicating that some students might unfairly rate their peers: ibid 377. The researchers in that study did not expressly link the negative responses to findings in the broader literature that students value peer feedback less than teacher feedback. However, this seems plausible – for example, because students distrust that their peers have the requisite knowledge to identify issues and the ability to provide high-quality constructive feedback. See, for example, Ngar-Fun Liu and David Carless, “Peer Feedback: The Learning Element of Peer Assessment” (2006) 11 Teaching in Higher Education 279; Thu Thuy Vu and Gloria Dall’Alba, “Students’ Experience of Peer Assessment in a Professional Course” (2007) 32 Assessment & Evaluation in Higher Education 541; and Qiyun Zhu and David Carless, “Dialogue within Peer Feedback Processes: Clarification and Negotiation of Meaning” (2018) 37 Higher Education Research and Development 883.

49 McQueen and others (n 47) 377.

50 Judy Hardy and others, “Student-Generated Content: Enhancing Learning through Sharing Multiple-Choice Questions” (2014) 36 International Journal of Science Education 2180, 2191. There were about 988 total students across the five courses; the 854 students were the “PeerWise active” students.

51 ibid 2192.

52 ibid 2192.

53 Bottomley and Denny (n 8) 357.

54 McQueen and others (n 47) 374.

55 ibid 375.

56 ibid 379.

57 See ibid 379. The students were also provided introductory sessions on PeerWise and extra support sessions on MCQ authoring.

58 ibid 379.

59 Denny and others (n 25) 5–6.

60 Bottomley and Denny (n 8) 355.

61 ibid 357.

62 Yeong, Chin and Tan (n 38) 89.

63 Papinczak and others (n 14) 445.

64 Bottomley and Denny (n 8) 354 and 355–56. See BS Bloom and others, Taxonomy of Educational Objectives: The Classification of Educational Goals – Handbook I: Cognitive Domain (David McKay 1956). Bloom’s original taxonomy comprises six categories of educational goals, ordered from simple to complex: knowledge, comprehension, application, analysis, synthesis and evaluation. Lower-order thinking typically occurs in the knowledge, comprehension and application categories. See also David R Krathwohl, “A Revision of Bloom’s Taxonomy: An Overview” (2002) 41 Theory into Practice 212 for the revised taxonomy, which recasts the categories as cognitive processes (remember, understand, apply, analyse, evaluate and create), in which remembering, understanding and applying are the lower-order thinking skills.

65 Bottomley and Denny (n 8) 356.

66 Yeong, Chin and Tan (n 38) 101.

67 For example, “All of the above” should be rephrased as “All of the options”.

68 For example, “The Interpretation Act provides that: … (d) [All of the options]”.

69 See Yeong, Chin and Tan (n 38) 98 for a discussion of student engagement with materials.

70 See Denny, Luxton-Reilly and Hamer (n 7) 71.

71 See Papinczak and others (n 14) 445 for a discussion of the higher cognitive levels of thinking that MCQ quiz authoring can facilitate.

72 Students in the three courses studied for this article were provided with the same examples of MCQ quizzes. See n 1.

73 So far, no students have opted out.

74 See McQueen and others (n 47) 377.

75 Many law students resist collaborative and peer assessment. See, for example, Clifford S Zimmerman, “Thinking beyond My Own Interpretation: Reflections on Collaborative and Cooperative Learning Theory in the Law School Curriculum” (1999) 31 Arizona State Law Journal 957, 965, 971–75, 982–85. But universities are increasingly encouraging “assessment for learning”, including collaborative and peer assessment. See, for example, “Collaborative and Peer Assessment” (TeachWell, 23 March 2023) <https://teachwell.auckland.ac.nz/signature-pedagogies/assessment-for-learning/collaborative-assessment> accessed 1 December 2023. The author acknowledges that omitting a peer rating component shirks the uncomfortable inevitable.

76 See, for example, Lynne Taylor and others, “The Making of Lawyers: Expectations and Experiences of Sixth Year Aotearoa/New Zealand Law Students and Recent Law Graduates” (Ako Aotearoa 2020) 68–70. Several of those final year law students and recent law graduates surveyed reflected on law school as “competitive” or “highly competitive”.

77 To be admitted and enrolled as a barrister and solicitor in New Zealand, persons require a legal qualification approved by the New Zealand Council of Legal Education (NZCLE). The Bachelor of Laws (LLB) is currently the only approved legal qualification available at New Zealand law schools. Overseas legal qualifications can be recognised for admission. However, persons with an overseas legal qualification need to demonstrate competence in the required courses for the LLB degree and may be required to sit a New Zealand Law and Practice Examination for one or more required courses. New Zealand Council of Legal Education, Professional Examinations in Law Regulations 2008 (2017); and New Zealand Council of Legal Education, New Zealand Law and Practice Examination: Information for the July 2023 Examinations (2023). See also Lawyers and Conveyancers Act 2006.

78 The LLB degree at the University of Auckland has four stages. In stage one, students study Law and Society, Legal Method and Legal Foundations, as well as electives from another degree programme. In stage two, students study Criminal Law, Public Law, the Law of Torts and the Law of Contract, as well as Legal Research, Writing and Communication. In stage three, students study Land Law, Equity and Jurisprudence, as well as law electives. In stage four, students study Legal Ethics, as well as Advanced Legal Research, Writing and Communication, and law electives. Each stage equates to about a year of full-time study for students studying an LLB degree only. For conjoint students, stage two is usually spread over two years to allow students to progress their other degree simultaneously. Students normally take electives in their third or fourth year of a pure law degree or fourth or fifth year of a conjoint degree. In a conjoint degree at the University of Auckland, students study for two bachelor’s degrees at the same time but take fewer courses in each degree than students studying those bachelor’s degrees separately. The LLB degree on its own is a 480-point bachelor’s degree, which can be completed in four years. By contrast, BA and BCom bachelor’s degrees on their own are 360 points, which can be completed in three years. BA/LLB and BCom/LLB conjoint degrees are each 675 points, and can be completed in five years. The entry requirements for a conjoint degree are higher than those for a single bachelor’s degree.

79 It is more efficient to eyeball the number of pages and the font sizes than to calculate word counts, particularly because sources such as PDFs are not always machine-readable.

80 So, if the assessment required students to author a 10-MCQ quiz, the reading options would be approximately 25 pages, with some flexibility – for example, to ensure natural starts and ends to sources (such as at headings) and depending on content density and font sizes. The 2.5 times rule seemed to give students enough material (about 2.5 pages) per MCQ, while ensuring that most of the reading option was tested in the quiz, which would enhance the quiz as a learning resource on those readings (when made available for other students to take in that course and future iterations of the course).
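As a rough restatement of the 2.5 times rule described in the previous footnote (an illustration only, not a prescription drawn from the study), the reading length can be estimated from the number of MCQs as

\[ \text{reading length (pages)} \approx 2.5 \times n_{\text{MCQ}} \]

so, for example, a 10-MCQ quiz corresponds to a reading option of approximately 25 pages, subject to the flexibility noted above for content density, font sizes and natural source boundaries.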

81 A part-composition or composite-composition option may be complemented by one or more other part-composition or composite-composition options covering the remainder of the incomplete source or sources, to enable the generation of a quiz spanning entire sources. The sources in a multiple-reading option should hang together so that each student has a coherent quiz-authoring experience.

82 Footnotes 83–95 provide an example option for each category. These examples are from C1(1.00) Contemporary Issues in Land Law.

83 See, for example, Nathan v Dollars & Sense Ltd [2008] NZSC 20, [2008] 2 NZLR 557.

84 See, for example, Jayden Houghton, “Immediate Indefeasibility with Transactional Uncertainty” (2018) 28 New Zealand Universities Law Review 261.

85 See, for example, Jones v Morgan [2001] EWCA Civ 995 [1]–[74].

86 See, for example, Peter Birks, “Before We Begin: Five Keys to Land Law” in Susan Bright and John Dewar (eds), Land Law: Themes and Perspectives (Oxford University Press 1998) 457, 457–83.

87 See, for example, Gibbs v Messer [1891] AC 248 (PC); and Frazer v Walker [1967] NZLR 1069 (PC).

88 See, for example, Don McMorland, “Land Covenants” (2015) 16 BCB 158; Don McMorland, “Freehold Covenants” (2019) 18 BCB 121; Don McMorland, “Freehold Covenants” (2019) 18 BCB 135; Don McMorland, “Green Growth in a Wider Context” [2019] New Zealand Law Journal 168; Don McMorland, “Recreation Easements” (2019) 18 BCB 219; and Don McMorland, “The Legal Nature of Easements” (2020) 19 BCB 11.

89 See, for example, Downsview Nominees Ltd v First City Corp Ltd [1993] 1 NZLR 513 (PC); and Peter Devonshire, “The Mortgagee’s Duty on Exercising the Power of Sale” (1993) 6 BCB 161.

90 See, for example, Mabo v Queensland (No 2) (1992) 175 CLR 1, (1992) 107 ALR 1 (HCA) 1–20; and Attorney-General v Ngati Apa [2003] 3 NZLR 643 (CA) [183]–[216].

91 See, for example, Andrew Erueti, “Māori Customary Law and Land Tenure: An Analysis” in Richard Boast and others (eds), Māori Land Law (2nd edn, LexisNexis 2004) 41, 41–55; and John Locke, “The Second Treatise: An Essay Concerning the True Original, Extent, and End of Civil Government” in Ian Shapiro (ed), Two Treatises of Government and A Letter Concerning Toleration (Yale University Press 2003) 100, [26]–[45].

92 See, for example, Westpac New Zealand Ltd v Clark [2009] NZSC 73, [2010] 1 NZLR 82 at 82, [1]–[53]; and Katherine Sanders, “Land Law” [2012] New Zealand Law Review 545, 545–53.

93 See, for example, Bernstein of Leigh (Baron) v Skyviews & General Ltd [1978] 1 QB 479; and Bocardo SA v Star Energy UK Onshore Ltd [2010] UKSC 35, [2010] 3 WLR 654, 654–67.

94 See, for example, David Grinlinton, “The Registrar’s Powers in the Digital Age” in David Grinlinton and Rod Thomas (eds), Land Registration and Title Security in the Digital Age: New Horizons for Torrens (Routledge 2020) 148; and David Grinlinton, “The Registrar’s Powers of Correction” in David Grinlinton (ed), Torrens in the Twenty-first Century (LexisNexis 2003) 217, 217–27.

95 See, for example, Land Transfer Act 2017, ss 51–57; Jayden Houghton, “Land Transfer Act 2017” (2018) 24 Auckland University Law Review 317, 317–31; and Mau Whenua Inc v Shelly Bay Investments Ltd [2019] NZHC 3222, (2019) 20 NZCPR 923.

96 The C1(1.00) assessment schedule was: 10-question quiz design (10%); 750-word essay plan 1 (5%); 750-word peer review 1 (5%); mid-course test (10%); 750-word essay plan 2 (5%); 750-word peer review 2 (5%); and 4000-word final essay (60%). All assessments were mandatory.

97 The C2(0.67) assessment schedule was: 10-question quiz design (10%); online discussion contributions (15%); online discussion peer reviews (5%); and 4000-word take-home exam (70%). All assessments were mandatory.

98 The C3(0.33) assessment schedule was: 5-question quiz design (10%); online pre-lecture activities (10%); online discussion contributions (15%); online discussion peer reviews (5%); and 3500-word final essay (60%). All assessments were mandatory.

99 See n 1.

100 Compare Grainger and others (n 9) 4–6; and Kurtz and others (n 12) 7.

101 See Peter Roberts, “The Future of the University: Reflections from New Zealand” (1999) 45 International Review of Education 65. See also Rajani Naidoo and Geoff Whitty, “Students as Consumers: Commodifying or Democratising Learning?” (2013) 2 International Journal of Chinese Education 212.

102 The study was approved by the University of Auckland Human Participants Ethics Committee on 24 March 2021 for three years (reference number 22165).

103 Compare Yeong, Chin and Tan (n 38) 98.

104 See generally Grainger and others (n 9).

105 ibid 6.

106 See Papinczak and others (n 14); and Kurtz and others (n 12).

107 The markedly lower percentage for C1(1.00) could be explained by that course’s focus on helping students develop a quality research essay. Students were asked to write an essay plan related to module one and receive feedforward from the teacher and their peers, do the same for module two, and then develop one of the two essay plans into their final essay.

108 See Guilding and others (n 44); Papinczak and others (n 14); and Kurtz and others (n 12) 6.

109 Students were asked to type the answer in an open-ended box rather than respond to a Likert scale. Some students (13.3%, 20.8% and 2.8%) did not respond to the question. The percentages in this paragraph are percentages of the total number of students in the course who answered the question.

110 Students were asked to type the answer in an open-ended box rather than respond to a Likert scale. Some students (33.3%, 25.0% and 30.6%) did not answer this follow-on question. The percentages in this paragraph are percentages of the total number of students in the course who answered the question.

111 See Kurtz and others (n 12) 6.

112 See Luxton-Reilly and Denny (n 11) 151.

113 See above n 79.

114 Compare Doyle and Buckley (n 13).

115 Compare Kurtz and others (n 12): students completed two online modules before undertaking the MCQ quiz assessment.

116 Students were asked to type the answer in an open-ended box rather than respond to a Likert scale. Many students (46.7%, 29.2% and 36.1%) did not respond to the question. The percentages in this paragraph are percentages of the total number of students in the course who answered the question.

117 Compare Falkner and Falkner (n 15) 416.

118 At least, that is what these students believed. The author can confirm that most quizzes were of excellent quality; in any case, the author either improved or did not use the lower-quality quizzes, so only worthwhile quizzes would be available for students to take. Nevertheless, these comments emphasise the importance of making it clear to students that the quizzes may be made available to other students, which may trigger a sense of community responsibility and motivate students to author worthwhile quizzes.

119 Compare Draper (n 6) 289.

120 Compare Guilding and others (n 44) 7: students welcome diversity in assessment forms.

121 Lee S Shulman, “Signature Pedagogies in the Professions” (2005) 134(3) Daedalus 52.

122 Compare Guilding and others (n 44) 9: despite the activity being optional, many students still participated in MCQ authoring.

123 See Bottomley and Denny (n 8) 356; and Guilding and others (n 44) 7.