Multiple Choice Questions


Sample Cases from Dublin Institute of Technology

See some sample cases for this assessment method in the table below, or browse the full set of cases in the assessment toolkit.

Assessment Method         | Lecturer         | Associated Programme(s)                                                                                                                           | NFQ Level      | Year
Fortnightly Quiz          | Patricia O'Byrne | BSc Computing; BSc Computer Science                                                                                                               | Level 8        | Years 2-3
Multiple Choice Questions | Art Sloan        | BSc Computer Science                                                                                                                              | Level 8        | Year 1
Online Quiz               | Thomas Freir     | BTech Networking Technologies; BEngTech Sustainable Design in Electrical Services Engineering; Higher Certificate Electrical Services Engineering | Levels 6, 7, 8 | Years 1-3
Online Quizzes            | David Dorran     | BE Electrical / Electronic Engineering                                                                                                            | Level 8        | Years 2, 3



Description

MCQs usually consist of short questions, or stems, with a limited number of possible answers. To complete the assessment, students need only identify the answer they believe to be most suitable. At the end of the quiz, feedback should be provided explaining why the correct answer (the key) is the best choice, and why the incorrect answers (the distracters) are less suitable. Questions must be written unambiguously, and the distracters must be reasonably plausible, to prevent students from artificially arriving at the correct answer because it is the only plausible option. Writing this feedback is an important and time-consuming task.
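
As a concrete illustration, the sketch below shows one simple way such an item could be represented; the MCQItem class and the sample question are hypothetical and not tied to any particular VLE's format.

    # A minimal sketch of an MCQ item: stem, key, distracters and per-option
    # feedback. MCQItem and the sample question are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class MCQItem:
        stem: str          # the question text
        key: str           # the single correct answer
        distracters: list  # plausible incorrect answers
        feedback: dict     # explanation shown for each option

    item = MCQItem(
        stem="Which of these is a prime number?",
        key="7",
        distracters=["4", "6", "9"],
        feedback={
            "7": "Correct: 7 is divisible only by 1 and itself.",
            "4": "4 = 2 x 2, so it is not prime.",
            "6": "6 = 2 x 3, so it is not prime.",
            "9": "9 = 3 x 3; treating all odd numbers as prime is a common slip.",
        },
    )

Note how the ‘9’ distracter targets a common misconception, in line with the design advice in the challenges section below.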

Uses of MCQs / objective tests

Although generating question banks (with feedback) to accompany a module is time-consuming, once completed, the questions can be used for a number of purposes:

  • after a topic, to allow students to measure their understanding
  • before a topic, to prepare students’ minds
  • as part of a formal assessment strategy.

Smaller ‘low-stakes’ quizzes can be used formatively throughout the module to assess individual learning outcomes, and the same questions can be reused at the end of the module in a summative assessment covering many learning outcomes at once.

Advantages of (online) MCQs

  1. Time can be spent designing assessments rather than marking them
  2. Relative simplicity of inputting questions
  3. Computerised objective marking
  4. Provision of immediate grade to students automatically
  5. Provision of immediate feedback to students automatically
  6. Students can be given many opportunities to repeat a formative assessment, or take the assessment when they feel ‘ready’.
  7. Possibility of using textbook question banks from commercial publishers as part of textbook adoption agreements.
  8. Statistics generated can be useful for determining individual and whole-class achievement
  9. Helps students acquire IT skills (which may be important for mature students early in their studies)

Challenges of (online) MCQs

  • An element of chance can allow students to arrive at the correct answers and pass the assessment
    • Care must be taken in the design of questions and of high-quality distracters. Common misconceptions and errors are a useful source of distracters.
    • Using 4 or 5 possible answers reduces the element of chance to 25% or 20% respectively. Any more than this will obviously reduce the chance of artificially passing further; however, it requires finding even more plausible distracters, which can be difficult. This approach also increases reading time and may be overly confusing for students. It is advised not to have more than 6 answers.
    • Negative marking can be used, whereby students receive a mark for correct answers but lose a mark (or part of a mark) for incorrect answers. This reduces the likelihood of ‘wild guessing’; however, it also makes students less likely to answer even when they are nearly certain. Negative marking should be carefully considered, as it can be seen as somewhat harsh and intimidating. The guessing arithmetic behind these points is sketched just after this list.
  • More discursive and subjective disciplines are more difficult to assess; however, there is a body of research suggesting that MCQs can test deeper understanding when used correctly (e.g. Beevers and Paterson, 2003)
  • Students require IT access and a certain level of IT skills. It is useful to set up a mock student account to trial the assessment from a basic computer.
  • Invigilating the assessment can be challenging and resource-intensive
  • Technical advice, training and support are required for tutors and students prior to the assessment
  • Technical support is required during high-stakes invigilated assessments in case of equipment or IT failure
  • Designing questions, and especially the incorrect answers, can be difficult
    • Experience and previous assessments / exams can be a good source of incorrect answers that proved plausible to students.
  • Care must be taken to ensure the assessment matches the learning outcomes
    • This is especially important if using commercial or shared question banks
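
To make the element-of-chance and negative-marking points above concrete, the short sketch below works through the expected score of a blind guess on a 1-mark question; the function and penalty values are illustrative assumptions, not a prescribed marking scheme.

    # Expected marks from one blind guess: with n options a guess succeeds
    # with probability 1/n (25% for 4 options, 20% for 5). A penalty of
    # 1/(n - 1) marks per wrong answer makes a blind guess worth zero.
    def expected_guess_score(n_options: int, penalty: float) -> float:
        p_correct = 1 / n_options
        return p_correct - (1 - p_correct) * penalty

    for n in (4, 5):
        print(n, expected_guess_score(n, 0.0), expected_guess_score(n, 1 / (n - 1)))
    # prints approximately: 4 0.25 0.0  and  5 0.2 0.0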

Setting up the assessment on a VLE

Questions can be written using the software available in the VLE (e.g. Blackboard, Webcourses). For short ‘low-stakes’ formative assessments, all students typically receive the same questions. While these can be invigilated, students are often allowed to access them over a longer period and from an unsupervised computer. They may also be given a number of attempts to reach a certain threshold in the assessment.

However, if MCQs are used for a summative or ‘high-stakes’ assessment, it can be useful to run it as a supervised assessment, more like a traditional examination. For summative assessments, questions in the question bank can be grouped by topic, and the assessment can be set up to randomly pick a set number of questions from each topic, so each student receives a slightly different assessment. In addition, the order of answers can be randomised, so that the correct answer may appear, for example, as the first choice for one student but the third for another. By reducing the potential for cheating, this can make invigilation arrangements more straightforward when using a computer room with machines close to each other. For additional security, and to prevent outside access by students, the assessment can be password protected, with the password only given out in the examination room. Furthermore, the computers can be set up to ‘lock down’ the internet browser, preventing students from accessing the internet or other applications during the assessment.
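
The per-topic random selection and answer shuffling described above can be pictured with the sketch below; the question bank contents and the two-questions-per-topic count are illustrative assumptions, not any VLE's actual interface.

    # Illustrative sketch: draw a fixed number of questions from each topic
    # and shuffle answer options, so each student sees a slightly different
    # paper with the correct answer in a different position.
    import random

    question_bank = {
        "Topic A": ["A1", "A2", "A3", "A4"],
        "Topic B": ["B1", "B2", "B3", "B4"],
    }

    def build_paper(bank, per_topic=2):
        paper = []
        for topic, questions in bank.items():
            paper.extend(random.sample(questions, per_topic))
        return paper

    def shuffle_options(key, distracters):
        options = [key] + list(distracters)
        random.shuffle(options)  # the key lands in a random slot per student
        return options

    print(build_paper(question_bank))
    print(shuffle_options("7", ["4", "6", "9"]))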

Resources

The following links provide useful resources for computer assisted assessment:

For general guidelines on MCQ assessment:

  • Pritchett, N. ‘Effective Question Design’ in Brown et al. (1999), Computer Assisted Assessment in Higher Education
  • Freeman, R. and Lewis, R., Planning and Implementing Assessment, Chapter 14: ‘Objective Testing’

For examples of assessing higher order skills (e.g. thinking critically and making judgements):

  • Dunn et al. (2004), The Student Assessment Handbook, Chapter 17.

References (all available in DIT library)

  • Beevers, C., & Paterson, J. S. (2003). Automatic assessment of problem solving skills in mathematics. Active Learning in Higher Education, 4, 127-145.
  • Chesney, S., & Ginty, A. ‘Computer Aided Assessment’, in Bloxham, S., & Boyd, P., Developing Effective Assessment in Higher Education: A Practical Guide.
  • Dunn, L., Morgan, C., O'Reilly, M., & Parry, S. (2004). The Student Assessment Handbook: New Directions on Traditional and Online Assessment. New York: RoutledgeFalmer.
  • Freeman, R., & Lewis, R. (1999). Planning and Implementing Assessment. London, UK: Kogan Page.
  • Pritchett, N. (1999). ‘Effective Question Design’, in Brown, S., Bull, J., & Race, P. (Eds.), Computer Assisted Assessment in Higher Education. London: Kogan Page.
