


In this issue of CDTL Brief on Learning with Technology, the authors discuss how to use some Integrated Virtual Learning Environment (IVLE) features to enhance students’ learning and understanding of the subject.

May 2005, Vol. 8, No. 3
Root Questions for Large Classes
Dr Stephane Bressan
School of Computing, NUS
Professor Jeffrey D. Ullman
Stanford University and Gradiance Corp.

Lecturers teaching large undergraduate classes often face the critical problem of providing continuous training and assessment efficiently and effectively, while undergraduate students in the School of Computing at NUS struggle with a busy timetable, juggling numerous deadlines for homework, assignments and other projects. Lecturers therefore need valid and reliable continuous training and assessment strategies that can be marked quickly and provide immediate feedback.

Potentially, some ICT tools can automate part of the training and assessment strategies, make the process efficient for lecturers and offer flexibility to the students. Ang (2004) recognises that the use of ICT tools can help overcome many of the training and assessment problems associated with large classes. Yet Ang (2004) also acknowledges that most existing courseware management systems do not provide effective tools for online continuous training and assessment, especially for large classes.

The Integrated Virtual Learning Environment (IVLE) at NUS is a fully integrated set of high-quality management and communication tools. Features such as discussion forums and feedback surveys are excellent teaching and learning aids, even for large classes. In particular, IVLE includes an online assessment tool with seven generic question types: ‘Multiple Choice’, ‘Multiple Response’, ‘Select List’, ‘True or False’, ‘Fill in the Blank’, ‘Matching’ and ‘Essay’. All but the last can be marked automatically. Unfortunately, most of these question types are better suited to summative assessment than to formative assessment, although there are creative strategies for using them formatively (see, for instance, Zubair & Khoo, 2003).

Online assessment with root questions* was introduced during Semester 1 of Academic Year 2004/2005 for the module CS2102 “Database Systems”. The root questions designed for the module covered topics such as relational calculus, the theory of functional dependencies and the normalisation of relational designs. Since the main objective of the assessment was formative, students were given sufficient time for multiple attempts. Students generally found this form of assessment more flexible than traditional assignments.

A root question is a multiple choice question that has several right answers and many wrong answers. It comprises a stem, a few correct choices, several incorrect choices, a solution and choice explanations. The stem is a statement of the problem presented to the student. The student is asked to identify the set of solutions to the problem from a list comprising a few correct choices and several incorrect choices. The list is generated every time the student attempts to answer the question. The incorrect choices are usually designed to reflect typical mistakes with explanations to help clarify the student’s doubts. Alternatively, the explanations can also be replaced by hints. The solution to the problem with an explanation is presented to the student when the homework deadline is reached.
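The components described above can be sketched as a simple data structure. The following is only an illustration in Python; the internal representations used by IVLE and Gradiance are not public, so every name here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Choice:
    text: str         # the answer option shown to the student
    correct: bool     # whether it belongs to the solution set
    explanation: str  # feedback (or hint) revealed after an attempt

@dataclass
class RootQuestion:
    stem: str          # statement of the problem
    choices: list      # a few correct and several incorrect Choice objects
    solution: str = "" # worked solution, released when the deadline is reached

    def correct_choices(self):
        return [c for c in self.choices if c.correct]

    def incorrect_choices(self):
        return [c for c in self.choices if not c.correct]
```

The key point the structure captures is that explanations are attached to individual choices, not to the question as a whole, so feedback can be targeted at the specific mistake a choice embodies.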

The student sees a different set of choices every time he/she attempts the question. Thanks to the choice explanations and hints, the student quickly realises that the best strategy is to solve the problem given by the stem rather than to guess the correct choice. In addition, he/she can learn from making incorrect choices. Such a learning process transforms a standard multiple choice question into a problem-based learning experience. Thus, root questions not only allow the assigning and grading process to be automated; they also enable a problem-based approach to assessment.
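The regeneration of the choice list on every attempt can be sketched as random sampling from pools of correct and incorrect choices. This is an assumed mechanism for illustration only; the function name and parameters are ours, not those of any actual system.

```python
import random

def draw_choices(correct, incorrect, n_correct=1, n_incorrect=3, rng=None):
    """Build one attempt's choice list: a fresh mix of correct answers
    and distractors, shuffled so that position carries no information."""
    rng = rng or random.Random()
    drawn = rng.sample(correct, n_correct) + rng.sample(incorrect, n_incorrect)
    rng.shuffle(drawn)
    return drawn

# Hypothetical pools for a polynomial-integration question
correct_pool = ["4x^5", "3x^4", "10x^3"]
incorrect_pool = ["20x^5", "12x^4", "30x^3", "5x^5", "4x^4", "15x^3"]
```

Because each call samples afresh, two attempts rarely present the same list, which is precisely what pushes the student to solve the stem rather than memorise which option was right last time.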

To illustrate the notion of a root question, let us consider a simple example from integral calculus. Our goal is to make sure students understand the rule for integrating polynomials (e.g. ∫x^n dx = x^(n+1) / (n + 1)). A conventional problem would be something like:

Compute the indefinite integral of 20x^4 + 12x^3 + 30x^2.

To turn this question into a root question, we need to observe that there are three terms to this polynomial and so there are three natural components that lead to three correct choices. If we wanted more choices, we could add more terms to the polynomial. Thus, we can phrase the root question as:

Compute the indefinite integral of 20x^4 + 12x^3 + 30x^2. Then, identify one of the terms in the integral from the list below.

Since there are three correct choices in this example, 4x^5, 3x^4 and 10x^3, we should develop approximately nine incorrect choices based on common mistakes that students might make, each with an explanation. In this example, one common mistake is to forget to divide by n + 1; this leads to the incorrect choices 20x^5, 12x^4 and 30x^3. Another possible mistake is to divide by n instead of n + 1, so students might choose 5x^5, 4x^4 and 15x^3. We could proceed by theorising about other possible errors, or simply add some random but plausible-looking incorrect choices such as 3x^3, 60x, 80x^5 and 6x^3. We now have three correct and ten incorrect choices, a reasonable pool to combine.
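The arithmetic above can be reproduced mechanically: each family of distractors is a small, deliberate corruption of the correct integration rule. A sketch in Python, representing a term a·x^n as the pair (a, n); the function names are ours and purely illustrative.

```python
def integrate_term(a, n):
    """Correct rule: the integral of a*x^n is (a/(n+1)) * x^(n+1)."""
    return (a / (n + 1), n + 1)

def forget_to_divide(a, n):
    """Common mistake: raise the exponent but keep the coefficient."""
    return (a, n + 1)

def divide_by_n(a, n):
    """Common mistake: divide by n instead of n + 1."""
    return (a / n, n + 1)

polynomial = [(20, 4), (12, 3), (30, 2)]  # 20x^4 + 12x^3 + 30x^2

correct = [integrate_term(a, n) for a, n in polynomial]      # 4x^5, 3x^4, 10x^3
mistakes = ([forget_to_divide(a, n) for a, n in polynomial]  # 20x^5, 12x^4, 30x^3
            + [divide_by_n(a, n) for a, n in polynomial])    # 5x^5, 4x^4, 15x^3
```

Deriving distractors as transformations of the correct rule, rather than inventing them ad hoc, guarantees that each one corresponds to a diagnosable misconception with a natural explanation.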

The explanation will point out the mistake for incorrect choices designed to detect specific errors. For example, the explanation associated with the incorrect choices 20x^5, 12x^4 and 30x^3 might say:

The correct rule for integrating polynomials requires that we divide the term by a constant. Do you remember how that constant is determined?

For the incorrect choices 5x^5, 4x^4 and 15x^3, the explanation might say something similar:

The correct rule for integrating polynomials requires that we divide the term by a constant. However, you may have chosen the wrong constant.

These explanations jog the students’ memory and help those who have already understood the idea but were careless. Despite these explanations, some students might still feel lost. Hence, we might choose to attach to the remaining four incorrect choices more explicit advice such as:

In order to integrate a polynomial, we integrate each term and sum the results. The rule for integrating a term is ∫ax^n dx = ax^(n+1) / (n + 1).

Interestingly, during the assessment period, groups of students spontaneously started discussing the questions on the module’s IVLE discussion forum. Such discussions should be encouraged, as they play a role similar to that of the choice explanations.


Ang, K.K. (2004). ‘Teaching a Very Large Class: What to do? How?’ CDTL Brief, September 2004, Vol. 7, No. 8.

Ullman, J.D. (2005). ‘Gradiance On-Line Accelerated Learning’, in Proceedings Twenty-Eighth Australasian Computer Science Conference (ACSC2005), Newcastle, Australia. CRPIT, Estivill-Castro, V., (Ed.). ACS, Vol. 38, pp. 3–6.

Zubair, A. & Khoo, H.E. (2003). Basics in Medical Education. World Scientific Publishing, Singapore.


*The examples of root questions mentioned in this article are taken from Ullman (2005). Further discussion on the design of root questions, along with more sophisticated examples, is available in Ullman’s paper. The Gradiance system currently offers instructors and their students free online access to a bank of root questions in the areas of databases, compilers, automata and language theory, and operating systems. Instructors can request access to the system and its question bank by emailing the Gradiance team; instructors who wish to contribute root questions in these or other domains can send their contact details to the same address.
