
This issue of CDTL Brief features three Teaching Enhancement Grant projects by colleagues from various departments and faculties.

April 2009, Vol. 12 No. 1
Effectiveness of the Classroom Response System in Tutorials
Mr Hong Chong Ming, Kenneth and Ms Lam Poh Fong, Lydia
Department of Physics


We gathered quantitative feedback via a survey from students taking two physics general education modules (GEM) on their experience using the Classroom Response System (CRS) in their tutorials. The technology incorporates a handheld wireless device (also known as a clicker) that allows students to respond to multiple choice questions posed by the instructor during class. The responses sent by students are captured and analysed once the session ends. This article examines the effectiveness of such a system in the classroom from students’ perspectives.

Motivation for Using Clickers

It is common for instructors to be greeted with resounding silence after they have posed a question in class; when this happens, little can be gathered about students’ understanding of what is being taught. This may be because students are too shy to speak up in front of their classmates and want to be spared the embarrassment of giving the wrong answer, or they simply may not know the answer.

In the CRS, students respond to the instructor’s questions by pressing a particular button on the clicker that represents their choice. The battery-operated clicker is light, handy and durable. The responses are collected by two wireless infrared receivers mounted on the wall next to the screen. Instructors can time the session according to the level of difficulty of the question posed. At the end of the session, charts showing the percentage of students who selected each choice are displayed, so both instructor and students get to see how everybody responded to the questions.

The CRS allows instructors to investigate students’ understanding of a topic being discussed in class at any time. Instructors can set multiple choice questions for students to answer individually or as a group after discussing amongst themselves. The responses, displayed as a chart of the percentages of the various options, give instructors an indication of the extent of students’ understanding of the subject matter. A low percentage of correct answers indicates a low level of understanding of the topic.
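The tallying step described above can be sketched in a few lines of Python. This is only an illustration of the idea (the names `tally_responses` and the sample data are hypothetical; the actual CRS software is a vendor product, not something instructors implement themselves):

```python
from collections import Counter

def tally_responses(responses, options="ABCDE"):
    """Count clicker responses and return the percentage of the class
    that chose each option, as would appear on the projected chart."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: round(100 * counts[opt] / total, 1) for opt in options}

# Hypothetical example: 40 students answer a four-option question.
responses = ["C"] * 22 + ["A"] * 8 + ["B"] * 6 + ["D"] * 4
chart = tally_responses(responses, options="ABCD")
# chart -> {'A': 20.0, 'B': 15.0, 'C': 55.0, 'D': 10.0}
```

If the correct answer here were C, the 55% correct-response rate would tell the instructor that roughly half the class has grasped the concept, prompting either further discussion or a move to the next topic.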

The Survey

A survey was carried out over a two-week period to find out how students view the CRS. Two sets of data were collected from students taking two different GEMs. In the first module (Group A), students’ participation in class (including answering the clicker questions) was tied to their tutorial grading. In the second module (Group B), there was no such link. A total of 534 students were polled (Group A: 246, Group B: 288).

The survey required students to rate seven statements on a five-point scale: A—Strongly Disagree, B—Disagree, C—Neutral, D—Agree, E—Strongly Agree.
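The figures that follow report, for each group, the share of students agreeing (or disagreeing) with each statement. A minimal sketch of that aggregation in Python, using made-up sample responses (the function name and data are assumptions for illustration, not the authors' actual analysis code):

```python
from collections import Counter

# Map the five survey options onto broad sentiment categories.
SCALE = {"A": "disagree", "B": "disagree", "C": "neutral",
         "D": "agree", "E": "agree"}

def percent_agreeing(responses):
    """Percentage of respondents who chose D (Agree) or E (Strongly Agree)."""
    counts = Counter(SCALE[r] for r in responses)
    return round(100 * counts["agree"] / len(responses), 1)

# Hypothetical responses from one group to one survey statement.
group = ["E"] * 10 + ["D"] * 9 + ["C"] * 3 + ["B"] * 2
result = percent_agreeing(group)  # 79.2
```

Comparing this percentage between Group A and Group B for each question is what reveals the differences discussed below.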

Survey Results: What We Learnt

Figures 1–7 show the responses to the seven questions by both groups of students:

Figure 1. Results for Question 1 (I found the use of clickers distracting and unhelpful.)

91.1% of students did not feel the clickers were distracting or unhelpful. This indicates an overall positive impression of using the CRS.

Figure 2. Results for Question 2 (Learning with clickers improves my understanding of the course content.)

78.9% of Group A felt the CRS improved their understanding of the course content, whereas only 23.9% of Group B students felt the same way. This disparity is likely due largely to the quality of the clicker questions asked and whether they promote a deeper understanding of the subject matter. It strongly suggests that instructors should frame their questions carefully so as not to limit the impact the CRS can have on students’ learning.

Figure 3. Results for Question 3 (Using clickers promotes more focused discussion during class.)

As with the previous question, more students in Group A felt that the CRS promoted a more focused discussion during class compared to students in Group B (73.8% versus 50.9%). It is possible that students in Group B felt that the clicker questions merely checked their understanding and did not actually promote discussion. Thus, instructors should be aware of the need to phrase clicker questions properly so that they lead to more focused class discussions.

Figure 4. Results for Question 4 (Using clickers motivates my participation in class.)

80% of students felt that the CRS motivated their participation in class. This positive experience benefits students as they move from a passive to an active mode of learning.

Figure 5. Results for Question 5 (Using clickers in class is too time-consuming.)

10.2% of students from Group A found using the CRS too time-consuming, compared to 24.5% of Group B. One explanation could be that tutorial grading is not tied to participation for Group B’s students, so their involvement does not bring them any immediate benefit. Moreover, a high percentage (76.1%) of students from Group B did not feel that using clickers improved their understanding of the course content. Hence, to them, using the CRS may simply mean spending more time answering questions.

Figure 6. Results for Question 6 (I had difficulties getting my clicker to work in class.)

On average, about 27.4% of students had difficulty getting their clickers to work. This indicates a need to improve the technology. An additional receiver may need to be installed to collect responses more effectively, especially from students who are not sitting near the existing receivers. Instructors should also watch out for failed clickers and replace them with working ones promptly.

Figure 7. Results for Question 7 (I would like to use clickers in other modules/courses.)

On average, about 70.5% of students liked the idea of using clickers in other courses, which suggests that they have a positive perception of the CRS and would like to benefit from its use elsewhere.


Conclusion

If the CRS is implemented properly, it can be used to measure what students know before instruction is given (pre-assessment) and to test their understanding of what they have learnt (post-assessment). It is also a great tool to facilitate discussion, increase students’ retention of what is being taught and help them confront their mistakes. Depending on how instructors design their questions, the CRS allows greater interaction among students, thereby engaging them in the learning process. It is an effective instructional tool and, given its potential benefits, support for introducing this technology and for training and assisting faculty in its use should be encouraged.
