Karen Ehrhart

Implementing Clickers, CTL Mini-grant Fall 2015

Instructor: Karen Ehrhart, Marketing

Summary: For her two sections of MGT 352, Human Resource Management, Professor Karen Ehrhart had two goals for her students: to increase their engagement in quality class discussions and to increase their learning of course material. She met these goals through the use of clickers, which enabled her students to share relevant personal experiences anonymously and to test their understanding of the material through in-class “Check your knowledge” questions posed at strategic learning points. She found that the clickers not only stimulated greater student participation in class, but also gave her valuable feedback on how well students had grasped the material before she moved on to the next topic. The success of this innovation was reflected in positive student feedback and improved performance on exams.

Final report

What I Did

I had two goals in mind as I integrated clickers into my course: First, I hoped to increase student engagement and the quality of discussion, while giving a voice to students who are less inclined to speak up in class. Second, I wanted to increase student learning.

One way in which I aimed to increase student engagement was by asking “Any experience?” questions. With these questions, I asked students about their past experiences with the material that we discussed in class. In past semesters, I would ask students to raise their hands to indicate their experience with certain topics, but participation tended to be low to moderate. With the clickers, the level of participation was much higher (at least 80% of the students would respond), most likely at least in part because they could provide their answers anonymously.

I used clickers to increase student learning by asking “Check your knowledge” questions. These questions were similar to my exam questions but gave students the opportunity to make sure they understood the material as I covered it. During each class period, I paused at least once or twice to ask these questions and gauge whether students understood the material and were ready to move on to the next topic. I particularly targeted topics on which students’ exam performance had been relatively poor, including questions that required students to do their own calculations and arrive at an answer. These questions gave students a chance to test their understanding and allowed me to confirm that they grasped the material before I moved on.

In addition to using clickers for the “Any experience?” and “Check your knowledge” questions, I also used them to administer in-class quizzes, which comprise approximately 12% of students’ grades in the course. Using clickers for these quizzes allowed me to go entirely paperless. I did allow students to use a piece of paper if they forgot their device or if their device did not have enough battery power, but this happened rarely.

How It Went

I measured the success of the implementation in three ways: midterm feedback from students, end-of-semester course evaluations, and comparisons of student performance.

First, I collected feedback during the semester about students’ experiences with clickers. On these evaluations, I ask students to indicate what is going well and what they like about the class, as well as what suggestions they have for making things better. When I collected midterm evaluations during the Fall 2015 semester, students provided mostly positive feedback with regard to clickers. The most common comments were that the clickers were “good,” that the class was “interactive,” and that students enjoyed having the “Check your knowledge” questions. The only negative comments were that the use of the clickers “makes it a bit easier to cheat” and that it would be nice to see everyone’s responses to the “Any experience?” questions.

Second, on the course evaluation form at the end of the semester, students are asked two open-ended questions: “What was the best thing about this course?” and “What suggestions do you have that would improve this course?” On the positive side, students’ responses included that they “liked” and “loved” the “Check your knowledge” questions. In terms of suggestions, one student suggested allowing more time to complete the quizzes.

Finally, I evaluated the implementation of clickers by comparing students’ performance on quizzes and exams in Fall 2015 (when I used clickers) to their performance in Spring 2015 (the semester before I started using clickers). I particularly expected performance on topics that were discussed more thoroughly in class (e.g., through “Check your knowledge” questions) to be higher than in the previous semester. For 15 of the “Check your knowledge” questions that I administered in class during Fall 2015, I identified exam questions from Spring 2015 that tested the same concept or information. Student performance was better on 10 of the 15 items in Fall 2015 than in Spring 2015.

In addition, I compared students’ performance on the midterm exams that I administered during the Fall 2015 and Spring 2015 semesters. The average score across the 3 midterm exams in Fall 2015 was 79.6%, whereas the average score across the 3 midterm exams in Spring 2015 was 75.9%. Similarly, the average score across the 11 quizzes in Fall 2015 was 88%, whereas the average score across the 7 quizzes in Spring 2015 was 84%.

My overall conclusion is that both student engagement and performance were stronger during the semester that I implemented clickers as compared to the prior semester.

What I Learned

My experience with implementing clickers was positive for the most part, and students seemed to like using the clickers to respond to the “Any experience?” and “Check your knowledge” questions. In addition, using clickers for the in-class quizzes allowed me to upload quiz grades directly into Blackboard, rather than hand-grading paper quizzes and then entering the grades. It only took a few minutes after class to indicate the correct answers to the quiz questions and upload the scores.

Although I liked the clickers and the results of the implementation were generally positive, there are several areas for improvement.

One challenge that I encountered upon implementing clickers involved learning the technology. The program and hardware were not particularly difficult to master, but there was a learning curve, and there were a few idiosyncrasies that I needed to work around:

  • Consistently syncing the roster with Blackboard throughout the semester.
  • Adjusting the default clicker frequency because another class was using the same frequency.
  • Displaying results by clicking into a separate window rather than using the shortcut key.

Another challenge that I experienced pertained to time management during class. Early in the Fall 2015 semester, I realized that adding the clicker-based questions to my classes was taking more time out of the class period than I anticipated. I needed to make sure that I was allowing enough time in the class period for these questions, which meant streamlining my presentation or discussion of certain topics. I was mindful of this as the semester proceeded, and I will continue to adjust the timing of my classes as I continue to use clickers and incorporate clicker-based questions.

Another area for improvement pertains to student performance on particular course topics. Performance on some exam questions was weaker during the semester that I implemented clickers as compared to the prior semester. Going forward, I will pay particular attention to my coverage of those topics during class in order to make sure that students understand the material.

Finally, in future semesters when I use clickers, I will more thoroughly evaluate student reactions to them at the end of the semester. The formal course evaluation mentioned above did not include questions specific to clickers, but rather presented two general open-ended questions about the course as a whole. Although Instructional Technology Services (ITS) at SDSU has a set of survey questions that can be used to collect student feedback about the use of clickers, I did not realize that I needed to ask ITS for the questions. As a result, I did not obtain the questions until after the semester had concluded, and I received them in a PowerPoint file that was not amenable to collecting student feedback remotely. In the future, I will take class time before the end of the semester to administer these questions to students in order to more extensively evaluate their perceptions of clickers.