Something about Rowan University engineering professors and students just seems to click. Literally.
Working on the National Science Foundation-funded project "Adaptation of Concept Inventories for Rapid Feedback and Peer-Assisted Learning in Core Engineering Courses," a team headed by Dr. John Chen, chair of Mechanical Engineering, has been exploring the effectiveness of devices similar in purpose to clickers as classroom tools.
What they determined is that the rapid feedback these electronic handheld devices provide does facilitate learning.
Popular from K-12 to higher education, clickers communicate wirelessly with an instructor's laptop computer, which is used to project a multiple-choice question or quiz to a class. The computer tallies responses and provides a nearly instantaneous report to the professor in the form of a statistical graph.
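To make the mechanics concrete, here is a minimal sketch in Python of the kind of tallying and charting such a system performs; the response values and the use of matplotlib are illustrative assumptions, not details of any particular clicker product.

```python
from collections import Counter

import matplotlib.pyplot as plt

# Hypothetical answers collected wirelessly from a class,
# one multiple-choice response ("A"-"D") per student.
responses = ["A", "C", "C", "B", "C", "D", "C", "A", "C", "B"]

# Tally the responses and show them as the near-instant
# statistical graph the instructor sees.
tally = Counter(responses)
choices = sorted(tally)
counts = [tally[choice] for choice in choices]

plt.bar(choices, counts)
plt.xlabel("Answer choice")
plt.ylabel("Number of responses")
plt.title(f"Class responses (n = {len(responses)})")
plt.show()
```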
"Clickers, along with helping make the classroom even more interactive, provide students and their teacher or professor with immediate feedback on their state of learning," said Chen, who worked on the study with mathematics associate professor Dr. Dexter Whittinghill and mechanical engineering associate professor Dr. Jennifer Kadlowec. "Providing rapid response to students improves their learning."
But, according to Chen, the evidence for that claim had been anecdotal. His team wanted to conduct the first systematic study of whether clickers and related devices actually improve learning.
Chen and company focused their study on sister devices to the clicker: wireless, networked handheld computers, a.k.a. personal digital assistants (PDAs).
Over two years, they implemented the PDA feedback system in two sections of a lower-level mechanics course (Statics), both taught by Chen to eliminate differences in teaching style and to maintain the same pace and coverage of material. One class used the PDAs, and the other served as a control group, either using flashcards for feedback or receiving no feedback (the two sections switched roles mid-semester).
In class, Chen presented a new topic or concept for no more than 10 to 15 minutes, using lectures, demonstrations or sample problem solutions. He then used his computer to pose a question gauging the students' understanding and projected a set of possible answers to the class. The correct answer was embedded among incorrect ones, known as "distractors," derived from common student mistakes or misunderstandings. Students were given time to reflect and work on the question, discuss it with their peers and then submit an answer through their PDAs. The professor's computer collected the responses, and the tally was displayed nearly instantaneously as a graph to provide feedback to both the students and the professor.
If the responses showed that a high percentage of students did not understand the concept or had not mastered the skill, Chen elaborated on the topic. If they showed that a reasonable number of students understood it, Chen directed the students to take time to explain the concept or skill to each other, then asked them to respond again, either to the same question or to a different question on the same topic. In the final scenario, the responses showed a high percentage of correct answers, indicating that the students understood the topic, and Chen simply continued to the next one. In a typical 75-minute class period, he administered three to five such quizzes.
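A rough sketch of that three-way decision, written in Python, might look like the following; the 30 percent and 80 percent cutoffs are assumptions made for illustration, not thresholds reported by the Rowan team.

```python
def next_step(num_correct: int, num_responses: int) -> str:
    """Choose the instructor's next move from the share of correct answers.
    The 30% and 80% cutoffs are illustrative assumptions."""
    fraction = num_correct / num_responses
    if fraction < 0.3:
        # Few students answered correctly: re-teach the concept.
        return "elaborate on the topic"
    if fraction < 0.8:
        # A reasonable number answered correctly: peer discussion, then re-poll.
        return "have students explain it to each other, then ask again"
    # Most students answered correctly: move on.
    return "continue to the next topic"

# Example: 18 of 30 students answered the quiz question correctly.
print(next_step(18, 30))
```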
Student performance on a quiz at the end of each treatment period provided the data for comparison between the two groups. A general linear statistical model was used to analyze the treatment factor while controlling for other factors, such as the students' grades in prerequisite courses.
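In Python, an analysis along those lines could be sketched with an ordinary least-squares fit of a linear model; the column names, sample values and the statsmodels formula below are assumptions about how such data might be organized, not the team's actual dataset or code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical records: one row per student, with the end-of-treatment
# quiz score, the feedback treatment received, and a prerequisite grade
# used as a control variable.
data = pd.DataFrame({
    "quiz_score": [78, 85, 90, 70, 88, 95, 65, 80],
    "treatment":  ["pda", "pda", "pda", "pda",
                   "none", "none", "none", "none"],
    "prereq_gpa": [2.8, 3.1, 3.6, 2.5, 3.2, 3.8, 2.4, 3.0],
})

# Linear model: quiz score explained by the treatment factor while
# controlling for performance in prerequisite coursework.
model = smf.ols("quiz_score ~ C(treatment) + prereq_gpa", data=data).fit()
print(model.summary())
```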
Findings from 2004, when Chen used either the PDAs or flashcards for feedback, showed no significant difference in student performance between the two sections. "It did not matter how one provided feedback to the students, so long as it was provided," Chen said.
Student survey results indicated that 100 percent of the students felt that either method of feedback was at least "somewhat helpful" to their learning, with a significant preference for the PDAs over the flashcards.
Findings from 2005, when the team compared the PDA-enabled system with receiving no feedback at all, showed a statistically significant, positive effect when students received feedback.
"This was a noteworthy finding that confirms the value of providing rapid feedback to students in order to improve learning," Chen said. "I used collaborative-learning techniques that are established to be beneficial to student learning for all classes. This tells us that there can be learning improvements by providing rapid feedback even when alternative active-learning methods are used."