Authors: Jason W. Morphew, Gary E. Gladding, and Jose P. Mestre
First author’s institution: Purdue University
Journal: Physical Review Physics Education Research, 16, 010104 (2020)
Solving problems is a key part of learning physics. However, physics problems are often difficult for novice learners, who can easily get stuck. One method instructors can use to help students is to create solution videos: the instructor records themselves solving problems, explaining their work as they go. Students can then watch these videos at their convenience, which can be especially useful if they are not able to regularly attend office hours.
While research has shown that worked examples and solution videos can be very useful for student learning, less work has examined how the format of the videos affects student learning and students’ perceptions of their own abilities. For example, perhaps solution videos give students a false sense of confidence in their problem-solving abilities. Through a series of three studies over multiple semesters, today’s paper explores these questions.
Study 1
In this first study, the authors of today’s paper wanted to see how the format of the solution videos affected student learning. 93 students from an introductory, calculus-based physics course volunteered to participate and were split into two groups. Each group began by taking a 5-item pre-test of quantitative problems on conservation of energy, center of mass, and conservation of momentum. The students worked out each solution and then entered their answer into a computer, receiving feedback on whether it was correct. Each group then watched a series of animated, narrated solution videos covering the 5 problems on the pre-test.
The first group watched solution videos in what the authors referred to as the “self-reflection style.” The instructor narrated their thought process as they constructed a solution from “first principles” such as the law of conservation of energy, stating each principle before the relevant equation appeared on the screen. The equations remained on the screen throughout the entire video so that students wouldn’t need to rewind to see previous steps, reducing their cognitive load. In addition, not all of the algebraic steps were shown; students only saw the final manipulated equation at each step (Figure 1).
The second group, on the other hand, watched videos in what the authors referred to as the “two-column style.” In this approach, the video begins with an animation of the physics scenario followed by a narration of the relevant physics concepts and principles. The left side of the screen lists the principles while the right side shows the equations, and all of the algebraic steps are carried out explicitly (Figure 2).
After watching their videos, the two groups of students took the same post-test, which consisted of 5 quantitative problems similar to those on the pre-test and 13 multiple-choice conceptual problems. The following week, the students returned and took the same post-test again, though the problems were presented in a new order.
Finally, on each test, students were asked to rate how confident they were in their answer on a scale of 0%, 25%, 50%, 75%, or 100%. The authors then used these confidence ratings to determine whether students were over- or under-predicting their actual performance.
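The exact calibration measure isn’t reproduced here, but a simple way to think about over- or under-prediction (and presumably what is meant) is to compare a student’s average stated confidence with their actual score:

\[ \text{bias} = \overline{\text{confidence}} - \text{score} \]

For instance, a student who reports 75% confidence on each of the 5 quantitative problems but answers 3 of them correctly (60%) would have a bias of +15 percentage points, indicating overconfidence; a negative bias would indicate underconfidence.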
When the authors looked at the data, they found no statistical difference between the two groups on either of the two post-tests. Both groups improved on their pre-test scores by an average of about one and a half additional quantitative problems (from a pre-test average of 35% to a post-test average of 65%; Figure 3).
When it came to confidence in their answers, the authors found that students assessed their ability to answer the quantitative problems more accurately after watching the videos, with no difference based on which style of video they watched. However, both groups of students overestimated their scores on the conceptual questions by 15-20%.
Study 2
While the first study suggests that the details of the solution video don’t seem to matter (such as working out the algebra vs. just showing the result), it did not establish that the solution videos actually caused the students to do better on the post-test than on the pre-test. For example, what if receiving feedback on the pre-test answers contributed to the higher post-test scores and more accurate predictions?
To see if that was the case, the authors recruited 70 students for a follow-up study. The students were again split into two groups: one group took the pre-test and received feedback, while the other did not take the pre-test at all. Both groups then watched a series of solution videos for the pre-test problems (all in the two-column style) and afterward completed a single post-test (9 quantitative problems and 8 multiple-choice conceptual problems), as well as a prediction of how well they had done. Unlike the first study, this study addressed the traditionally more challenging topics of rotational motion, angular kinematics, and angular momentum.
For the students who completed both the pre- and post-test, improvements were similar to those in the first study. However, students from both groups performed equally well on the post-test, suggesting that the solution videos, rather than the pre-test feedback, were responsible for the improvement. As in the first study, both groups overestimated their ability to answer the conceptual questions by about 20%. Interestingly, only the students who had taken the pre-test significantly overestimated their performance on the quantitative questions; students who did not take the pre-test estimated their post-test performance within a few percent.
Study 3
Since the students in study 2 did worse on the post-test than students in study 1 did, the authors wondered if the difficulty of the concepts affected the results. Perhaps the problems were simply too difficult, so students who took the pre-test couldn’t make progress on them. If the concepts were easier, as in study 1, maybe attempting the problems beforehand would offer a benefit.
The authors repeated study 2 during a later semester with a new set of 50 students. The study design was identical to study 2 except for the number of questions on the post-test and the concepts covered.
When looking at the results, the authors found that students who took the pre-test first did do slightly better on the quantitative problems, but the difference was not statistically significant. However, students in both groups were more accurate in predicting their scores than in study 2, overestimating by only a few percent.
Takeaways
Overall, the solution videos in these studies did improve students’ problem-solving abilities, and the presentation style did not matter. While attempting a problem before watching the video may offer benefits for problems near the student’s current ability, this was not the case for more difficult problems and may cause students to overestimate their abilities.
Across all three studies, students tended to overestimate their problem-solving abilities. Since instructors often create solution videos as exam review, students watching the videos may think they are more prepared than they actually are. This is especially true for conceptual problems, where students significantly overestimated their abilities. The authors suggest that instructors also include conceptual problems in solution videos to combat this.
If you are interested in seeing the solution videos, they are available at this Dropbox link.
Figures used under CC-BY 4.0. Header image from Khan Academy used under CC BY NC SA 2.0.
I am a postdoc in education data science at the University of Michigan and the founder of PERbites. I’m interested in applying data science techniques to analyze educational datasets and improve higher education for all students.