Helping students learn from online self-paced tools

Title: Holistic Framework to help students learn effectively from research-validated self-paced learning tools

Authors: Emily Marshman, Seth DeVore, Chandralekha Singh

First author’s institution: Community College of Allegheny County

Journal: Physical Review Physics Education Research 16, 020108 (2020)


With many universities moving to hybrid or asynchronous classes for the fall semester, students will be spending more time than before learning outside of the classroom. A natural concern is whether students will be learning as much as before.

While the researchers conducted today’s study in pre-pandemic times, the conclusion is just as relevant today: students might not be using online learning tools effectively to gain the intended benefits. Yet, it isn’t a hopeless situation. The researchers developed a framework for instructors to help students achieve the desired results.

To reach their conclusion, the researchers studied the effects of a series of online tutorials they had created. After the relevant instruction during their introductory physics course, 385 health profession majors had access to a series of online tutorials to help them better understand the concepts presented in class. The tutorials were designed to help students work through a typical physics problem they might see on an exam.

For example, when solving a mechanics problem (figure 1), the tutorial walked students through each step of the solution and asked a series of multiple-choice questions (figure 2) along the way. These questions helped the student reach the solution and provided feedback on each step, whether the student answered correctly or incorrectly.

Figure 1: One of the problems from a mechanics tutorial. Three masses are connected by strings, a force is applied to one of the blocks, and students must find the acceleration of the middle block and the tensions in the strings. (Fig 1 in paper)
Figure 2: One of the multiple choice questions the student sees when trying to answer the overall question (left panel). The middle panel shows an example of the feedback when the student answers incorrectly and the right panel shows an example of the feedback when a student answers correctly. (Fig 2 in paper).
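To make the kind of reasoning the tutorial steps through concrete, here is a minimal sketch of the Newton's-second-law solution to a problem like the one in figure 1. The masses, the applied force, which block is pulled, and the frictionless surface are all my own assumptions for illustration; the actual numbers in the tutorial problem aren't given here.

```python
# Sketch of the reasoning behind a figure 1-style problem (illustrative values only).
# Assumes three blocks on a frictionless surface, connected by light strings,
# with a horizontal force F applied to the last block (m3).

m1, m2, m3 = 1.0, 2.0, 3.0   # kg (hypothetical masses)
F = 12.0                     # N (hypothetical applied force)

# All three blocks move together, so treating them as one system gives the
# acceleration, which is also the acceleration of the middle block.
a = F / (m1 + m2 + m3)

# The string between m1 and m2 only has to accelerate m1.
T12 = m1 * a
# The string between m2 and m3 has to accelerate both m1 and m2.
T23 = (m1 + m2) * a

print(f"a = {a:.1f} m/s^2, T12 = {T12:.1f} N, T23 = {T23:.1f} N")
```

With these hypothetical values, the sketch gives a = 2.0 m/s², T12 = 2.0 N, and T23 = 6.0 N; the tutorial's multiple-choice questions walk students through steps like these one at a time.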

During the next recitation period, all the students took a two-question quiz. The first question, which the researchers called the pre-quiz, was identical to one of the problems from the tutorial. To measure the benefit of the tutorial compared to typical scaffolding (such as breaking the problem into a series of subproblems on the quiz), students were randomly assigned to either a scaffolded version or the regular version of the pre-quiz problem. The students indicated on the pre-quiz whether they had used the tutorials, since the researchers didn't know ahead of time whether a given student had tried the tutorial.

After the students completed their pre-quiz problem, they all completed the same non-scaffolded problem, which was conceptually similar to the first problem but had not appeared in the tutorial. Thus, it was a transfer problem to see if the students could apply the same concepts to a new problem.

As we might expect, those who did the tutorial answered the pre-quiz problem correctly more often than students who hadn’t done the tutorial (figure 3). After all, they had already solved the problem when they did the tutorial.

Figure 3: Comparison of correct response rates for each topic on the pre-quiz problem and the transfer (paired) problem. Scores are split by scaffolded or unscaffolded and colored by tutorial or non-tutorial. (Fig 4 in paper)

Likewise, those who solved a scaffolded pre-quiz problem answered it correctly more often than those who solved the regular version. Students who had completed the tutorial and solved a scaffolded pre-quiz problem performed best.

Yet, when it came to the transfer problem, the benefits didn't combine. That is, students who did the tutorial and solved the scaffolded pre-quiz problem answered the transfer problem correctly at a similar rate to students who had only solved the scaffolded pre-quiz problem or only completed the tutorial.

Now, you might be wondering, what's going on here? The tutorials offered the same level of scaffolding as the pre-quiz problem plus feedback, so why didn't these students perform best on the transfer problem?

To answer that, the researchers looked at their preliminary work. Before using the tutorials in the course, the researchers tested them with a group of 22 students during a previous semester. These 22 students, who were representative of students in the overall course, answered the transfer question correctly 80% of the time. Yet, students in the course who used the tutorial answered the transfer question correctly only 50% of the time. The only difference was that the researchers observed the 22 students while they worked through the tutorial and had them talk through what they were doing (though the researchers did not respond or provide feedback on what the students were saying).

From this preliminary work, the authors concluded that the students who used the tutorials on their own were not using them effectively. Maybe they were just randomly clicking through them or guessing on each of the questions instead of trying the problem first, as the tutorial instructed. Perhaps the students didn't see what they were supposed to learn from the tutorials.

In response, the researchers created a framework for understanding these out-of-class resources.

The framework (shown in figure 4) lets the instructor think about the resource at both the tool level and the user level, and at an internal and an external level. The internal level consists of how the tool is designed (that is, how the researchers created their tutorial) and what students know or believe before using the tool. The instructor might not have much control over these, especially if they are using a tool designed by someone else.

Figure 4: Framework for promoting outside the classroom learning. (Fig 5 in paper)

However, the instructor can have a large influence on the external factors, such as how students view the tool or why students would want to use it. Therefore, the researchers created a list of recommendations for instructors to help their students use the tools effectively:

  • Make sure to explain why students should use the tool and take it seriously. Will it help them do better on a quiz or exam? Is there participation credit or a grade attached?
  • Consider accountability measures for the tool. Will students be using what they learned from the tool to solve problems in a group during recitation? Students might be more likely to take the tool seriously if they know their group members are expecting them to have used the tool.
  • Make sure students believe they can learn the material. Physics has a reputation for requiring brilliance to be successful. If students believe that, why would they try? Instead, emphasize that, like learning any other skill, learning physics requires practice rather than being “smart.”

In addition, students may be having a hard time balancing their other responsibilities (who isn't right now?) and may not be sure when they have time to use the tool. Including an estimate of how much time the activity will take and suggesting that students schedule a time to do it might be helpful.

So while we might not be able to ensure that our students get the intended learning from these tools, our actions play a role in giving them the best chance to do so.

Figures used under CC BY 4.0. Header image by StartupStockPhotos from Pixabay
