Challenge: Timely Feedback for Every Level of the Education System
Stakeholders: Developers, Corporate Decision Makers, Teachers, Students
Tech Affordances: Immediate feedback from end users at scale; Longitudinal measures
Methods: Controlled trials of new approaches
Author: Barbara Means based on a presentation by Bror Saxberg for the February 12 convening of the Next Generation Learning Challenges Wave 1 Grantees, Austin, TX.
The Challenge: Kaplan Inc. has grown from a small test preparation business to a major provider of online learning. Bror Saxberg, Kaplan’s Chief Learning Officer, points out that most commercial providers of services for K-12 and higher education face a “double-buyer situation” since they need to satisfy two kinds of stakeholders. Ultimately, their services need to fit the needs and the preferences of the learner, but in most cases some other entity such as a school district or a college department actually makes the decision to buy the product or service. In contrast, because most of Kaplan’s sales come from adult students, their learners and their customers are the same. Kaplan finds that their customers care not just about traditional academic measures, such as course grades and credits, but about longer-term practical consequences such as whether or not what they learned online will enable them to get and keep a job.
Saxberg explains that this situation motivates Kaplan to look at multiple outcomes. Among the most important of these is whether or not a student chooses to enroll in the next online course in the sequence. Students who feel they are not learning, or that the online program is not using their time efficiently, will go elsewhere.
Saxberg’s group at Kaplan has been engaged in redesigning some of the company’s materials using principles from learning research to demonstrate the value of these ideas. The group wanted to ascertain whether such redesigned courses produce better outcomes for learners, in order to know whether principles that work in published laboratory settings show benefits in practice, both for learning and for retention (which helps drive business performance). The fact that the courses are online and that the learning management system creates a persistent record for each learner/customer as they work through the courses makes it possible for Saxberg’s group to do both.
The Approach: The group selected several existing courses (on topics such as Nutrition, Interpersonal Communication, and Medical Terminology) to put through a redesign process. The old courses had generally asked students to follow a classic online learning sequence of “Read, Write, Discuss.” Saxberg’s group sought to make the courses more active by changing the design to one characterized by “Prepare, Practice, Perform.” The new courses were built on an open platform based on Moodle. Care was taken to make them easy to use and to give them a clean, simple look with good production values, drawing on research (e.g., as summarized in Clark and Mayer’s E-Learning and the Science of Instruction) on how media, audio, and text best reinforce, rather than distract from, learning. The redesigned versions offered opportunities for students to get help when they needed it, included built-in assessments and quick surveys of self-efficacy and perceived value, and provided much more structured support for faculty as well.
In addition to various outcome metrics, the system tracked how much time students spent on each part of the course, as well as students’ answers to periodic questions about their motivational state during the course. By combining results on learning performance with these survey responses, the system suggested to faculty members which students should be contacted, and supplied an editable e-mail template to help bring those students back on track.
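The early-warning logic described above can be sketched in a few lines. The field names, thresholds, and template wording below are illustrative assumptions, not details of Kaplan’s actual system:

```python
# Hypothetical sketch of early-warning logic: combine assessment
# performance with self-reported motivation to flag students an
# instructor might contact. Thresholds and field names are assumptions.

def flag_students(records, score_cutoff=0.7, motivation_cutoff=3):
    """Return students whose quiz average or self-reported motivation
    (1-5 scale) falls below the cutoffs."""
    flagged = []
    for r in records:
        avg_score = sum(r["quiz_scores"]) / len(r["quiz_scores"])
        if avg_score < score_cutoff or r["motivation"] < motivation_cutoff:
            flagged.append((r["name"], avg_score, r["motivation"]))
    return flagged

# Editable template the instructor can adapt before sending.
EMAIL_TEMPLATE = (
    "Hi {name},\n"
    "I noticed you may be falling behind in the course. "
    "Please reach out if you'd like help getting back on track."
)

records = [
    {"name": "Alice", "quiz_scores": [0.9, 0.85], "motivation": 4},
    {"name": "Bob",   "quiz_scores": [0.5, 0.6],  "motivation": 2},
]

for name, avg, motivation in flag_students(records):
    print(EMAIL_TEMPLATE.format(name=name))
```

In a real system the thresholds would presumably be tuned against historical data rather than fixed by hand.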
Saxberg argues that it is very important to use multiple measures or metrics to track learning outcomes. A single outcome measure can be misleading. For example, grades on final exams may vary as instructors become more lenient in their grading practices over time, or may be contaminated if some of the exam content has been leaked. Focusing on student satisfaction with a course may miss the fact that, although students found a course hard work, they actually learned more. For the redesigned courses, Saxberg’s group looked at instructor satisfaction, student satisfaction, performance on embedded learning assessments, whether the student passed the course, and whether the student was retained until the next semester. (A subset of the assessments, including rubrics, was kept exactly identical between the original and new courses, so that the two could be compared using the same assessment instruments.)
Because Kaplan courses start every month and enroll large numbers of students, Saxberg’s group was able to run parallel sections of the new and old versions of each course during the same calendar periods. They ran the experiment twice, with a total of about a thousand students split approximately evenly between experimental and control groups. Various characteristics of the students and faculty were also available to model differences between the sections.
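With roughly 500 students per arm, differences in a binary outcome such as pass rate can be tested with standard methods. The sketch below uses a two-proportion z-test as one simple way such a comparison could be run; the pass counts are made up for illustration and are not the study’s actual results:

```python
# Illustrative two-proportion z-test on pass rates for redesigned vs.
# original course sections. Counts are hypothetical.
import math

def two_proportion_z(passed_a, n_a, passed_b, n_b):
    """z statistic for the difference between two pass rates."""
    p_a, p_b = passed_a / n_a, passed_b / n_b
    p_pool = (passed_a + passed_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: ~500 students per arm, as in the study's rough split.
z = two_proportion_z(passed_a=410, n_a=500, passed_b=370, n_b=500)
print(f"z = {z:.2f}")  # |z| > 1.96 indicates a difference at the 5% level
```

A fuller analysis would also adjust for the student and faculty characteristics the group had available, e.g. with a logistic regression rather than a raw two-sample test.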
The Results: The group found that instructors preferred the redesigned courses, and that students in the redesigned courses did better on the embedded assessments, spent more time with the course materials, were more likely to pass, and were more likely to take the next course. But when it came to asking students how much they “liked” their new online courses, students taking the old versions of the courses gave more positive ratings. To Saxberg, this last finding points out the need to distinguish between how much students like something, which is often a function of how easy and entertaining it is, and how valuable they think it is to them. In his mind, the fact that students taking the new versions of the courses were more likely to take the next course, spent more time working at the new courses, and seemed to learn more demonstrates that the effort required to redesign a course using learning science principles is worthwhile. Student comments about the new and old courses suggested students recognized the new courses were considerably more work, which may have contributed both to lower “liking” and to greater success. The fact that students persisted at higher rates to the next course in the sequence reinforces the idea that they valued the new courses, even if they did not “like” them as well.
From the increased retention of students in the program, Saxberg can demonstrate the net value of redesigning courses in monetary terms. The effort required an average of around $375,000 to redesign the three courses (each course representing about 150 hours of student work), and produced many times that amount in additional income from the higher percentage of students who chose to take the next course and from the expected increase in program longevity of students who passed the course.
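The structure of this return-on-investment argument can be made concrete with back-of-the-envelope arithmetic. The ~$375,000 redesign figure comes from the text; the enrollment, tuition, and retention-lift numbers below are hypothetical placeholders, not Kaplan’s actual figures:

```python
# Back-of-the-envelope sketch of the retention ROI argument.
redesign_cost = 375_000       # redesign figure cited in the text
students_per_year = 1_000     # assumed enrollment across the courses
tuition_per_course = 1_500    # assumed revenue per follow-on enrollment
retention_lift = 0.10         # assumed extra share taking the next course

# Extra revenue attributable to improved retention, per year.
extra_revenue = students_per_year * retention_lift * tuition_per_course
years_to_break_even = redesign_cost / extra_revenue
print(f"Extra revenue/year: ${extra_revenue:,.0f}; "
      f"break-even in {years_to_break_even:.1f} years")
```

Because the retention gain compounds (retained students may take several further courses), the real payback would likely be faster than this single-course calculation suggests.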
Value of this Approach:
For developers, both academic and affective information collected automatically by the system offered immediate data on how well the new designs were working, and the large database allowed them to compare affective and cognitive outcomes for the new versions of the course to those for older versions, while controlling for variables such as student demographics and instructor characteristics. For corporate decision-makers, the data tracked online provided actionable input to strategic decisions.
For teachers/instructors, the reports based on information collected by the system allowed them to better understand their students’ learning behaviors and to offer better-tailored support. For students, the new program helped them become more engaged and successful learners by providing more opportunities to practice and more structured, systematic feedback on their learning performance.