Interpreting Knewton’s 2017 Student Mastery Results
Andrew D. Jones | March 15, 2018
This post was developed with Illya Bomash, Knewton’s Managing Data Scientist.
Results. Efficacy. Outcomes.
Student success is the ultimate goal of learning technology. Despite this, there exists a startling lack of credible data available to instructors and administrators that speaks to the impact of ed-tech on learning and academic performance.
To give instructors and administrators greater transparency into the effectiveness of alta and the Knewton adaptive technology that powers it, we analyzed the platform results of students using alta. These results represent our effort to validate our measure of mastery (more on that to come) and to document the impact of alta on student achievement.
Here, we provide context and explanation that we hope will leave educators and those in the ed-tech community with a clearer picture of how we arrived at these results, and why they matter.
Our data set
The findings in this report are drawn from the results of 11,586 students who cumulatively completed more than 130,000 assignments and 17,000 quizzes in alta in 2017.
This data set includes all of alta’s 2017 spring and summer student interactions. Only cases in which the relevant calculations are impossible have been excluded — such as quiz scores for a course in which the instructor chose not to administer quizzes. So while these results aren’t from randomized, controlled trials, they do paint an accurate portrait of student performance across alta users, making use of as much of our student data as possible.
Our adaptive technology is based on the premise that if a student masters the concepts tied to the learning objectives of their course, that student will succeed in the course and be prepared to succeed in future courses. It’s also based on the premise that Knewton’s mathematical model of student knowledge states — which we frequently refer to as Knewton’s proficiency model — can determine when a student has reached mastery.
This basis in mastery manifests itself in how students experience alta: Every assignment that a student encounters in alta is tied to learning objectives that have been selected by the instructor for their course. A student “completes” an alta assignment when our proficiency model calculates that a student has mastered all of the learning objectives covered in that assignment.
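The completion rule described above can be sketched in a few lines. This is an illustrative model only: the threshold value, function names, and data structures are our assumptions, not Knewton's actual proficiency model.

```python
# Hypothetical sketch of alta's completion rule: an assignment is "complete"
# once the proficiency model rates every covered learning objective as
# mastered. The threshold here is assumed for illustration.

MASTERY_THRESHOLD = 0.95  # assumed cutoff on the model's proficiency estimate

def assignment_complete(proficiencies, objectives, threshold=MASTERY_THRESHOLD):
    """Return True when every learning objective covered by the assignment
    has an estimated proficiency at or above the mastery threshold."""
    return all(proficiencies.get(lo, 0.0) >= threshold for lo in objectives)

# Example: one objective still below the threshold, so not yet complete
estimates = {"solve-linear-eq": 0.97, "graph-linear-eq": 0.88}
print(assignment_complete(estimates, ["solve-linear-eq", "graph-linear-eq"]))
```

The key property is the `all(...)`: partial mastery is not enough, so a student keeps receiving adaptive questions until every objective in the assignment clears the bar.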
Our 2017 Mastery Results seek to clarify two things: the frequency with which students achieve mastery in alta, and the later performance of students who have (and have not) achieved mastery, as determined by our proficiency model.
Controlling for students’ initial ability level
In this analysis, we wanted to assess the impact of mastery across the full spectrum of student ability levels. To capture a sense of each student’s initial proficiency, we aggregated the first two questions each student answered across all of the concepts he or she encountered in the course. The percentage of those questions the student answered correctly provides a naive but reasonable estimate of how well the student knew the material entering the course.
We looked at the distribution of this score across all of our students, tagging each student’s history with a label corresponding to where they fell among all users.
- Struggling: Students whose initial ability fell into the bottom 25% of our population
- Average: Students whose initial ability fell into the middle 50% of our population
- Advanced: Students whose initial ability fell into the top 25% of our population
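The labeling procedure above can be sketched as follows. The scoring and percentile mechanics here are our reconstruction of the description, with made-up function names; they are not Knewton's implementation.

```python
# Illustrative sketch of the initial-ability labeling: the score is the
# fraction of each student's first two questions per concept answered
# correctly, and labels come from that score's quartile within the cohort.

def initial_score(first_two_answers):
    """first_two_answers: booleans for the correctness of the first two
    questions the student answered on each concept in the course."""
    return sum(first_two_answers) / len(first_two_answers)

def label_students(scores):
    """Tag each student Struggling / Average / Advanced by quartile.

    scores: dict mapping student id -> initial ability score in [0, 1].
    """
    ordered = sorted(scores.values())
    q1 = ordered[len(ordered) // 4]        # approximate 25th percentile
    q3 = ordered[(3 * len(ordered)) // 4]  # approximate 75th percentile
    labels = {}
    for student, s in scores.items():
        if s < q1:
            labels[student] = "Struggling"
        elif s >= q3:
            labels[student] = "Advanced"
        else:
            labels[student] = "Average"
    return labels
```

Note that this is a deliberately naive estimate, as the post says: it uses only the first two answers per concept, before the adaptive system has had much chance to intervene.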
Note: Knewton’s proficiency model neither uses this measure nor tags students with any kind of “ability label.” Our adaptive technology calculates a detailed, individualized portrait of each student’s proficiency levels across a wide range of concepts after each student interaction. But for the sake of this comparative impact analysis, we’ve chosen to use these distinctions as a tool to compare students of similar initial abilities.
Students of all ability levels achieved mastery with alta at high rates
Analyzing students’ assignment completion revealed that with alta, students achieve mastery at high rates. As Figure 1 shows, students working on an assignment in alta achieved mastery 87% of the time. Even among students who struggled to complete a particular assignment, 82% eventually reached mastery.
Achieving mastery with alta makes a positive impact on students’ academic performance
We know that with alta, students are highly likely to achieve mastery. But what is the impact of that mastery? When our model indicates that a student has mastered the material, how well does the student perform on future assignments, quizzes, and tests?
For any given level of initial ability, Knewton’s adaptive learning technology is designed to facilitate reaching mastery effectively for any student willing to put in the time and effort. To validate Knewton’s measure of mastery, we compared the performance of students who mastered prerequisite learning objectives (for adaptive assignments) and target learning objectives (for quizzes) through alta with students of similar initial ability who did not master these concepts.
Mastery improves the quiz scores for students of all ability levels
Figure 2 shows average Knewton quiz scores for students who did and did not reach mastery of the quiz learning objectives on prior adaptive assignments. Quiz takers who mastered at least ¾ of the quiz learning objectives through previous adaptive work went on to achieve substantially higher quiz scores than similarly skilled peers who mastered ¼ or fewer of the learning objectives.
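The grouping behind this comparison can be sketched as below. The data and field names are invented to show the shape of the analysis, not real alta results.

```python
# A minimal sketch of the Figure 2 comparison: group quiz attempts by the
# fraction of the quiz's learning objectives the student had already
# mastered, then compare mean scores between the two groups.

def mean_score_by_mastery(attempts):
    """attempts: list of (fraction_of_LOs_mastered, quiz_score) pairs.

    Returns (high_mastery_mean, low_mastery_mean), where high mastery is
    >= 3/4 of the quiz's learning objectives and low mastery is <= 1/4.
    Attempts in between are excluded from both groups.
    """
    high = [score for frac, score in attempts if frac >= 0.75]
    low = [score for frac, score in attempts if frac <= 0.25]
    mean = lambda xs: sum(xs) / len(xs) if xs else None
    return mean(high), mean(low)

# Invented example data: (fraction of LOs mastered, quiz score)
high_avg, low_avg = mean_score_by_mastery(
    [(1.0, 88), (0.8, 80), (0.2, 55), (0.0, 47)]
)
```

Because both groups are drawn from students of similar initial ability (per the labeling described earlier), the gap between the two means is attributable to the mastery work rather than to incoming skill differences.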
Mastery levels the playing field for struggling students
Putting in the work to reach mastery on the relevant adaptive assignments increased initially struggling students’ average quiz scores by 38 percentage points, boosting scores for these students above the scores of otherwise advanced students who skipped the adaptive work.
Mastery improves future platform performance
Students who master the learning objectives on earlier assignments also tend to perform better on later, more advanced assignments.
As Figure 3 shows, controlling for overall student skill levels, students who mastered ¾ of the learning objectives prerequisite to any given assignment tended to complete the assignment at much higher rates than students who did not. This is the virtuous cycle of mastery: the more students master, the better prepared they are for future learning.
Work to completion
Mastery of an assignment’s learning objectives also saves students time. When students began an assignment after having mastered most of its prerequisites, they tended to require significantly fewer questions to complete it. For students who mastered at least ¾ of the prerequisites to any given adaptive assignment, completing the assignment took 30-45% fewer questions than for students who did not (see Figure 3). Mastery helps students of all abilities learn faster, and struggling students see the biggest gains: for these students, prerequisite mastery shortened subsequent assignments by more than 40% on average.
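The "fewer questions" figures reduce to simple arithmetic: the reduction is one minus the ratio of average question counts with and without prerequisite mastery. The counts below are made up to show the calculation, not real alta data.

```python
# Back-of-the-envelope version of the questions-to-completion claim.

def question_reduction(with_mastery_avg, without_mastery_avg):
    """Fractional reduction in questions needed to complete an assignment
    for students who mastered >= 3/4 of its prerequisites, relative to
    students of similar initial ability who did not."""
    return 1 - with_mastery_avg / without_mastery_avg

# e.g. an assumed 14 questions on average with prerequisite mastery
# vs 24 without would be a reduction in the reported 30-45% range
print(round(question_reduction(14, 24), 2))  # 0.42, i.e. about 42% fewer
```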
The road ahead
Any self-reported efficacy results will be met with a certain amount of scrutiny. While we’ve attempted to be as transparent as we can be about our data, we understand that some will question the validity of our data or our approach to presenting it.
It’s our hope that, if nothing else, the reporting of our results will inspire others in the ed-tech community to present their own with the same spirit of transparency. In many ways, these results are intended not as a definitive end-point but as the start of a more productive conversation about the impact of technology on learning outcomes.
Lastly, while our 2017 Student Mastery Results are encouraging, we know that they exist in a world that is constantly changing. The challenges in higher education are becoming greater and more complex. The student population is growing increasingly diverse. Our technology and our approach to learning are evolving.
This year, we plan to update these numbers periodically and provide the results of other analyses with the goal of providing greater transparency into the effectiveness of alta and deeper insights into how students learn.