Adaptive Learning Predictive Analytics: Two Domains
Concurrent Session 3
We present results of our collaborative research comparing the efficacy of two possible predictive analytics domains using data inherent in an adaptive learning platform.
Predictive analytics in higher education holds the promise of identifying at-risk students early. However, a persistent problem arises when explaining predictions in a manner that leads to effective student interventions. There is an idealized natural paradigm that would make predictive analytics contextually useful: Data to Information to Insight to Action.
The Action component demands that we identify and validate effective interventions for wide ranges of student demographics or, at a minimum, identify the contexts in which they work. For example, an email telling a student to engage with their online course more frequently might be effective for a higher-achieving student, while simultaneously being detrimental to a student experiencing the scarcity phenomenon from living in poverty. This kind of interaction exemplifies complexity, making cause and effect difficult to identify. However, there have been some developments in higher education that offer promise for implementing more effective predictive analytic models, including adaptive learning technologies.
Adaptive learning (AL) facilitates the creation of responsive learning environments that allow students to accelerate or extend their studies, thereby challenging the usual completion constraints. John Carroll (1963) identified this when he demonstrated that if learning time is held constant, then knowledge, skill and concept acquisition will vary. But, if the constant is some pre-specified level of achievement, learning time will become the variable. In the vernacular of higher education, if you give all students one semester to learn College Algebra, there will be important differences in the knowledge they each acquire.
Without support and effective technology, implementing adaptive learning can be challenging. Fortunately, that help is available in a number of well-functioning adaptive platforms that can:
1. Personalize the learning trajectory,
2. Customize content, and
3. Assess student progress in real time (Creative Destruction, 2018).
Dziuban et al. (2018) listed a number of issues that underlie those three components, including:
1. What role does social learning play?
2. What cognitive parameters are involved?
3. How do students behave in the adaptive environment?
4. Can adaptive learning be scaled?
5. What is adaptive learning’s impact on access to education?
6. How do students perceive this learning structure?
Examining these elements individually will underrepresent adaptive learning. Nor is the list complete, given the complex nature of varying approaches to higher education.
Taleb (2018) put it this way:
“The main idea behind complex systems is that the ensemble behaves in ways not predicted by its components. The interactions matter more than the nature of the units. Studying individual ants will almost never give us a clear indication of how an ant colony operates. For that, one needs to understand an ant colony as an ant colony, no less, no more, not a collection of ants. This is called the emergent property of the whole by which parts and whole differ because what matters are the interactions between such parts. And interactions can obey very simple rules.”
Forrester (1993) described three principles that apply when operationalizing an initiative:
1. The impossibility of anticipating how an intervention will ripple through a complex system
2. Often, outcomes will be counterintuitive, and
3. There will be unanticipated side effects.
Findings over the Years
The researchers have been collaboratively evaluating adaptive learning for over four years. To date, the investigators have found that:
1. Students adjust to the adaptive learning environment seamlessly,
2. Student reaction to adaptive learning is almost universally positive in our large public and for-profit university settings,
3. Adaptive learning stabilizes the learning environment across diverse universities and course disciplines,
4. There are considerable differences in how students relate to the underlying dimension of adaptive learning, and
5. Early results suggest that the concept of adaptive predictive analytics has potential for a real-time intervention model.
This next phase of research addresses the ability of adaptive learning technology to predict student success in courses.
In this presentation the panelists will review the status of their current research and then present their analyses comparing the efficacy of two possible predictive analytics domains using data inherent in an adaptive learning platform. Adaptive learning course data from a large public university and a for-profit university will be used. The represented domains comprise student data elements captured by Realizeit that mimic the analytics often examined as students progress through, and perform on, course content:
1. Surrogate learning management system (LMS) variables that reflect students’ engagement with the adaptive content, and
2. Surrogate student information system (SIS) variables that reflect students’ performance on the adaptive content.
The variables representing those constructs are:
LMS Surrogate (Engagement) Variables
• Knowledge Covered Growth (KCG): The extent to which a student’s KC has changed from the start of the course. Can be positive or zero.
• Interactions (IN): The engagement level of the instructor(s) with the student. The total number of interactions.
• Messages Sent (MS): The number of the interactions sent by the instructor that were simple messages.
• Total Activities (TA): The total number of non-assessment activities started by the student.
• Total Time (TT): The total time spent on non-assessment activities started by the student.
• Number Revised (NR): The total number of node-level activities that are classified as revision.
• Number Practiced (NP): The total number of objective-level practice activities.
SIS Surrogate (Achievement) Variables
• Knowledge State (KS): A measure of student ability. The mean level of mastery that the students have shown on topics they have studied.
• Knowledge Covered (KC): A measure of student progress. The mean completion state of each of the course objectives.
• Calculated (CA): An institution-defined combination of several metrics, mainly KS and KC, used to assign a grade to students.
• Average Score (AS): The mean result across all learning, revision, practice, and assessment activities.
• Determine Knowledge (DK): The percentage of objectives on which the student completed a Determine Knowledge operation.
• Knowledge State Growth (KSG): The extent to which a student’s KS has changed from the start of the course. Can be positive, negative, or zero.
Initial work on the squared multiple correlations shows that the LMS and achievement domains are substantially independent. The average correlation among the LMS engagement variables was .15, with a reliability of .55. The achievement variables produced considerably different values, with an average correlation of .80 and a reliability coefficient of .90. The correlation between the constructed measures for each domain was .11, accounting for roughly 1 percent of shared variance. This raises the intriguing possibility that the two domains are measuring entirely different constructs and may perform differently for predicting student success.
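These reliability figures are consistent with the standardized (Spearman-Brown) form of Cronbach’s alpha, which depends only on the number of variables in a domain and their average inter-correlation. A minimal sketch, assuming the seven engagement and six achievement variables listed above and treating the reported average correlations as given:

```python
def standardized_alpha(k, r_bar):
    """Standardized Cronbach's alpha from the number of items k
    and their average inter-item correlation r_bar."""
    return k * r_bar / (1 + (k - 1) * r_bar)

# LMS engagement domain: 7 variables, average correlation .15
alpha_lms = standardized_alpha(7, 0.15)   # ~0.55, matching the reported value

# Achievement domain: 6 variables, average correlation .80
# (the formula gives ~0.96; the reported .90 may reflect the raw,
# non-standardized coefficient)
alpha_ach = standardized_alpha(6, 0.80)

# Shared variance between the two domain composites is the squared correlation
r = 0.11
shared_variance = r ** 2                  # ~0.012, about 1 percent
```

Squaring the .11 correlation between the two composites gives the shared variance directly, which is why so small a correlation leaves the domains essentially independent.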
Additional ongoing work by one of the institutions on this panel, comparing similar domains outside the adaptive environment, has found persuasive evidence that student information data such as high school and college grade point averages can predict student success in STEM courses with 80% accuracy in the first week of the semester. Adding LMS data increases that accuracy to 85% by week six, only a marginal improvement; however, there were very few LMS variables available for study. Unfortunately, although the predictive model was effective, there are limited interventions that can compensate for student habits that might result in low GPAs. We find ourselves back at the problem of strong prediction with few, if any, intervention strategies. However, the current study has a much more comprehensive and definitive (albeit surrogate) domain of LMS measures, captured in real time as students progress through an adaptive system.
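The SIS-versus-LMS comparison above can be sketched as two nested models: a baseline using only a GPA-style achievement feature against a model that adds an engagement feature. Everything here is an illustrative assumption on synthetic data, not the study’s figures or method:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical synthetic cohort: the SIS-style feature (centered GPA) carries
# most of the signal; the LMS-style feature adds a smaller increment.
sis = rng.normal(0.0, 0.5, n)      # e.g., GPA centered at its mean
lms = rng.normal(0.0, 1.0, n)      # e.g., an engagement composite
logit = 2.5 * sis + 0.6 * lms
success = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=3000):
    """Plain gradient-descent logistic regression with an intercept."""
    X = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(X, y, w):
    """Share of students classified correctly at the 0.5 threshold."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.mean((X @ w > 0) == (y == 1))

w_sis = fit_logistic(sis[:, None], success)
w_both = fit_logistic(np.column_stack([sis, lms]), success)

acc_sis = accuracy(sis[:, None], success, w_sis)
acc_both = accuracy(np.column_stack([sis, lms]), success, w_both)
```

The expected pattern is a solid baseline accuracy from the SIS-style feature alone and a modest gain from adding the engagement feature, mirroring the 80%-to-85% result described above.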
This panel will present the efficacy of the two domains for predicting student success in introductory and college algebra with the indices produced by the adaptive platform. The study has an additional dimension in that it represents an organic A-B study of success in one situation where student success is completely determined within the platform and another context where achievement is determined by external department criteria.
Face-to-face interaction has always presented a number of constraining issues: time, audience size, room arrangement, and participant motivation for attending. However, this panel has been successful at follow-up with prior session participants, providing further information that has led to a number of co-presentations at scientific meetings and publications in multiple professional journals. We intend to continue this practice through individual contacts, social media, and campus visits after the conference concludes. We have found this approach to be quite effective. Audience Q&A will be interactive, but primarily we hope to enlist collaborative research partners for future research, which is strongly needed to inform adaptive learning implementation.
Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64(8), 723-733.
Dziuban, C., Howlin, C., Moskal, P., Johnson, C., Eid, M., & Kmetz, B. (2018). Adaptive Learning: Context and Complexity. e-mentor, (5), 13-23.
Creative destruction. (2018, October 4). The Economist. Retrieved from https://www.economist.com/leaders/2014/06/28/creative-destruction
Forrester, J. W. (1993). System dynamics and the lessons of 35 years. In A systems-based approach to policymaking (pp. 199-240). Norwell, MA: Kluwer Academic.
Taleb, N. N. (2018). Skin in the game: Hidden asymmetries in daily life. New York: Random House.