Adaptive Learning Predictive Analytics: Two Domains

Concurrent Session 3


Brief Abstract

We present results of our collaborative research comparing the efficacy of two possible predictive analytics domains using data inherent in an adaptive learning platform.

Presenters

Charles Dziuban is Director of the Research Initiative for Teaching Effectiveness at the University of Central Florida (UCF), where he has been a faculty member since 1970 teaching research design and statistics, and is the founding director of the university's Faculty Center for Teaching and Learning. He received his Ph.D. from the University of Wisconsin. Since 1996, he has directed the impact evaluation of UCF's distributed learning initiative, examining student and faculty outcomes as well as gauging the impact of online, blended, and lecture capture courses on the university. Chuck has published in numerous journals including Multivariate Behavioral Research, Psychological Bulletin, Educational and Psychological Measurement, the American Educational Research Journal, the Phi Delta Kappan, the Internet and Higher Education, the Journal of Asynchronous Learning Networks, and the Sloan-C View. His methods for determining psychometric adequacy have been featured in both the SPSS and SAS packages. He has received funding from several government and industrial agencies including the Ford Foundation, Centers for Disease Control, National Science Foundation, and the Alfred P. Sloan Foundation. In 2000, Chuck was named UCF's first-ever Pegasus Professor for extraordinary research, teaching, and service, and in 2005 received the honor of Professor Emeritus. In 2005, he received the Sloan Consortium award for Most Outstanding Achievement in Online Learning by an Individual. In 2007, he was appointed to the National Information and Communication Technology (ICT) Literacy Policy Council. In 2010, Chuck was named an inaugural Sloan-C Fellow. In 2012, the University of Central Florida initiated the Chuck D. Dziuban Award for Excellence in Online Teaching for UCF faculty members in honor of Chuck's impact on the field of online teaching and learning. In 2017, Chuck received UCF's inaugural Collective Excellence award for his work strengthening the university's impact with the Tangelo Park Program and assumed the position of University Representative to the Rosen Foundation Tangelo Park and Parramore programs.
Patsy Moskal is the Director of the Digital Learning Impact Evaluation in the Research Initiative for Teaching Effectiveness at the University of Central Florida (UCF), where she evaluates the impact of technology-enhanced learning and serves as the liaison for faculty scholarship of teaching and learning. In 2011, Dr. Moskal was named an OLC Fellow in recognition of her groundbreaking work in the assessment of the impact and efficacy of online and blended learning. She has written and co-authored numerous works on blended and online learning and is a frequent presenter on these topics. Her co-authored book with Dziuban, Picciano, and Graham, Conducting Research in Online and Blended Learning: New Pedagogical Frontiers, was published in August 2015. She currently serves on the OLC Board of Directors.
Dr. Connie Johnson is Colorado Technical University's (CTU) chief academic officer and provost, working with both online and ground degree programs. She has oversight of academic affairs, including faculty, curriculum, classroom experience, and accreditation. During her time at CTU, Connie has led the implementation of adaptive learning technology as well as initiatives in academic leadership, women's leadership, leading academics through change, and effective technology use in the online classroom, including promoting academics and faculty and student engagement through social media. Connie has served in higher education for over 20 years with extensive experience in online and ground teaching, administration, and leadership. Additionally, she has extensive experience in regional accreditation, curriculum implementation, and faculty training and development. She is a trained peer evaluator for the Higher Learning Commission (HLC), has completed and served as a facilitator in the ACE Chief Academic Officer Institute, and is a member of the CTU Board of Trustees. Her educational background includes a Doctorate of Education with an emphasis in organizational leadership (2010) and a Master of Business Administration in management (1991), both from Nova Southeastern University, and a Bachelor of Science with honors in criminal justice from Florida State University.
Dr. Colm Howlin is the Principal Researcher at Realizeit, where he leads the research and analytics team. He has been with the company since it was founded eight years ago. He is responsible for the development of the Adaptive Learning Engine within Realizeit and the learning and academic analytics derived from learner data. Colm has a background in applied mathematics, earning his B.Sc. and Ph.D. from the University of Limerick, and was a Research Fellow at Loughborough University in the UK. He has over 10 years' experience working on research, educational data, analytics, and statistical analysis, including time spent as a Consultant Statistician before joining Realizeit.

Extended Abstract

Predictive analytics in higher education holds the promise of identifying at-risk students early. However, a persistent problem arises when explaining predictions in a manner that leads to effective student interventions. There is an idealized natural paradigm that would make predictive analytics contextually useful: Data to Information to Insight to Action.
 
The Action component demands that we identify and validate effective interventions for wide ranges of student demographics or, at a minimum, identify the contexts in which they work. For example, an email telling a student to engage with their online course more frequently might be effective for a higher-achieving student while simultaneously being detrimental to a student experiencing the scarcity phenomenon that accompanies living in poverty. This kind of interaction exemplifies complexity, making cause and effect difficult to identify. However, there have been developments in higher education that offer promise for implementing more effective predictive analytic models, including adaptive learning technologies.
 
Adaptive Learning
 
Adaptive learning (AL) facilitates the creation of responsive learning environments that allow students to accelerate or extend their studies, thereby challenging the usual completion constraints. John Carroll (1963) identified this when he demonstrated that if learning time is held constant, then knowledge, skill, and concept acquisition will vary; but if the constant is some pre-specified level of achievement, learning time becomes the variable. In the vernacular of higher education, if you give all students one semester to learn College Algebra, there will be important differences in the knowledge they each acquire.
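
Carroll's claim is often summarized as a simple ratio; the following is a schematic paraphrase of his model, not his exact notation:

\[ \text{degree of learning} = f\!\left(\frac{\text{time actually spent}}{\text{time needed}}\right) \]

Holding time constant caps the ratio for students who need more time; holding achievement constant lets the numerator, and thus total learning time, vary.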
 
Without support and effective technology, implementing adaptive learning can be challenging. Fortunately, that help is available in a number of well-functioning adaptive platforms that can:
1. Personalize the learning trajectory,
2. Customize content, and
3. Assess student progress in real time (Creative Destruction, 2018).
 
Dziuban et al. (2018) listed a number of issues that underlie those three components, including:
1. What role does social learning play?
2. What cognitive parameters are involved?
3. How do students behave in the adaptive environment?
4. Can adaptive learning be scaled?
5. What is adaptive learning’s impact on access to education?
6. How do students perceive this learning structure?
 
Examining these elements individually will underrepresent adaptive learning, and the list is not complete, given the complex and varied nature of approaches to higher education.
 
Taleb (2018) put it this way:
“The main idea behind complex systems is that the ensemble behaves in ways not predicted by its components. The interactions matter more than the nature of the units. Studying individual ants will almost never give us a clear indication of how an ant colony operates. For that, one needs to understand an ant colony as an ant colony, no less, no more, not a collection of ants. This is called the emergent property of the whole by which parts and whole differ because what matters are the interactions between such parts. And interactions can obey very simple rules.”
 
Forrester (1993) described three principles that apply when operationalizing an initiative:
1. It is impossible to anticipate how an intervention will ripple through a complex system,
2. Outcomes will often be counterintuitive, and
3. There will be unanticipated side effects.
 
Findings over the Years
 
The researchers have been collaboratively evaluating adaptive learning for over four years. To date, the investigators have found that:
 
1. Students adjust to the adaptive learning environment seamlessly.
2. Student reaction to adaptive learning is almost universally positive in our large public and for-profit university settings.
3. Adaptive learning stabilizes the learning environment across diverse universities and course disciplines.
4. There are considerable differences in how students relate to the underlying dimensions of adaptive learning.
5. Early results suggest that the concept of adaptive predictive analytics has potential for a real-time intervention model.

The next phase of research addresses the ability of adaptive learning technology to predict student success in courses.
 
Methods
 
In this presentation the panelists will review the status of their current research and then present analyses comparing the efficacy of two possible predictive analytics domains using data inherent in an adaptive learning platform. Adaptive learning course data from a large public university and a large for-profit university will be used. The domains comprise the following student data elements captured by Realizeit, which mimic the analytics often examined as students progress through, and perform on, course content.
 
1. Surrogate learning management system (LMS) variables that reflect students’ engagement with the adaptive content, and
2. Surrogate student information system (SIS) variables that reflect students’ performance on the adaptive content.
 
The variables representing those constructs are:
 
LMS Surrogate (Engagement) Variables
 
• Knowledge Covered Growth (KCG): The extent to which a student’s KC has changed from the start of the course. Can be positive or zero.
• Interactions (IN): The total number of interactions between the instructor(s) and the student, a measure of instructor engagement.
• Messages Sent (MS): The number of those interactions that were simple messages sent by the instructor.
• Total Activities (TA): The total number of non-assessment activities started by the student.
• Total Time (TT): The total time spent on non-assessment activities started by the student.
• Number Revised (NR): The total number of node-level activities that are classified as revision.
• Number Practiced (NP): The total number of objective-level practice activities.
 
SIS Surrogate (Achievement) Variables
 
• Knowledge State (KS): A measure of student ability. The mean level of mastery that the students have shown on topics they have studied.
• Knowledge Covered (KC): A measure of student progress. The mean completion state of each of the course objectives.
• Calculated (CA): An institution-defined combination of several metrics, mainly KS and KC, used to assign a grade to students.
• Average Score (AS): The mean result across all learning, revision, practice, and assessment activities.
• Determine Knowledge (DK): The percentage of objectives on which the student completed a Determine Knowledge operation.
• Knowledge State Growth (KSG): The extent to which a student’s KS has changed from the start of the course. Can be positive, negative, or zero.
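
To make the two domains concrete, the following minimal Python sketch shows how they might be organized for analysis. The column abbreviations mirror the lists above, while the DataFrame layout and helper function are hypothetical illustrations, not Realizeit's actual export schema:

    import pandas as pd

    # Hypothetical column names mirroring the variable abbreviations above.
    LMS_ENGAGEMENT = ["KCG", "IN", "MS", "TA", "TT", "NR", "NP"]
    SIS_ACHIEVEMENT = ["KS", "KC", "CA", "AS", "DK", "KSG"]

    def split_domains(df):
        """Split a per-student feature table into the two predictor domains."""
        return df[LMS_ENGAGEMENT], df[SIS_ACHIEVEMENT]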
 
Preliminary Findings
 
Initial work based on squared multiple correlations shows that the LMS and achievement data are substantially independent. For instance, the average correlation among the LMS engagement variables was .15 and their reliability was .55. The achievement variables produced considerably different values, with an average correlation of .80 and a reliability coefficient of .90. The correlation between the constructed composite measures for the two domains was .11, accounting for roughly 1 percent of shared variance. This raises the intriguing possibility that the two domains are measuring entirely different constructs and may perform differently for predicting student success.
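
For readers who want to reproduce these kinds of summaries, here is a minimal Python sketch, assuming a per-student table with the variable abbreviations used above; the mean composites and the Cronbach's alpha formulation are our illustrative choices, not necessarily the authors' exact computations:

    import numpy as np

    def mean_offdiag_corr(X):
        """Average pairwise correlation among one domain's variables."""
        r = X.corr().to_numpy()
        return r[~np.eye(len(r), dtype=bool)].mean()

    def cronbach_alpha(X):
        """Internal-consistency reliability of a domain treated as one scale."""
        k = X.shape[1]
        item_var = X.var(ddof=1).sum()           # sum of item variances
        total_var = X.sum(axis=1).var(ddof=1)    # variance of the summed scale
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Composite (mean) scores per domain; squaring their correlation gives
    # shared variance, e.g., r = .11 implies roughly 1% of variance shared.
    # lms, sis = split_domains(df)  # from the earlier sketch
    # r = np.corrcoef(lms.mean(axis=1), sis.mean(axis=1))[0, 1]
    # shared_variance = r ** 2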
 
Additional ongoing work by one of the institutions on this panel, comparing similar domains outside the adaptive environment, has found persuasive evidence that student information data such as high school and college grade point averages can predict student success in STEM courses with 80% accuracy in the first week of the semester. Adding LMS data increases that accuracy to 85% by week six, only a marginal improvement. However, there were very few LMS variables available for study. Unfortunately, although the predictive model was effective, there are limited interventions that can compensate for the student habits that might result in low GPAs. We find ourselves back at the problem of excellent prediction with few, if any, intervention strategies. However, the current study has a much more comprehensive and definitive (albeit surrogate) domain of LMS measures, captured in real time as students progress through an adaptive system.
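
The kind of domain comparison described here could be sketched in Python as follows, assuming a binary success label per student; the model choice (regularized logistic regression) and the cross-validation setup are our illustrative assumptions, not the panel's actual methodology:

    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def domain_accuracy(X, y):
        """Cross-validated accuracy of a logistic model on one predictor domain."""
        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        return cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()

    # Compare domains, e.g., achievement (SIS-style) predictors alone versus
    # augmented with engagement (LMS-style) predictors:
    # acc_sis = domain_accuracy(sis, success)
    # acc_both = domain_accuracy(pd.concat([sis, lms], axis=1), success)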
 
This panel will present the efficacy of the two domains for predicting student success in introductory and college algebra using the indices produced by the adaptive platform. The study has an additional dimension in that it represents an organic A/B comparison: in one context student success is determined entirely within the platform, while in the other achievement is determined by external departmental criteria.
 
Session Interaction
Face-to-face interaction has always presented a number of constraining issues: time, audience size, room arrangement, and participant motivation for attending. However, this panel has been successful at following up with prior session participants, providing further information that has led to a number of co-presentations at scientific meetings and publications in multiple professional journals. We intend to continue this practice through individual contacts, social media, and campus visits after the conference concludes. We have found this approach to be quite effective. Audience Q&A will be interactive, but primarily we hope to enlist collaborative research partners for future work, as such collaboration is strongly needed to inform adaptive learning implementation.
 
References
Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64(8), 723-733.
Dziuban, C., Howlin, C., Moskal, P., Johnson, C., Eid, M., & Kmetz, B. (2018). Adaptive learning: Context and complexity. e-mentor, (5), 13-23.
Creative destruction. (2018, October 4). The Economist. Retrieved from https://www.economist.com/leaders/2014/06/28/creative-destruction
Forrester, J. W. (1993). System dynamics and the lessons of 35 years. In A systems-based approach to policymaking (pp. 199-240). Norwell, MA: Kluwer Academic.
Taleb, N. N. (2018). Skin in the game: Hidden asymmetries in daily life. New York: Random House.