Leveraging Predictive Analytics to Support Student Success

Concurrent Session 4
Streamed Session

Brief Abstract

This is the unlikely but true story of a pilot project that combines the relatively new Blackboard Predict machine-learning model with a home-grown early alert system with a track record of more than 20 years. The goal: improve institutional effectiveness and efficiency by identifying at-risk students and supporting their academic success.


Dr. Penniston has been involved with online and blended learning in different capacities, including as a student, instructor, builder, and administrator, for the past decade. He has extensive experience with quantitative, qualitative, and mixed methods designs. He provides consultation to faculty and staff on working with institutional data, gathering data, and conducting data analyses. Dr. Penniston also evaluates DoIT outcomes and survey feedback to inform empirically based best practices, spearheads and administers screencasting use at the university, collaborates on institutional predictive analytics projects, and supports faculty course hybridization in alignment with Quality Matters Standards.

Extended Abstract

Some successes are more probable than others. Using predictive analytics to support persistence and academic success, for example, is a highly desirable and sought-after capability in higher education, but there are logistical pitfalls and ethical considerations that complicate a fairy tale ending. The University of Maryland, Baltimore County (UMBC) has been able to overcome these significant obstacles to pilot an intervention informed by big data.

UMBC was an early adopter of Blackboard Predict and is piloting the machine-learning algorithm to identify students at risk of receiving a D, F, or W for a term grade. LMS data were included in the model as real-time proxies for student engagement. Initial findings suggested that it might be possible to leverage the predictions in conjunction with the university's First Year Intervention (FYI) program to accelerate an intervention that typically takes place at approximately midterm. The university has used the FYI system for more than 20 years to identify academically at-risk first-year students during a semester and to provide support early enough to yield academic benefits (e.g., referral to the campus Learning Resource Center). The system is opt-in, however, and although about 65% of all eligible courses participate in the program, there are gaps in coverage in high-enrollment gateway courses (i.e., courses where low academic achievement might negatively impact progression toward graduation).

UMBC conducted an internal analysis to cross-validate the Bb Predict algorithm's student-course level predictions against our data on students who receive FYIs. In a sample of high-enrollment, high-LMS-use courses, we found that Bb Predict tended to be better at identifying students who will pass, while the FYI was better at identifying students who will fail. The findings suggested that the two processes operating together can simultaneously improve precision and expand coverage, casting a wider net that includes students in courses that might not otherwise receive a notification through the FYI. These results prompted a proof-of-concept deployment in which Bb Predict predictions were used alongside the FYI to identify students at midterm to receive a supplementary notification. Findings from this project support expanding the pilot to include additional courses and programs.
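The precision-versus-coverage trade-off described above can be sketched in a few lines: take the union of the students flagged by each system and compare each flag set against observed D/F/W outcomes. This is a hypothetical illustration only; the student IDs, flag sets, and metric functions below are invented for the example and do not reflect UMBC's actual pipeline or data.

```python
# Illustrative sketch: combining two early-alert flag sets widens
# coverage of D/F/W students. All data here are made up.

predict_flags = {"s01", "s02", "s03"}         # flagged by the predictive model
fyi_flags = {"s03", "s04"}                    # flagged by the FYI program
dfw_students = {"s02", "s03", "s04", "s05"}   # actually earned a D, F, or W

def precision(flags, outcomes):
    """Share of flagged students who actually earned a D/F/W."""
    return len(flags & outcomes) / len(flags) if flags else 0.0

def coverage(flags, outcomes):
    """Share of D/F/W students the flags caught (i.e., recall)."""
    return len(flags & outcomes) / len(outcomes) if outcomes else 0.0

combined = predict_flags | fyi_flags  # union: cast a wider net

for name, flags in [("model alone", predict_flags),
                    ("FYI alone", fyi_flags),
                    ("combined", combined)]:
    print(f"{name}: precision={precision(flags, dfw_students):.2f}, "
          f"coverage={coverage(flags, dfw_students):.2f}")
```

In this toy example, the combined flag set catches more of the D/F/W students than either system alone, mirroring the "wider net" effect the pilot observed.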

We will use a variety of strategies to engage the audience throughout the presentation and during the 10-minute Q&A/group discussion. Specifically, a game-show format with responseware will help participants interpret the findings and make these experiences relevant to their own institutional contexts.