Using Learning Analytics To Optimize Student Success And Institutional Effectiveness

Concurrent Session 4

Brief Abstract

This session is an interactive case discussion on learning analytics and how they can be used and acted upon. We will use Ekowo and Palmer's (2017) guiding practices to frame our thinking about how to build analytic models and interventions that also meet legal and ethical standards.


Dr. Christopher Sessums is the Director of Academic Affairs at D2L working with higher education institutions across the globe to optimize student success and institutional effectiveness. Prior to joining D2L, Christopher served as a faculty member, researcher, and administrator at the Johns Hopkins University’s School of Education, at the University of California, Berkeley, and at the University of Florida. His research interests include assessment strategies, data and analytics, alternative learning models, lean business strategies, and educational policy.

Extended Abstract

The emergence and growing popularity of digital technologies are substantially changing many classroom experiences and generating an upsurge of student data. These data sets, in turn, offer a raft of prospective evidence about how students learn (Polonetsky & Jerome, 2014). Logically, it follows that such evidence could be used to design interventions, processes, and tools that optimize student achievement and institutional success. So what are we waiting for?

Given this raft of data, educators, administrators, and educational institutions are now faced with such questions as: What do we do with these data we collect? In what ways should we be systematically collecting and analyzing student data? And in what ways can these data be used to optimize student success and institutional effectiveness?

Given the hype and ethical concerns around predictive learning analytics, this session aims to examine the ways in which student data can be used to address various student academic concerns and institutional shortcomings (Selwyn, 2015).

To do this, participants will be asked to engage in a discussion around the Case of the Online Computer Programming Teaching Certificate. In this case, we will be working with a young online Program Director whose boss is adamant that she "measure everything that moves." Participants will act as consultants, helping the certificate's Program Director think through how she can use student data and information to address the certificate's low completion rates.

We will use elements of Ekowo and Palmer's (2017) recently published guidelines to frame our thinking about analytics as a means for designing analytic models, learning from the results, and designing meaningful interventions. We will also conduct a brief but important review of legal and ethical considerations (e.g., discrimination, privacy, security, and transparency).

Session Discussion Questions

Discussion questions for this session include:

1. What data can we legally collect? [And why is this an important question to ask?]

2. What activity data offer the most insight into student knowledge, skills, and behaviors?

3. In what ways can these data be used to predict student success?

4. In what ways can these data be used to optimize institutional efficiencies?

5. What legal and ethical considerations, including information privacy laws, need to be taken into account when collecting and using student data?

The session will close by identifying the key takeaways gathered from the discussion.

By engaging in an interactive case discussion, participants will have an opportunity to gain a deeper understanding of learning analytics, how they can be used, and additional insights to consider when designing a learning analytics plan.


Ekowo, M., & Palmer, I. (2017). Predictive analytics in higher education: Five guiding practices for ethical use. Retrieved from

Polonetsky, J., & Jerome, J. (2014). Student data: Trust, transparency, and the role of consent. Retrieved from

Selwyn, N. (2015). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64-82. doi:10.1080/17439884.2014.921628