Mischief Managed: Using LMS Clickstream Data for Pedagogical Allyship

Concurrent Session 4
Streamed Session

Brief Abstract

Recent stories about LMS clickstream data describe how that information is weaponized against students (e.g., ‘catching’ cheating), but it can also be a powerful tool for pedagogical allyship. In this session, participants examine what clickstream data does and does not reveal about student engagement and how faculty can leverage this resource for student success.

Extended Abstract

Academic Technologists and Learning Management System Administrators have long known about a relatively gray area in online learning management: like any database, an LMS records streams of entries about user clicks, page views, and basic interactions with online content. Technically speaking, these data are simple to interpret because they log objective information, such as time markers, elapsed time, and mouse clicks.
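A minimal sketch illustrates how objective yet incomplete such entries are (the field names below are hypothetical, not Canvas's actual schema): the log pins down timestamps precisely, but the elapsed time between two entries says nothing about what the student was actually doing.

```python
from datetime import datetime

# Hypothetical clickstream entries; field names are illustrative only,
# not Canvas's actual log schema.
events = [
    {"user_id": 42, "url": "/courses/101/pages/syllabus",
     "timestamp": "2021-03-01T14:05:00+00:00"},
    {"user_id": 42, "url": "/courses/101/quizzes/7",
     "timestamp": "2021-03-01T14:32:00+00:00"},
]

def elapsed_seconds(start: str, end: str) -> float:
    """Seconds between two ISO-8601 timestamps."""
    t1 = datetime.fromisoformat(start)
    t2 = datetime.fromisoformat(end)
    return (t2 - t1).total_seconds()

# The log objectively records a 27-minute gap between page views --
# but not whether the student was reading, away from the keyboard,
# or working on a second device.
gap = elapsed_seconds(events[0]["timestamp"], events[1]["timestamp"])
print(gap)  # 1620.0
```

The arithmetic is trivially objective; the pedagogical meaning of that 27-minute gap is entirely a matter of interpretation, which is the gap this session addresses.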

In practice, however, this clickstream information is made available in various forms to users with different viewing permissions, and the users who do consult it read and interpret it in highly subjective ways. For example, some instructors view user logs as indicators of attendance, while others interpret elapsed time as evidence of effort (or the lack thereof).

One problem is that these data are not meant to track attendance or catch cheating attempts, yet few institutions have education or policy protections in place to correct such use. It is not uncommon for Academic Technology specialists to receive requests for quiz logs or login data to determine a student’s date of last ‘attendance.’

During AY 2020-21, stories like the Dartmouth cheating scandal amplified this issue from a problem into a moral panic, fueled in part by pandemic teaching and learning that sent most higher education courses online in some form, with widely varying levels of preparation for online teaching.

There are many ways to unpack the Dartmouth story and others like it: what LMS data is and is not designed to do (its limitations); problems of academic integrity and surveillance; the need for stakeholder education around LMS data and learning analytics; the role of policy and procedures for handling user data logged by our systems; and the question of trusting our students, or of creating an environment that reduces the will to cheat.

In this session, participants will examine what clickstream data does and does not reveal about student engagement and how faculty can leverage this resource for student success. The presenters bring their expertise with academic technology, systems administration, quantitative theory & research, and higher education pedagogy to guide participants through the kinds of stories that clickstream data are meant to convey, as well as ways to bring more context to incomplete understandings of the data. 

Session Overview

In this interactive session, an Academic Technology Administrator and Higher Education Faculty member will share the types of clickstream data commonly available to different user types in an LMS (our focus and examples are in Canvas, but there are commonalities across others, including Moodle and Blackboard). We situate the interpretation of this data within social science research literature, including confirmation bias and troublesome knowledge. We also draw from the collective expertise of OLC attendees to brainstorm strategies for using available clickstream data as an instructional tool that can better serve students, positioning instructors as pedagogical allies.

Many OLC attendees, including instructional technologists and designers, administrators, faculty, and educational developers, will relate to this session. Some will recognize stories about requests for user log information that fell outside any policy protection and relied instead on individual moral judgment and decision-making. Others may wish to better understand what LMS clickstream data does or does not communicate about student engagement with online content. All will benefit from new ways of thinking about, and sharing, uses of the available data for student success. We believe this will be a high-interest topic for attendees.

Session Goals

In this session, participants will:

  • Identify the kinds of user data commonly available to users with ‘instructor’ and ‘student’ roles in a Learning Management System.
  • Understand and explain what information the LMS logs do and do not communicate about student engagement.
  • List ways to use clickstream data to help students be successful in academic classes.

These goals are relevant to both instructors and professionals responsible for educational development and instructional support.

Session Plan
  1. Welcome / anonymous poll about dispositions toward cheating
  2. Research on cheating during the pandemic
    • Participants relate or react to findings
  3. Theory: confirmation bias
    • Demonstration of data; solicit possible ‘stories’/explanations from participants
  4. What does it mean to be a pedagogical ally?
    • Presenters front-load with ~2-3 examples
    • Participants brainstorm instructional moves
  5. Presenter tour of Canvas logs: what they do and do not tell us
  6. Returning to allyship with Case Study #1: screenshot of Canvas login activity
    • Illustrative story: what instructors see / being transparent
    • Student’s story: working with others; low apparent engagement
  7. Case Study #2: screenshot of a Canvas quiz log
  8. Presenter recommendations for pedagogical allyship using clickstream data
  9. Closing discussion: What questions does this raise for you? What ah-has? What strategies for using clickstream data for allyship, beyond those shared here?

Level of Participation

This session includes moments for audience participation throughout:

  • An initial poll will reveal collective dispositions toward academic honesty.
  • An open discussion (in chat or via unmuted audio, with raised hands) will invite stories and reactions to pandemic-era research.
  • Participants will share expertise and ideas about instructional moves on a shared Padlet that remains accessible after the session.
  • Participants will complete an exercise that asks them to analyze example data and share possible explanations for what it implies.