Learning Analytics: Perspectives from Diverse Stakeholders

Brief Abstract

This panel will discuss the results of a multi-site interview study that investigated stakeholders’ perspectives and views on learning analytics in higher education. Topics include: perceptions of the data that should be collected, data literacy, barriers to accessing and using data, and bias and equity in the use of learning analytics.


Mary Ellen Dello Stritto is the Director of Research for Oregon State University Ecampus, where she designs and conducts research studies on online teaching and learning, provides support for faculty research on online education, and produces tools to promote research literacy. Her background is in psychology with a specialization in quantitative methodologies, survey design, and statistical analysis.
Rebecca Arlene Thomas is currently a Postdoctoral Scholar in the Oregon State Ecampus Research Unit (ECRU). The ECRU conducts original research in online higher education and promotes collaboration and research literacy in the field. Before working at Oregon State, Rebecca earned a master's degree in Instructional Psychology & Technology from Brigham Young University and a PhD in Psychology from the University of Texas at San Antonio. In addition to her work in online education, she enjoys conducting research on college student relationships and aggressive behavior.

Additional Authors

Benjamin Croft is an education researcher in online higher education. His professional background spans many roles and levels: an instructor for first year college mathematics, a program coordinator and grant writer, a high school speech and debate coach, and many others. His interests center on diversity, equity, and inclusion within higher education, especially through the lens of institutional strategy, quantitative and qualitative research, and instructional design and pedagogy.
Rob Nyland, PhD, is the eCampus Research and Innovation Team Manager at Boise State University, where he leads a team that conducts research in online learning, learning analytics, and OER. He previously worked as a Learning Engineer for Learning Objects and as a full-time faculty member in Multimedia Design and Production at Lake Washington Institute of Technology. He is a graduate of Brigham Young University's Instructional Psychology & Technology program, where he did research in learning analytics, OER, and competency-based education.
Allen works as an instructional designer in Wake Forest University’s Office of Online Education and in partnership with the Teaching and Learning Collaborative. With an MA in Teaching from USC - Rossier, he has experience designing, delivering, and taking online courses at several levels of education. His role at WFU involves collaborating with faculty and administrators in course and program development for face-to-face, online, and hybrid learning environments.
Rebecca Heiser is an Instructional Designer for The Pennsylvania State University’s Lifelong Learning and Adult Education online graduate program with World Campus. Her research interests include identity-centered design, inclusive and equity-centered design, online learning communities, learning analytics, informal learning spaces, aesthetic design, open educational resources, and meaningful instructional design strategies in online distance education learning environments.

Extended Abstract

Audience Engagement:

In this panel, four researchers will share an overview of the study and discuss key findings from four areas of analysis: What data should be collected? How data literate are faculty and administrators? What are barriers to access and use of learning analytics data? What concerns do stakeholders have regarding bias and equity in the use of learning analytics? After each panelist’s key findings, attendees will be provided an opportunity to ask questions. At the end of the session, attendees will be asked to share how the findings could be applied at their own institutions.

Learning Objectives:

By attending this session, attendees will:

  1. Summarize the methodology involved in a multi-institutional collaborative research project.
  2. Discuss how diverse stakeholder groups perceive the use of learner data.  
  3. Apply key findings about perceptions of learner data to their own context.


This panel presents a multi-site interview study that investigated stakeholders’ perspectives and views on learning analytics in higher education. The study included 59 interviews with students, faculty, instructional designers, data analysts, administrators, academic advisors and coaches, and diversity and inclusion leaders from eight institutions of higher education in the United States.


This research was conducted by a cohort in a research seminars program. The group consisted of two leaders and eight cohort members, all of whom were employed by higher education institutions in the United States.

The research team met in person for a week in the summer of 2019. During that week, the team conceptualized the qualitative project, began designing the interview protocols, planned participant recruitment, and drafted an Institutional Review Board (IRB) application. Over the following weeks, the team worked to obtain approval from the IRBs of all of the required institutions. Once the project was approved, the research team recruited participants and conducted interviews over Zoom during the following year. It is important to note that data for the project were collected during the COVID-19 pandemic (March-November 2020), and participants’ perspectives may have been affected by it.


Fifty-nine total participants were recruited for this study, including 20 students, 10 faculty, 9 instructional designers, 7 data analysts, 5 administrators, 3 academic advisors and coaches, and 5 diversity and inclusion leaders.

Students were recruited from three institutions located in different areas of the United States. At each of those institutions, team members obtained a random list of 100 eligible students, who received recruitment emails. Students were eligible to participate if they were currently enrolled as degree-seeking students with more than one year (2 semesters or 3 quarters, not including summer terms) of experience at the institution. The target enrollment was 6 students from each institution. The team completed 20 student interviews across the three institutions.

Faculty were recruited from eight higher education institutions located in different areas of the United States. Faculty were eligible to participate if they were full- or part-time faculty with a minimum of 2 years consecutive teaching experience at the institution (4 semesters or 6 quarters, not including summer terms). The target enrollment was three faculty from each of the 8 institutions. The team completed 10 faculty interviews from 6 institutions.


Prior to completing the interviews, participants completed an anonymous online pre-survey that took 5-10 minutes. The survey asked for participants’ contact information as well as demographic information.

Each participant completed a 60-minute interview with a member of the research team. Separate interview protocols (ranging from 36 to 38 questions) were created for students, faculty, instructional designers, and the other staff groups (data analysts, administrators, academic advisors and coaches, and diversity and inclusion leaders), with many of the same questions included in all of the protocols. The interview protocols covered participants’ perspectives and views on learning analytics in higher education. However, because the research team was unsure whether all participants would understand the meaning of “learning analytics,” question wording throughout the protocols used terms such as “learning and learner data” in place of the term “learning analytics.” The full interview protocols contained the following sections for all participants: 1) definitions and general uses of data in higher education, 2) perceived benefits, helpfulness, and utility of learning and learner data, 3) perceived barriers, challenges, or concerns about learning and learner data, 4) perceptions of privacy, transparency, consent, and autonomy related to data in higher education, and 5) data uses and limitations. Interviews were recorded in Zoom and transcribed in preparation for data analysis.

Data Analysis & Results

The research team divided into sub-groups and selected four research questions to investigate. The following sections summarize the data analyses and results for each:

What data should be collected? This project focused on student and faculty responses to four interview questions that asked for perceptions of “learner” data that should and should not be collected, as well as “instructor” data that should and should not be collected. The data analysis involved coding the interview responses using holistic coding with an attributional layer, tallying top responses for each stakeholder group, and identifying key take-away messages. Students and faculty agreed that student and instructor satisfaction data “should be paid attention to,” and participants suggested that these data “can influence grades.” Additionally, participants thought teaching performance data should be collected, including information about “what works well and what doesn’t” with regard to pedagogy and “how the teacher is teaching.” Participants also thought that student engagement data, or “full participation in class,” should be collected in “a variety of ways.” While some participants expressed that student performance data such as grades could be useful, they also suggested that more than grades would be needed to fully understand the student experience. Similarly, while some suggested that student and instructor demographic information is useful to collect, participants were concerned that demographic data could be used to reinforce biases.

How data literate are faculty and administrators? It is important to understand stakeholders’ experiences, knowledge, and skills in learning analytics, because the data literacy of these stakeholders shapes the impact that learning analytics can make. This project focused on the level of data literacy of the faculty and administrators who participated in the study. The analyses focused on the support participants indicated they need, their level of confidence with data and analytics, and the skills they believe are most important for effectively using learning data. In addition to training in statistics, participants expressed a desire for workshops on best practices, especially opportunities to learn from peers and see how colleagues in other departments or offices use learning data. Perhaps because of some skepticism around learning data, participants indicated they would like more knowledge of how learning data are gathered, suggesting that a more participatory approach to learning data may be more fruitful than one that treats end-users as consumers. Participants also frequently spoke of the importance of skills not traditionally associated with data analysis, such as open-mindedness and the capacity for self-reflection.

What are barriers to access and use of learning analytics data? This project focused on the perspectives of faculty, instructional designers, and academic advisors/success coaches regarding perceived barriers to accessing and using learning analytics data. These stakeholders were chosen because they are the professionals most directly involved in shaping and supporting the teaching and learning process. A greater understanding of these barriers can help higher education administrators find ways to leverage learning analytics data more strategically. Participants identified several barriers, including the observation that many stakeholders lack data literacy (a theme also examined in the data literacy project above). Additional barriers included the absence of a process or strategy for the use of learning analytics and the time and effort required to shape learning analytics into a usable form. Participants also reported that the available data did not seem useful or actionable, and some expressed philosophical resistance to its use. Lastly, participants expressed concerns about privacy, security, and misuse of learning analytics data, which relate to the earlier project on what data should be collected.

What concerns do stakeholders have regarding bias and equity in the use of learning analytics? It is important that learning analytics data be analyzed and evaluated responsibly so that decisions are made without bias and with attention to ethical implications. This project focused on the perspectives of students, administrators, and diversity and inclusion officers regarding issues of bias and equity in the use of learner data. In these analyses, human elements and power dynamics were considered from a systems view when interpreting participant responses. These stakeholder groups were chosen because they represent opposite ends of the power spectrum and offer diverse perspectives. Findings indicated that 90% of students expressed some degree of concern about bias related to learner and learning data. Conversely, students were less concerned about the equity of decision-making with learning and learner data. However, administrators and diversity and inclusion officers expressed a high degree of concern about issues of both bias and equity; specifically, bias and equity themes emerged with increasing frequency throughout many of their interviews.