Got Data - Now What?: Synthesizing Outcomes, Measures, and Success to Inform Institution-Wide Decision Making

Concurrent Session 3
Streamed Session


Brief Abstract

This session defines student success through a gamified exercise, delves into technologies that collect and manage data, and prepares participants to lead data discussions. Our dashboarding activity challenges attendees to classify student success indicators, practice bridging direct and indirect evidence, and collaborate to synthesize meaning from real-world institutional problems.


Presenters

Dr. Sherri N. Braxton is the Senior Director of Instructional Technology at UMBC, where she is responsible for leading the Division of Information Technology’s (DoIT) strategy for end-user support of instructional technologies, including online, hybrid, and traditional “face-to-face” technologies. With over 20 years of experience in traditional classroom instruction and adult education strategies grounded in instructional design models, she also possesses over 17 years of experience using learning technologies in higher education settings, including the design and facilitation of online and hybrid courses. Dr. Braxton is a dynamic presenter known for her ability to engage audiences and capture their attention, even for highly complex topics. She collaborates with her staff to devise learning opportunities delivered in multiple modes that meet the varied and shifting needs of both UMBC faculty and students. Dr. Braxton is also the DoIT representative on the University System of Maryland (USM) Academic Transformation Advisory Council, a group spearheaded by the William E. Kirwan Center for Academic Innovation. Dr. Braxton has built a national presence through her participation in educational technology associations such as EDUCAUSE, the Online Learning Consortium (OLC), and the IMS Global Learning Consortium; in addition to presenting at national, regional, and local conferences, she serves as a proposal reviewer, constituent group leader, leadership institute faculty member, and both task force leader and working group participant. Dr. Braxton earned a Doctor of Science in Computer Science with minors in Educational Leadership and Management Science from the George Washington University. She also holds a Master of Science in Computer Science with a minor in Mathematics from North Carolina State University and a Bachelor of Science in Mathematics with a minor in Computer Science from Wake Forest University.
Jennifer M. Harrison has worked in higher education for almost 30 years and is currently UMBC’s Associate Director for Assessment in the Faculty Development Center. She has expertise in accreditation, institutional effectiveness, student learning assessment, critical pedagogy, curriculum development, educational technology, and online and face-to-face active learning. She currently specializes in interdisciplinary educational development. An experienced speaker, she has created hundreds of workshops, programs, and presentations for a range of higher education audiences, including national, regional, and local conferences. At UMBC, she consults with faculty and staff to strengthen learning assessment practices and offers programs and workshops to support faculty development. She was a key contributor to UMBC’s successful re-accreditation efforts and continues to work with faculty, staff, and leaders to support authentic assessment. Before joining UMBC, she served the labor movement for 15 years at the National Labor College, crafting interdisciplinary writing, research, and critical thinking curricula; leading faculty development, prior learning assessment, and educational technology processes; cultivating strategic, institutional effectiveness, and learning assessment plans; and contributing to successful re-accreditation as Associate Professor of Writing and Director of Assessment. After earning tenure, she chaired the admissions committee, brokering a FIPSE grant into a redesigned, student-success-oriented matriculation process designed to integrate with prior learning assessment and improve graduation rates; redesigned the capstone program; crafted key policy documents; and contributed to continuous improvement initiatives by founding and chairing the Assessment Committee. Dr. Harrison holds an interdisciplinary Ph.D. in Language, Literacy, and Culture from UMBC, a master’s degree in English Language and Literature from the University of Maryland, College Park, and a bachelor’s degree in English with a minor in art from Washington College. Her current research focuses on authentic assessment, including inclusive curriculum mapping and design; graduate, co-curricular, and interdisciplinary assessment; assessment technologies; and the benefits of contextualizing learning analytics with direct learning evidence.

Extended Abstract

Student success is central to the national conversation about higher education--in fact, 95 percent of college presidents indicate that “declining public support” for higher education results from “concerns over whether higher education prepares students for careers” (Lederman, 2018, March 9). Yet institutions struggle to meaningfully use evidence to demonstrate more nuanced contributions to student learning and success. Further, assessment experts admit that student learning data is not being used effectively. In fact, even when academics do implement evidence-based interventions, these interventions may not respond meaningfully to student learning needs if they fail to connect learning data to indirect evidence that contextualizes the results (Eubanks, 2017). Conversely, learning analytics experts seek meaning through vast arrays of indirect data, often overlooking the analytical depth of direct evidence. Yet when we triangulate multiple data points, we can begin to answer vital questions about student learning.

When institutions bridge student learning outcomes and success data, they gain capacity to aggregate, disaggregate, and reaggregate learning data to answer targeted questions about student learning, achievement gaps, and intervention effectiveness. Integrating direct and indirect evidence yields deeper understanding of student performance, adds depth and nuance to predictive analytics, and offers insights that can foster equity through an enhanced capability to pinpoint achievement gaps. Our goal is to help faculty, staff, and other campus leaders create a culture of data-informed decision making, empowering them to synthesize evidence into decisions that foster inclusive excellence.
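To make the idea of aggregating, disaggregating, and reaggregating learning data concrete, here is a minimal sketch (with hypothetical column names and made-up values, not drawn from the session materials) that joins direct evidence (rubric scores) with an indirect indicator and breaks both out by student group to surface potential achievement gaps:

```python
import pandas as pd

# Hypothetical learning records (illustrative values only):
# direct evidence = rubric scores on a shared outcome (0-4 scale),
# indirect evidence = an engagement indicator such as LMS activity hours.
records = pd.DataFrame({
    "student_group": ["A", "A", "B", "B", "B", "C"],
    "rubric_score":  [3.5, 2.0, 3.0, 3.8, 2.5, 4.0],
    "lms_hours":     [12, 4, 10, 15, 6, 18],
})

# Aggregate: overall mean performance on the outcome.
overall = records["rubric_score"].mean()

# Disaggregate: mean performance and engagement by student group,
# which is where achievement gaps become visible.
by_group = records.groupby("student_group")[["rubric_score", "lms_hours"]].mean()

# Reaggregate: compare each group's mean to the overall mean to flag gaps.
by_group["gap_vs_overall"] = by_group["rubric_score"] - overall

print(f"Overall mean rubric score: {overall:.2f}")
print(by_group.round(2))
```

A real dashboard would draw these columns from institutional systems (LMS, SIS, assessment platforms) rather than a hand-built table, but the aggregate/disaggregate/reaggregate pattern is the same.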

How can we help our institutions create a data-enabled culture that improves decision making and student success? How can we help educators identify key student success indicators and demonstrate success to multiple audiences? Where is the information stored? How can data be synthesized for more precise analysis? Our session begins to answer these questions by challenging attendees to define student success, delve into the technologies that collect and manage the data, and prepare to lead their faculty in effective data discussions. We invite attendees to practice bridging direct and indirect evidence, using scenarios they can adapt to teach effective practices at their own institutions.

Our session engages participants in a scaffolded, collaborative series of exercises in which they co-construct definitions of student success and connect them to the technology tools that store and manipulate the data. Attendees learn experientially by interacting with both physical and conceptual representations of student success dashboards in group and individual activities. We designed each activity using research-based learning strategies to help attendees internalize, retain, adapt, and teach key concepts to colleagues at their own institutions.

  1. Our opening activity exposes attendees to key concepts while establishing a common baseline of knowledge across the audience and providing a low-stakes introduction to the key performance indicators of student success.

  2. Our central activity adapts ideas from gamification and kinesthetic learning to challenge participants to collaboratively create a student success dashboard in response to a scenario that includes direct and indirect evidence.

  3. Our closing activity, a minute paper, will help attendees reflect on and assess their own learning.

Additionally, we will use PowerPoint slides and handouts to share our learning outcomes (including a curriculum map of the session outcomes and opportunities); present (and visually unpack) key points; and offer templates and tools for attendees to use at their own institutions.

Outcomes

By the end of this session, participants will be able to …

Create a dashboard of student success indicators as a visual guide to model effective data integration

Collaborate to analyze those indicators and, using scenarios that include multiple direct and indirect data points, practice facilitating closing-the-loop discussions that yield effective interventions

Identify gaps in their institution’s assessment processes and technologies that hinder the data collection, analysis, and aggregation required to take outcome data to scale


Selected References

Bishop, M.J., Braxton-Lieber, S., & Harrison, J.M. (2017, April 19). Exploring assessment technologies. University System of Maryland Symposium.

Eubanks, D. (2017, Fall). A guide for the perplexed. AALHE Intersection, 4-14. http://c.ymcdn.com/sites/www.aalhe.org/resource/resmgr/docs/Int/AAHLE_Fall_2017_Intersection.pdf

Harrison, J.M., & Braxton, S.N. (2018). Identifying effective assessment technologies. Poster session presented at the 2018 EDUCAUSE Learning Initiative (ELI) Annual Meeting, New Orleans.