Student-Driven Quality Assurance Practices in Online Education: Centering Student Advisors in Design Innovation

Streamed Session: Equity and Inclusion

Brief Abstract

All too often, online course quality is determined by a course’s compliance with industry standards rather than through consultation with actual students. In this presentation, we share a case study of student-driven quality assurance practices recently developed in a university Center for Assessment, Teaching, and Technology.

Extended Abstract

When it comes to quality assurance in online and blended courses, instructional designers often use nationally recognized quality assurance rubrics (e.g., Quality Matters, The SUNY Online Course Quality Review Rubric) or in-house rubrics developed to meet standardized benchmarks determined mainly by researchers and practitioners. These rubrics are important course evaluation instruments: they synthesize the most up-to-date research in the fields of instructional design and instructional technology, they guide educators in applying these findings to their own courses, and they usually include criteria that support student-centered learning and student engagement. However, the direct voice of the students who take online courses is often left out of conversations about quality assurance. When students are not consulted about their perceptions of online course quality, the overall inclusiveness of the course may be diminished.

At one southwestern university, the Quality Assurance Instructional Designer has partnered with the Senior Instructional Technologist to address this problem. Together, they elicit the student voice from members of the iCourse Student Advisory Board. All main campus students who register for online courses pay a $50 iCourse fee, which funds essential and specialized services and resources for online courses, including personnel, instructional technology, student services, and quality assurance. The iCourse Student Advisory Board is composed of undergraduate and graduate student representatives from diverse backgrounds and academic programs. Throughout the academic year, these student advisors meet regularly with the Senior Instructional Technologist, the Associate Vice Provost of Digital Learning Initiatives and Online Education, and a small group of faculty and administrative advisors, to whom they offer feedback regarding the use of funds associated with the iCourse fee.

Student advisors offer feedback on learner engagement and student success in online learning, and their advice is then applied directly to quality assurance practices for online course evaluation and continuous improvement. The Quality Assurance Instructional Designer works with the Senior Instructional Technologist to gather and analyze student advisors’ feedback and determine how it can be applied to online course design practices to maximize online course quality within the institution. This student feedback, delivered in the form of online course examples, student testimonials, consultation practices, and teacher guides, supplements the existing quality assurance practices, which involve the application of the Quality Matters Higher Education Rubric (6th Edition). During this presentation, we will share the story of how we developed a process to apply the student voice to our quality assurance practices.

Audience Participation:

After sharing our story, we will offer some interesting findings from our student advisor feedback. We will present these findings as part of a game in which audience members guess which aspects of course technology students preferred and why. This activity has the potential to showcase the importance of incorporating student voices, as audience guesses will not necessarily align with student responses. We intend to make this a team game, in which one representative enters their team’s response after a short informal discussion. Teams will receive immediate, game show-style feedback and then be instructed to jot down their initial reflections. Once everyone has had a chance to answer and reflect, we will guide attendees in developing broad goals and ideas for including students in their course development processes. Attendees will also discuss context-dependent variables that may impact this process at their own universities.

What Attendees Will Gain:

Attendees will leave the session inspired to create their own student-driven quality assurance practices at their respective institutions. They will cultivate a list of actionable ideas from the interactive game and the subsequent discussion. During the game, we will share some surprising yet authentic student feedback that may shift attendees’ perspectives on what students expect, and hope for, in their online courses.