Ensuring quality and consistency in online Health Professions Education courses using OSCQR
Concurrent Session 1
An ambitious plan to review all the courses in a new HPE program within 18 months is being piloted at my university. This presentation discusses the customization of OSCQR, securing faculty buy-in for course reviews, and setting up a program-wide dashboard to track alignment with the rubric.
At the end of this session, participants will be able to:
- Discuss the value of customizing OSCQR
- Consider how to customize OSCQR to suit unique institutional needs
- Address issues of faculty buy-in
The session will be interactive and organized as follows:
- 5 mins: Introductions
- 10 mins: Introduction to OSCQR and open discussion of the elements of OSCQR
- 10 mins: Free discussion on challenges experienced when implementing a course review process
- 15 mins: Presentation of the case at my university and lessons learned
- 5 mins: Q&A
Health Professions Education (HPE) is a field that is gradually moving online. As new online programs emerge, it is important to have an effective course review plan. The HPE program at my university is spearheading the university-wide distance learning initiative, and it is critical that we demonstrate quality and consistency in our courses.
As an institutional member of the OLC, OSCQR was a natural choice as our preferred rubric. The flexibility of the tool, the support resources provided with it, and the administrative dashboard that accompanies OSCQR met the requirements of our program.
The online learning team at the HPE program reviewed each element of OSCQR and classified it as either 'Essential' or 'Important'. Main categories were rearranged, and some elements were combined while others were further expanded. A draft of the revised OSCQR was sent to all faculty and approved unanimously at a faculty meeting.
To ensure timely review of courses and to give faculty enough time to make changes to their courses, a detailed implementation plan was drawn up. Each course was reviewed by the course faculty, a member of the online learning team, a peer faculty reviewer, and a student reviewer. Faculty gained familiarity with the tool by reviewing their own courses and then acted as peer reviewers for other courses in the program. Course reviews were initiated upon completion of a course offering, with a two-month review timeline. To keep the focus of the process on reviewing courses, and to prevent it from being misconstrued as evaluating faculty, individual course results were available only to the online learning team. The program director received only a summary view of program compliance with the OSCQR rubric.
Two challenges were encountered in this process. First, as a university affiliated with the Department of Defense, security measures precluded the use of the free OSCQR tracking dashboard provided by SUNY. The SUNY dashboard was therefore studied and used as a template to develop our own version for tracking the courses in our program. Developing our own dashboard also allowed us to tailor it to our unique requirements.
Second, faculty experienced anxiety and stress when their courses came up for review. Most of the faculty in the program were new to online teaching, which exacerbated this anxiety. One-on-one meetings with faculty, transparency in the process, and privacy of information helped allay some of it. Faculty were also given full control over deciding how to revise their courses, with support provided by the online learning team.
This is an ongoing initiative, and this session will present lessons learned and best practices for conducting course reviews for quality assurance.