Online Courses Across Campus - An Institutional Initiative to Support Online Course Quality Reviews
Concurrent Session 2
Student success in distance courses often lags behind that in face-to-face sections, and colleges struggle to find ways to improve distance learning. This team worked together to create an initiative that promotes quality distance courses by institutionalizing a course design review process for all courses taught at a distance.
LAUNCHING A COLLEGE-WIDE ONLINE COURSE QUALITY REVIEW INITIATIVE
Pardess Mitchell, Associate Professor and Department Chair, Kinesiology & Health Education, Harper College
Stephanie Horton, Associate Professor and Department Chair, English, Harper College
Jenny Henrikson, Instructional and Distance Education Design Specialist, Harper College
Student success in distance courses often lags behind that in face-to-face sections. The same was true at Harper, a community college that serves more than 25,000 students annually in Chicago’s northwest suburbs. We wanted to move the needle on student success in distance courses, both for the benefit of Harper College’s in-district students and for any future out-of-state students once we begin to market distance degrees nationally. Our plan, as a first step toward this goal, was to launch a college-wide distance course quality review process.
Building the Team
To develop an effective change initiative, we needed a group with expertise and leadership in a variety of areas. Therefore, an Online Teaching Practices workgroup was formed under Harper’s Academic Standards Committee. It consisted of two instructional designers, the Dean of Teaching, Learning & Distance Education, and five full-time faculty members with distance education experience. Our workgroup’s task was to identify a rubric and process for course quality review and to present them to the Academic Standards Committee for a vote through the college’s shared governance policy approval process.
Identifying the Rubric
The first task of the workgroup was to identify the course design rubric that would be used. Several rubrics were considered, and the Open SUNY Course Quality Review (OSCQR) rubric was selected. Developed in the State University of New York system, it was adopted in 2016 by the Online Learning Consortium (OLC) as their recommended rubric. OSCQR is a good fit for Harper because it is research-based, free, customizable, non-evaluative, and easy to use.
The OSCQR Rubric contains 50 standards for quality course design and accessibility. The standards are grouped into six areas of focus: 1) Course Overview & Information; 2) Technology & Tools; 3) Design & Layout; 4) Content & Activities; 5) Interaction; and 6) Assessment & Feedback.
To complete a review using the rubric, a reviewer examines a course against each standard and notes whether the standard is sufficiently present or whether the course would require a revision to meet it. Peer reviewers are encouraged to recommend specific actions the faculty member could take to make any noted revisions. Peer reviewers are also encouraged to provide positive feedback on areas where the course excels.
Training on the Rubric
Once the rubric was chosen, the instructional designers (IDs) trained themselves on setting up the rubric and reviewing courses against the standards. They adjusted some standards to better meet Harper’s needs and identified three priority levels within the rubric standards to guide post-review actions. They then trained the workgroup faculty members through a two-week online course, which provided background on OSCQR, resources relating to each standard, and access to a practice course to review. The workgroup members who completed the course became the first Harper faculty peer reviewers.
Piloting the First Reviews
With the workgroup members now trained as OSCQR reviewers, it was time to begin reviewing academic credit courses. Four courses in two online certificates were identified for the pilot.
The IDs set up OSCQR rubrics for each course and created a student-free copy of the selected courses in Harper’s learning management system. (These copied courses contained all of the course content and materials, but did not contain any student data.) The faculty teaching these courses performed a self-review of their course against the standards. Two faculty members of the workgroup were assigned to each course as peer reviewers, giving the self-reviewer two sets of feedback on their course.
After the pilot reviews were completed, the workgroup convened to debrief on their experiences. The group voted unanimously to recommend the implementation of the OSCQR rubric and process to the Academic Standards Committee. They recommended a process in which faculty would complete a self-review and receive feedback from one peer review per course.
Negotiating the Faculty Contract
At the same time the workgroup was completing its pilot, the full-time faculty contract was renegotiated to state that all distance courses are subject to design review, and that faculty must complete the review on a regular cycle to remain eligible to teach their distance courses.
It is important to note that, while course design reviews are required per the full-time faculty contract, the process is a professional development process and not an evaluation. Faculty members who go through the course design review are encouraged to take into consideration the peer reviewer’s feedback—as well as their own discovery through their use of the rubric—to make modifications they wish to implement. However, there are no requirements regarding making course changes or modifications.
To facilitate understanding of how the proposed OSCQR rubric and process would address this new contract language, two faculty members of the workgroup presented to the Faculty Senate. This gave faculty the opportunity to ask questions and have any concerns addressed before the final proposal was made to the Academic Standards Committee. Working with the Faculty Senate provided us with some important feedback, such as making sure that review results were kept confidential between self-reviewers and their peer reviewers unless express permission was given to share.
Final Vote by Academic Standards Committee
Throughout this process, members of the workgroup presented updates to the Academic Standards Committee. Finally, after the presentation to the Faculty Senate, a faculty work group member presented the OSCQR rubric and process to the full Academic Standards Committee for a final vote. The Committee voted to adopt the rubric and process at Harper.
First Semester of Campus-Wide Online Course Quality Reviews
The Online Teaching Practices workgroup and administrators identified 17 courses to go through the first review cycle during the Fall 2017 semester. Priority was given to courses that were slated to be marketed as part of an online degree program, to high enrollment general education courses taken early in a student’s academic journey, and to courses with a large success rate gap between face-to-face and online sections. Faculty who taught the 17 selected courses within the past two academic years were required to participate.
Additional trained faculty peer reviewers were needed to complete the 17 course reviews in the first semester. A call was put out to faculty with distance teaching experience, and a $150 stipend per peer review was put in place. Through two-week online and in-person training workshops, Harper grew the number of OSCQR peer reviewers from four to 16.
All 17 of the Fall 2017 course reviews were completed, and 26 more are underway this Spring 2018 semester. The number of reviews will be scaled up each semester in order to reach all distance courses within five years.
We plan to look at student success criteria to evaluate the process at the end of the first full year of reviews. We plan to compare student success rates in courses before reviews to success rates one semester after the reviews (in order to allow faculty time to make modifications if they wish). Although modifications are not required for reviews to be completed, we also plan to look at the level of changes made to reviewed courses based on action plan feedback (documented in the OSCQR rubrics), as well as at faculty self-reported feedback on the review process. We also plan to look at aggregate data to help guide future professional development offerings.
Starting an online course quality review initiative is a big task, but the workgroup found that including many members of the college, who could all give input, made it possible. Including distance education faculty members, instructional designers, division deans, Academic Standards Committee members, and Faculty Senate members ensured everyone was on the same page, working together and communicating the same goals.
Other colleges considering implementing a college-wide course design review process may want to consider a non-evaluative, peer-based approach such as the one taken by Harper. Viewing online course design reviews as a professional development opportunity provides a great environment for collaboration and sharing of expertise.
This was Harper’s story. The group is proud of the work done by so many at Harper College to launch such an important initiative, and Harper College looks forward to a brighter future, including increased student success in distance education.
During the Discovery Session we will provide slides and web links to present visuals that support the story of our process. We will ask for audience contributions and questions as well. For those who are thinking of implementing or modifying such a plan at their school, we hope to provide a framework and a few key takeaways to help ensure their success.