The Touro Rubric for Online Education: A Dynamic Tool for Evaluating Online Courses
Concurrent Session 8
An online course assessment rubric was developed as a formative evaluation tool for peer review, mentoring, and course redesign, helping faculty provide high-quality, engaging online learning experiences for students. In this highly interactive session, participants will use the rubric to evaluate an online course and share feedback.
A number of popular rubrics are currently used to evaluate online courses, but they are often expensive and require extended training. This session focuses on a new rubric developed over the past year by the Online Education Task Force at Touro College. Because the rubric is designed to be intuitive and user-friendly, training is short and the tool is easy to learn. The Touro Rubric for Online Education is intended as a formative tool: expert reviewers evaluate courses, and faculty peers then work with online faculty to redesign those courses and build toward excellence. Data on inter-rater reliability and early results on the effectiveness of the tool will be shared.
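To make the inter-rater reliability idea concrete, here is a minimal sketch of one simple agreement statistic (percent agreement) for two reviewers scoring the same rubric items. The abstract does not specify which statistic the Task Force used, so both the measure and the sample scores below are illustrative assumptions.

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of items on which two raters gave the same 1/3/5 score.

    Illustrative only: the Task Force's actual reliability analysis
    may use a different statistic (e.g., a chance-corrected one).
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical scores from two reviewers on five rubric items
print(percent_agreement([5, 3, 3, 1, 5], [5, 3, 1, 1, 5]))  # 0.8
```

Percent agreement is the simplest such measure; chance-corrected statistics such as Cohen's kappa are often preferred because they discount agreement that would occur by guessing.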
The Touro Rubric for Online Education has six categories: Course Design, Course Logic, Instructor Presence, Learner Engagement, Learner Assessment, and Course Technology. Each category consists of between four and fifteen items. Reviewers evaluate courses by giving each item a score of one (needs improvement), three (meets expectations), or five (exceeds expectations).
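The structure above can be sketched in a few lines of code. This is not the official rubric implementation; the category names come from the text, while the scoring function and the sample item scores are illustrative assumptions.

```python
VALID_SCORES = {1, 3, 5}  # 1 = needs improvement, 3 = meets, 5 = exceeds

# The six categories named in the rubric; each holds 4-15 items in practice
CATEGORIES = [
    "Course Design",
    "Course Logic",
    "Instructor Presence",
    "Learner Engagement",
    "Learner Assessment",
    "Course Technology",
]

def category_average(item_scores):
    """Average the 1/3/5 item scores for one rubric category."""
    if not all(s in VALID_SCORES for s in item_scores):
        raise ValueError("Each item must be scored 1, 3, or 5")
    return sum(item_scores) / len(item_scores)

# Hypothetical review of one four-item category
print(category_average([5, 3, 3, 1]))  # 3.0
```

Averaging per category, rather than summing across the whole rubric, keeps feedback formative: a reviewer can point a course author at the specific category that "needs improvement."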
The goal in developing the rubric was to build a tool that was intuitive and could be used consistently by different evaluators. For scores to generalize across evaluators, each category needed to be clearly defined, specific, and measurable. That meant that subjective terminology such as “important information” could not be used: the Online Education Task Force recognized that different reviewers might have disparate views on what was or was not important. Instead, target information was specified; for example, the rubric states “due dates” or “length of discussion board post” rather than “important information.”
Similarly, early pilot testing indicated that terms familiar to the Online Education Task Force might not be familiar to faculty. For example, the term “infographic” was not familiar to all of the faculty reviewers in the pilot test. Likewise, while online faculty might want to “engage students actively,” it would be difficult to define what that means in an online learning context. Instead, concrete evaluation criteria such as “poses open-ended discussion questions for the discussion board” are used.
In this highly interactive session, participants will be introduced to the Touro Rubric for Online Education. During the session, participants will use one section of the rubric to evaluate a sample online course and share feedback on the rubric. Each participant will also receive a copy of the rubric and be encouraged to use it in their own courses and to share feedback and comments via a dedicated online site, so that the rubric becomes a dynamic instrument that can evolve along with technology and online pedagogy.
As a combination of classroom, online, and hybrid courses becomes the norm in university education, excellence in online courses is fundamental to educating students effectively. We believe that the Touro Rubric for Online Education will enable faculty to be more successful in online course design and pedagogy. More than that, by crowdsourcing feedback on the rubric as it develops, we hope to create an instrument that is more valuable in the marketplace and more useful in a wide range of academic settings.
This program is designed for all audiences, with a focus on higher education at the graduate and undergraduate levels. Participants will learn about a new, user-friendly rubric for evaluating online education courses. Conference attendees will share feedback on the rubric and receive copies of the rubric.