The Touro Rubric for Online Education: A Dynamic Tool for Evaluating Online Courses

Concurrent Session 8
Streamed Session
Best in Strand

Brief Abstract

An online course assessment rubric was developed as a formative evaluation tool for peer review, mentoring, and course redesign, to assist faculty in providing high-quality, engaging online learning experiences for students. In this highly interactive session, participants will use the rubric to evaluate an online course and share feedback.


Marian Stoltz-Loike, PhD
Vice President, Online Education; Dean, Lander College for Women-The Anna Ruth and Mark Hasten School

As vice president of online education, Dr. Stoltz-Loike oversees Touro College's full range of online offerings. She initiated a plan to build toward excellence in online education by fostering greater strategic and tactical collaboration across graduate and professional programs and creating consistency across online courses. Dr. Stoltz-Loike has been dean of Touro's Lander College for Women/The Anna Ruth and Mark Hasten School (LCW) for a decade, during which LCW has enjoyed unprecedented growth in both the number of students and the quality of its academic offerings. She has introduced several honors programs for academically talented women, expanded STEM offerings in math and computer science, and launched new programs in education. A professor of psychology and human resources management, she has served as a global corporate consultant to Fortune 100 companies in the US, Canada, Mexico, Europe, Asia, and South America on building better strategies for using technology to simplify communication across borders and enable multinational businesses to work more effectively in a 24/7 world. She has written two books and over fifty articles on diversity, work/life issues, cross-cultural management, and the maturing workforce, and has delivered presentations to over forty industry groups at domestic and international conferences on women's career issues, effective global business strategies, work-life balance, the impact of technology in the workplace, managing global teams, and generational diversity. Dr. Stoltz-Loike received a bachelor's degree cum laude in Psychology and Social Relations from Harvard University and a PhD in Experimental Psychology, with a focus on developmental psychology, from New York University.

Extended Abstract

A number of popular rubrics are currently used to evaluate online courses, but these rubrics are often expensive to use and require extended training. This session will focus on a new rubric developed over the past year by the Online Education Task Force at Touro College. Training is short because the rubric is designed to be intuitive, user-friendly, and easy to learn. The Touro Rubric for Online Education is intended as a formative tool: expert reviewers evaluate courses, and faculty peers then work with online faculty to redesign those courses to build toward excellence. Data on inter-rater reliability and early results on the effectiveness of the tool will be shared.

The Touro Rubric for Online Education has six categories: Course Design, Course Logic, Instructor Presence, Learner Engagement, Learner Assessment, and Course Technology. Each category consists of between four and fifteen items. Reviewers evaluate courses by giving each item a score of one (needs improvement), three (meets expectations), or five (exceeds expectations).

The goal in developing the rubric was to build a tool that was intuitive and could be used consistently by different evaluators. For scores to generalize across evaluators, each item needed to be clearly defined, specific, and measurable. That meant terminology such as "important information" could not be used: the Online Education Task Force recognized that "important" was too subjective, and different reviewers might have disparate views on what was or was not important. Instead, target information was specified; for example, rather than "important information," the rubric refers to "due dates" or "length of discussion board post."

Similarly, early pilot testing indicated that terms familiar to the Online Education Task Force might not be familiar to faculty. For example, the term "infographic" was not familiar to all of the faculty reviewers in the pilot test. Additionally, while online faculty might well want to "engage students actively," it would be difficult to define what that means in an online learning context. Instead, evaluation criteria such as "poses open-ended discussion questions for the discussion board" are used.

In this highly interactive session, participants will be introduced to the Touro Rubric for Online Education. During the session, participants will use one section of the rubric to evaluate a sample online course and will share feedback on the rubric at that time. Each participant will also receive a copy of the rubric and be encouraged to use it in their own courses and to share feedback and comments via a dedicated online site, so that the rubric becomes a dynamic instrument that can evolve as technology changes and online pedagogy develops.

As the norm in university education becomes a combination of classroom, online, and hybrid courses, excellence in online courses is fundamental to educating students effectively. We believe that the Touro Rubric for Online Education will enable faculty to be more successful in online course design and pedagogy. Moreover, by crowdsourcing feedback on the rubric as it develops, we hope to create an instrument that is more valuable in the marketplace and more useful in a wide range of academic settings.

This program is designed for all audiences, with a focus on higher education at the graduate and undergraduate levels. Participants will learn about a new, user-friendly rubric for evaluating online courses. Conference attendees will share feedback on the rubric and receive copies of it.