Public Course Reviews as Peer Advising: Authorship and Usage

Streamed Session

Brief Abstract

In this session, we will present findings from our research on students' creation and use of course reviews as a form of peer advising, and facilitate a conversation among representatives from other universities about how course reviews are used at their institutions.


David Joyner is the Associate Director for Student Experience in Georgia Tech's College of Computing, overseeing the administration of the college's online Master of Science in Computer Science program as well as its new online undergraduate offerings. He has developed and teaches CS6460: Educational Technology, CS6750: Human-Computer Interaction, and CS1301: Introduction to Computing, all online.


Extended Abstract

As part of the growth in online education, students have developed their own platforms for sharing course evaluations over the Internet. These platforms typically focus on the needs and experiences of students in traditional undergraduate programs, but the rise of online programs operating at scale has made it practical for students to build platforms dedicated to a particular program. We surveyed students in one such program at a major research institution in the United States. Through these data we explore how many students use the site, how they use the information it provides, and how often and why they write reviews. Our ultimate goal is to gather evidence that can help students critically assess such reviews and use them to make better-informed decisions.

Past research on how students evaluate their courses has focused largely on how those evaluations relate to the specific course instructor. This research examines data from a public site where students unofficially rate the courses in a large online graduate program operating at scale. We relate the unofficial scores students give their classes to enrollment trends over time and to the assessment strategies used within the courses themselves, in order to identify additional factors that shape the ratings students choose, as well as how students use those ratings to select future courses. We find several notable relationships: reviews in this context are largely impervious to the extreme response bias prevalent on other review sites; review content does not appear to significantly influence enrollment trends; more difficult classes tend to receive more favorable ratings overall, although individual students do not rate difficult classes more favorably; and project-based classes are perceived by students as less difficult.

In this discovery session, we will briefly present our work investigating how students create and use course reviews. We will use these findings as the foundation for a discussion of how course reviews are used at other universities and institutions, including whether they are public or private and official or unofficial. Based on this discussion, we will collaboratively develop a set of best practices and recommendations for the future use of student course reviews as a form of peer advising.