Scoring the Scorecards

Inside Higher Ed | August 30, 2016 - After spending, in the words of educational technology manager Gus Roque, “a lot of work and … a lot of money” to redesign hundreds of courses according to Quality Matters’s standards for high-quality online education, Florida International University this spring decided to take a step back.

“We figured we’d do some research and see if it’s worth our time,” Roque said in an interview. 

The result was a report released last month that compared 29 online course sections that had been redesigned according to best practices to 664 online sections that hadn’t. Across all the metrics the university’s researchers looked at -- from the time students spent in the online course to how they evaluated the course and the grades they earned -- the redesigned courses posted better results.

The findings only concern one semester’s worth of courses at one university, but they are an encouraging sign for Quality Matters, as well as for the more than 900 institutions that subscribe to the organization. The Maryland-based nonprofit is one of a number of groups claiming their quality evaluation programs give online education providers a process, separate from accreditation, through which they can review and promote their online programs. FIU’s findings offer some early evidence that those processes can lead to measurable improvements for faculty members and students.

While FIU saw gains across the board, the redesigned courses stood out from the others particularly in the area of content. The redesigned courses featured on average 69 percent more tool items -- blogs, discussion forums, journals and wikis -- 51 percent more course content (defined as explanations, instructions and links to technical support) and 58 percent more assessment items, such as assignments, tests and quizzes.

Perhaps because of the additional content, students spent more time in the redesigned courses. On average, they logged in 10 percent more often, clicked on 16 percent more links and items, turned in 19 percent more submissions and spent 177 more minutes in the courses during the semester. In their course evaluations, the students rated the redesigned courses higher than the other courses on all points, from their opinion of the instructor (up 12 percent) to the courses’ ability to keep them interested (up 11 percent).

The students’ grades were also “a fraction higher” in the redesigned courses, said Roque. He credited the course redesign process with helping faculty members focus on aligning assignments with course objectives.

“If you lay out an objective, that objective is being met by an assessment,” Roque, who is also an adjunct instructor at the university, said. “That paper, that assignment, that blog -- all of it has to feed back to the objectives.”

Rubrics created by Quality Matters and university systems in states such as California and New York focus mainly on course design. Other evaluation programs, such as the one offered by the Online Learning Consortium, take a more holistic view, reviewing areas such as faculty training and technology support. Colleges often use several rubrics and programs to diagnose the overall health of their programs.

“We all work well together,” Jennifer Mathes, the OLC’s director of strategic partnerships, said in an interview. “In terms of online education, there’s obviously a need to validate the quality of a program.… Rubrics help us achieve that in higher education.”

Mathes said she was not aware of a college that has conducted a study of the OLC’s program, known as the Quality Scorecard, that is similar to FIU’s research. She pointed to the organization’s work with online education experts, as well as feedback from members, as signs that the scorecard is helping colleges improve their online programs. She estimated more than 500 colleges have used the OLC’s interactive scorecard, while an unknown number of others have downloaded the free paper version (the organization does not track those downloads).

The OLC also offers a peer-review process similar to an accreditation site visit for members who want an outside consultation. Through that process, the organization has recognized four institutions -- Baker College, Southern Nazarene University, the University of Alabama at Birmingham and the University of Wisconsin at La Crosse -- for their “exemplary” online programs, an endorsement reserved for colleges that earn a high score across the scorecard’s nine sections.

“We feel like we have a really high-quality program, but what we feel -- let’s admit it -- doesn’t matter,” Jill Langen, president of Baker College Online, said in an interview. “We were looking for an external partner that is well-respected and has a great deal of expertise to come in and look at our offerings … to tell us if we’re as good as we think.”

Baker, which along with UW-La Crosse was named an “exemplary” institution in June, has not conducted research similar to FIU’s. But anecdotally, Langen said, the OLC’s review process helped the college identify areas where it could improve.

For example, Baker received high marks for its student services, but the review process highlighted that students might have a difficult time finding them on the college’s website -- a simple but important fix, Langen said. The consultants performing the review also suggested ways Baker could make its courses more accessible for students with disabilities and improve its record keeping, she said.

Langen said Baker decided to go through the peer-review process so that the college would be able to show an independently verified seal of approval to prospective students who may be deciding between it and other institutions.

“We have to recognize that students are consumers, and that they are going to become much more educated,” Langen said. “We, as institutions that provide online education, need to very aggressively utilize Quality Matters or organizations like the OLC so that we have some kind of external, standardized benchmarks that help us show others that we have quality, and that we hold ourselves to our standards.”

SOURCE: Inside Higher Ed