Obtaining an Exemplary Online Program Endorsement: A Case Study

Because funding for many higher education institutions is based on student persistence and graduation rates, and because competition for online student enrollments is fierce among public and private institutions, quality and competitiveness in online course development, instruction, and administration are imperative. Additionally, because distance education processes and policies are reviewed by accrediting agencies, actions taken to improve quality are more likely to receive favorable reviews. In this paper, we provide a case study of the successful completion of the Online Learning Consortium (OLC) Quality Scorecard for the Administration of Online Programs by Middle Tennessee State University (MTSU). We highlight ways that other institutions can effectively use such an opportunity as they consider applying for certification of their online programs. In particular, we review MTSU’s approach to completing the scorecard, the outcome of the process, improvements made, and lessons learned.

Middle Tennessee State University is a large public comprehensive institution in Murfreesboro, TN, with over 22,000 students and over 1,000 full-time faculty members. MTSU was under the governance of the Tennessee Board of Regents until 2016, when the Tennessee legislature created independent Boards of Trustees for MTSU and the other four-year public Tennessee institutions. MTSU Online began in 1997 with seven online courses and 53 student enrollments. The institution now averages over 550 online course offerings and more than 10,000 student enrollments during the fall and spring academic terms. MTSU Online has an inventory of over 600 online courses and offers 14 online and 6 hybrid degree programs. MTSU Online, housed in University College, includes Online Faculty and Student Services, the Distance Education Test Center, and a staff of six (two managers, three coordinators, and a technical clerk). For additional information about the history and development of MTSU Online, see Rust, Brinthaupt, and Adams (2017).

Since its inception and as the program has grown, MTSU Online has focused on comprehensive assessment of courses and instructors, as well as the implementation of quality measures. The most notable such measure is the Online Faculty Mentor (OFM) program, which combines peer mentoring with review of new course designs and subsequent redesigns (for an overview, see Adams, Rust, & Brinthaupt, 2011). This program was implemented in 2005, and a current pool of 20 OFMs reviews between 70 and 90 courses each year. In addition to helping maintain and improve online course quality, the OFM program created a pool of committed online instructors who could be utilized when the institution decided to participate in a quality endorsement effort. Although the OFM program is faculty-driven, both administrative oversight and faculty involvement provide valuable input and resources in the development of MTSU online courses and programs (Tannehill, Serapiglia, & Guiler, 2018).

Competition for online student enrollments is strong among public and private institutions (e.g., Deming, Goldin, Katz, & Yuchtman, 2015; Kolowich, 2009). One phenomenon that reflects this competition is the existence of annual rankings and certification tools for online programming. There are several well-established, exemplary online program designation options available to higher education institutions. These include TheBestSchools.com 100 Best Online Colleges annual report (http://www.thebestschools.org/rankings/best-online-colleges/#methodology), the Online Learning Consortium (OLC) Excellence in the Administration of Online Programs (https://onlinelearningconsortium.org/consult/quality-scorecard/), the US News and World Report Best Online Colleges annual report, and the Quality Matters Online Program Certification (https://www.qualitymatters.org/qm-reviews-certifications/program-reviews). Many institutions also have their own processes for identifying exemplary online courses, as do learning management systems (e.g., Blackboard’s Exemplary Course Program – http://www.blackboard.com/consulting-training/training-technical-services/exemplary-course-program.aspx). See Baldwin and Trespalacios (2017) for a more comprehensive listing and evaluation of several instruments for online course design.

Although reviewers have identified many effective practices for online education (e.g., Brinthaupt, Fisher, Gardner, Raffo, & Woodard, 2011; Moore, 2012), very little has been published about how institutions can utilize these best practices and the various certification opportunities to improve their online programs. A few reviewers have documented programs that reflect quality benchmarks. For example, Truman (2004) described how the University of Central Florida created the faculty support infrastructure that helped that institution receive the 2003 Sloan-C Excellence in Online Teaching and Learning Award for Faculty Development. Similarly, Varvel, Lindeman, and Stovall (2003) discussed how the Illinois Online Network created an exemplary state-wide faculty development program for its online programming. Mast and Gambescia (2015) provided an overview of exemplary online programming and accreditation in healthcare management. However, these papers do not discuss how the institutions benefitted from applying for an award designation, such as receiving feedback as part of that process and ways that the programs changed as a result of that feedback.

As Keil and Brown (2014) noted, the U.S. Department of Education began requiring that accrediting agencies evaluate institutions’ distance education programming in 2010. In their review, they found that distance education policy standards used by accrediting organizations include institutional context and commitment (e.g., the relationship of online education to an institution’s mission), curriculum and instruction (e.g., approval, evaluation, and oversight processes for online courses and programs), and faculty and faculty support (e.g., selection, training, and evaluation of online teachers). Other standards include student support services (e.g., online learning orientation, tech support, access to campus resources), evaluation and assessment (e.g., student learning outcomes, plans for expanding offerings while maintaining quality), and student identity issues (e.g., verification, cheating). Friedman (2016) notes that accreditors typically evaluate online and face-to-face programs similarly.

Table 1 provides the methodology categories used by the major exemplary online program options. As the table indicates, there are common criteria across the various certification programs and best online college rankings. Most options include indicators of course quality, student support, and faculty quality and training. However, the thoroughness of the categories and their alignment with the standards of accrediting agencies vary widely across these programs.

MTSU was aware of the different online program certifications used to determine the status of online programming and administration. The OLC Quality Scorecard (QS) for the Administration of Online Programs certification was chosen because it seemed to be the best developed and documented. The program was developed in 2010 and updated in 2014 (Shelton & Pedersen, 2015). MTSU was also an OLC institutional member, which permitted access to supporting documentation for the program. The OLC criteria also parallel the standards used by accrediting agencies for online education. Finally, the Quality Matters program is the only other option from the list that offers a formal application process and provides detailed feedback about the status of an institution’s online programming.

There are several reasons why institutions might pursue certifications or endorsements for their online programs. These include (1) determining the current status of one’s program, (2) improving one’s program (by matching it to established benchmarks and standards), (3) obtaining public recognition for its success (to aid in promotion of the program), and (4) providing additional evidence of quality for accreditation efforts. Completing an endorsement application presented MTSU Online with the opportunity to work with colleagues across campus who could influence and promote changes in policy and process for program improvement.

In this paper, we describe our experience with utilizing the OLC QS and submitting our materials for review. The process was effort- and time-intensive, taking over two years from the initial meeting to receiving the final review score. The case study illustrates that institutions can maximally benefit from such an activity through persistence, hard work, and a well-organized approach.

Utilizing the Quality Scorecard

The OLC Quality Scorecard for the Administration of Online Programs (see https://onlinelearningconsortium.org/consult/olc-quality-scorecard-administration-online-programs/) requires the completion of a rubric for each of nine categories (see Table 1). Each category includes a varying number of indicators, with the total scorecard consisting of 75 quality indicators. Each indicator receives from 0 to 3 points based on OLC reviewer evaluations of the information and artifacts uploaded to the scorecard website; OLC recruits a team of expert reviewers who are paid for their work. Three points means that the quality standard is exemplary and fully implemented; two points means that the standard is accomplished but some improvement is still needed; one point means that support for the standard is developing but much improvement is still needed; and zero points means that support for the standard is deficient or absent. For more information about the scorecard and its development, see https://onlinelearningconsortium.org/about/quality-scorecard-administration-online-learning-programs-higher-education/ as well as Moore and Shelton (2013).
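
To make the arithmetic concrete, here is a minimal sketch of how category and total scores aggregate under this rubric. The 0-3 scale and the 75-indicator total come from the scorecard as described above; the individual indicator scores are hypothetical, chosen only so that the sum matches the 13-of-18 Faculty Support result reported later in this paper.

```python
# Minimal sketch of the QS point model. The 0-3 scale per indicator and the
# 75-indicator total come from the scorecard; the individual indicator
# scores below are hypothetical, chosen only so that the sum matches the
# 13-of-18 Faculty Support result reported later in this paper.

MAX_PER_INDICATOR = 3
TOTAL_INDICATORS = 75
MAX_TOTAL = MAX_PER_INDICATOR * TOTAL_INDICATORS  # 225 possible points

# Hypothetical reviewer scores for the six Faculty Support indicators.
faculty_support = [3, 2, 2, 3, 1, 2]

score = sum(faculty_support)                          # 13
possible = MAX_PER_INDICATOR * len(faculty_support)   # 18
print(f"Faculty Support: {score} of {possible} points")
print(f"Scorecard maximum: {MAX_TOTAL} points")
```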

The QS fee structure is based on an institution’s student full-time equivalents (FTEs) as well as its OLC membership status. For MTSU, the fee for completing the QS was $7,500. This fee provides the institution with access to the review website for posting supporting documentation and artifacts, and it covers the reviewers’ time. Institutions conduct a self-evaluation and, once the scorecard categories have been addressed, submit the self-evaluation to OLC for a preliminary review and score. Institutions then have an opportunity to make changes and resubmit for the final review and score.

With the approval and funding commitment of the University College (UC) Associate Dean, MTSU’s scorecard self-evaluation process began with the establishment of a 10-person QS committee (see Table 2 for this committee’s organizational structure). In January 2015, the chair emailed prospective members of the committee requesting their service. Because everyone on the invitation list contributed in some capacity to MTSU Online, the value of this process was understood, and all invitees readily agreed to serve. Establishing this core committee ensured buy-in and commitment to the process as we identified desired or required program needs and changes. A “course” shell was established in the MTSU learning management system (LMS) for the collection of scorecard artifacts. Early in the evaluation process, the committee chair also participated in a 7-day QS workshop offered by OLC and reported on that experience to the committee.

Because institutions have approximately six months to submit their materials after purchasing the QS, the strategy was first to determine the institution’s standing on the categories and criteria. The plan was to work through all rubrics, complete the self-evaluation, identify needed improvements, and then decide whether a formal review would likely be successful. The QS assigns an institution one of several levels of accomplishment based on total points earned: unacceptable, inadequate, marginal, acceptable, or exemplary. If the self-evaluation indicated that MTSU was close to the point cut-off for one of the higher levels of accomplishment, purchasing QS access and organizing and submitting supporting documentation for formal review would be in order. Even if the self-review yielded a lower level of accomplishment, MTSU still planned to purchase QS access in order to use the process to improve its online programming.
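
To illustrate how totals map to levels, the following sketch encodes the point bands reported later in this paper (marginal = 70-79%, acceptable = 80-89%, and exemplary requiring at least 202 of the 225 possible points). The cut-off between the unacceptable and inadequate bands is not given in our materials, so the sketch lumps them together.

```python
MAX_POINTS = 225  # 75 indicators x 3 points each


def qs_level(points: int) -> str:
    """Map a QS total to an accomplishment level.

    Thresholds follow the bands reported in this paper: at least 202
    points (roughly 90%) for exemplary, 80-89% for acceptable, and
    70-79% for marginal. The cut-off between inadequate and
    unacceptable is not given here, so those bands are lumped.
    """
    if points >= 202:                # stated exemplary minimum
        return "exemplary"
    if points >= 0.80 * MAX_POINTS:  # 180 points
        return "acceptable"
    if points >= 0.70 * MAX_POINTS:  # 157.5, i.e., 158 or more points
        return "marginal"
    return "inadequate or unacceptable"


# MTSU's trajectory as described below: 161 (preliminary, marginal),
# 197 and then 200 after corrections (acceptable), 202 (final, exemplary).
for pts in (161, 197, 200, 202):
    print(pts, "->", qs_level(pts))
```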

The Review Process

Committee Meetings

The first meeting occurred approximately one month after the committee was established. We uploaded the QS Handbook (which includes a description of the nine rubrics) into this shell for review by committee members. Topics discussed at the initial meeting included the purpose of the QS and the potential for program improvement and accreditation compliance. The committee discussed the QS Handbook and rubrics and confirmed rubric assignments. We scheduled monthly meetings at which presentations on each rubric would be made.

The committee met monthly for the next seven months. Two or three presentations were originally scheduled for each two-hour meeting. However, because of the level of detail and discussion required, it was determined after the initial meeting that one-hour meetings focusing on a single scorecard category would be necessary. At these meetings, committee members made presentations on categories and indicators. Indicator rankings were frequently revised based on group questions, discussion, interpretation, and input. After each meeting, the committee made changes to identified areas when possible and/or forwarded requests for improvements to the appropriate campus personnel or unit.

The committee chair took extensive notes at each meeting and shared these afterwards with the other members. When questions arose that required QS contact person assistance, the committee chair submitted those questions and reported the responses to the committee. The following paragraphs describe some of these meetings and provide examples of the ways that we used the review to assess and improve our online programming.

The initial QS self-evaluation meeting addressed the Course Structure category. The committee allocated 15 out of 24 possible points to the various indicators within this category. During the evaluation, five issues for improvement were identified: (1) developing a university-wide process to tie learning outcomes to materials created by new course designers; (2) training on the course syllabus template; (3) ensuring that faculty response-time information was clear on the syllabus template; (4) creating consistency in online course design and use of the LMS course template; and (5) ensuring that faculty are aware of and have access to best practices and processes. To address these issues, the committee created new content or materials, identified supportive campus practices or policies located outside of MTSU Online, or reorganized existing resources to meet the QS standards.

At the next meeting, the committee reviewed the Course Development and Instructional Design category rubric. The committee initially allocated 12 of 36 possible points; however, during discussion, the point value was revised to 25. Areas that did not earn the maximum three points included: (1) course design to assist students in meeting learning outcomes; (2) copyright and fair use training for faculty; (3) course designer training for student-centered design; (4) assessing the LMS for usability; (5) consistency in course development for student retention; and (6) evaluating the effectiveness of current and emerging technologies. Once again, some of these missing items existed in locations or activities not directly tied to MTSU Online (such as the library or faculty technology center), and it was not initially possible to position them for more effective presentation within the online program.

At the Faculty Support category meeting, the review resulted in a score of 13 out of 18 possible points. We identified the need for end-of-semester surveys, as well as tutorials related to technical assistance, to improve the score. Later meetings indicated that the Institutional Support category was strong at our campus (23 out of 27 possible points), as were the Technology Support category (19 out of 21 possible points) and the Student Support category (43 out of 48 possible points).

Scorecard Reviews and Results

Following the seven months of meetings, other campus priorities consumed the work of the committee for the next six months. However, we had determined that our program reached the accomplished or exemplary levels on many of the scorecard category indicators. Thus, we decided to purchase QS access and begin organizing and submitting our supporting documentation for formal review. This work occurred during the summer and fall academic terms of the process’s second year (AY 2015-2016). The submission for the preliminary review occurred in December 2016. The results of this review were received approximately one month later. MTSU Online earned 161 points, which placed MTSU in the marginal category (70-79% of total possible points). However, based on the feedback received, we were confident that we could improve our score, at least into the acceptable category (80-89% of total possible points). In fact, based on experiences with the process to date, it was highly possible that MTSU could meet the exemplary endorsement category, which required at least 202 points (90% or higher of total possible points). The committee met shortly thereafter to examine the results of the review, with the knowledge that a mid-March deadline (8 weeks away) applied to the final review submission.

The OLC Review Report included the following sections: an executive summary (which stated that MTSU’s overall score may be revised with additional explanation of submitted artifacts); a list of the rubric categories comparing the scores of the institution’s self-assessment to those assigned by the OLC; the nine rubrics, including all indicators, and the OLC scores; and an extensive explanation of why each indicator in each category received the assigned scores from the reviewers. The last section was especially detailed and recommended the artifacts and information needed to earn the maximum number of points in the indicators that were not fully met.

As we revised the scorecard materials, we realized that we might actually be able to find the 41 points needed for exemplary status. To work toward this number, we first addressed indicators receiving 0 or 1 point in the review. The committee chair created a chart that summarized the points earned in the various categories and indicators and the areas for improvement.

In order to reach the accomplished (2) or exemplary (3) point levels for the lower-scoring indicators, it was necessary to identify or create resources for those indicators and then provide justification for how they now met the scorecard criteria. This was the part of the endorsement process where our campus made its most significant improvements to the organization and support of our online programming. During this time, we explored new and better ways to consolidate and present online resources to students (e.g., updating tutorials, research guides, and training); filled gaps in online faculty training and development (e.g., offering workshops on emerging technologies); and clarified general practices and policies for both students and faculty (e.g., creating a new student-centered MTSU Online website, revising the online course syllabus template, and establishing specific assignment feedback deadlines).

At the end of the 8-week revision process, the new scorecard was submitted with supporting artifacts and resources. This resubmission included information about the areas where we demonstrated support for additional points. These improvements included creating recruitment brochures for online students and programming, formalizing the issue and complaint process for online students, and adding links to supporting materials in the LMS and on the program website. Within a month, MTSU Online received the second scorecard results. MTSU’s total was 197 points, which qualified it for “full endorsement” status for a 2-year period. Because this score was so close to the 202 points required for the exemplary endorsement, the committee reviewed the second scorecard summary and found two calculation errors; correcting these gave MTSU 200 points (only two points from exemplary status). The OLC graciously agreed to give MTSU 30 days to find the two additional points needed for this top status. During that period, the committee reviewed the remaining areas where improvement could be demonstrated and was able to document an additional two points. Approximately one month later, MTSU Online received notification that its program had achieved the OLC exemplary endorsement, with a final score of 202 points (90%).

Program Improvements and Lessons Learned

The most valuable lesson learned from participating in the OLC endorsement process was that MTSU had already done much to create an exemplary online program. We thought that our online programming was strong prior to using the QS. However, focusing on the scorecard categories and indicators identified several areas for improvement by providing MTSU with a structured and detailed look at how its online program was organized and promoted. Although MTSU Online is a centralized program, resources for online teaching and learning were scattered across campus divisions and websites. Also, some of the training, development, and planning resources used by the program were spread across different units. The QS process provided a template not only to “pull everything together” but also to identify and fill the gaps that were uncovered. The process provided direction toward greater emphasis on continuous and strategic improvement, which has proven especially beneficial for online educational programming (Simunich, 2015). Through the evidence provided by the QS review and feedback, buy-in for program improvement from administration, faculty, and department chairs was enhanced.

After the final submission of the QS materials, we appreciated the flexibility of the OLC reviewers. Although our immediate focus was on obtaining the few additional scorecard points needed to reach exemplary status, using the QS categories and indicators “nudged” MTSU toward additional improvements in its online programming in order to meet the desired standards and benchmarks. Table 3 presents a list of the major program improvements that resulted from the review process. Areas that support MTSU Online work more cohesively now than before the review process. In addition, an Online Program Quality Coordinator is being hired to facilitate and improve MTSU’s OFM program. The person in this position will be charged with developing and implementing an online instructor and developer certificate program, managing our 3-year online course redesign program, and working with campus legal counsel on online course third-party vendor issues (e.g., ensuring protection of student educational records when faculty require completion of course materials outside our LMS).

Some other initiatives that emerged out of the QS process include the development of a common survey for online instructors to assess student satisfaction with the online environment and further organization of online instructional resources. MTSU has also recently hired two accessibility specialists who work with online (and other) faculty to improve course and campus content and information. The new online teaching certificate program will replace an external program used to qualify faculty to develop and teach online courses. MTSU will also develop an online course for new OFMs that will complement the current process of working with a seasoned OFM.

Table 4 lists the major lessons learned from our application process that other institutions might find useful. First, those who are considering purchasing the QS should register for the scorecard workshop. In addition to teaching participants about the QS categories and indicators, the workshop’s self-evaluation component helps determine the status of one’s online program and provides an opportunity to make improvements and corrections before purchasing the QS.

Second, a course shell in our LMS was established in order to collect artifacts and facilitate the final submission of these materials for the endorsement. As it turns out, this did not work as well as we had hoped. Only three committee members uploaded their artifacts, and when it was time to upload all supporting materials to the scorecard, the committee chair ended up having to follow up with committee members to complete the upload. However, there was a positive side to this problem. In following up, the chair learned a great deal about various institutional supports for online programming and identified other issues that had not been discussed during the committee meetings and presentations.

Third, the review process allowed MTSU to be strategic in its response to gaps in our online programming. A “big picture” view of the status of our program was obtained. Then, with the help of the benchmarks and standards from the QS, areas for improvement were identified. Making these improvements would have been difficult or impossible without the comprehensive, structured review of this program. The exercise involved both administrative and faculty input and feedback—all institutions need to be cognizant of the balance of these sources (e.g., Tannehill et al., 2018).

Fourth, if an institution moves forward with the application for a formal online quality designation, we recommend a committee structure similar to the one we used. If a committee is used, institutions should encourage members to offer assistance, expertise permitting, on categories to which they are not assigned. If such a committee is not feasible, it is still imperative to have buy-in from those who are invested in online education and who have expertise and knowledge of the various categories and indicators. Institutions will also benefit by seeking information from administrators, faculty, and staff who are not involved in the process. In our experience, the campus community was happy to help with this type of endeavor and appreciated the value that a successful outcome would bring to the university.

Finally, as noted above, the OFM program has served as the linchpin for MTSU’s distance education programming. Although this is primarily a faculty-driven, peer-mentoring and review program, there are other approaches to online faculty development that institutions can consider. For example, Dittmar and McCracken (2012) described a comprehensive, decentralized model that includes customized mentoring, continuous professional engagement, technology integration, and multifaceted assessments. Creating a program similar to one of these models would be very beneficial as institutions strive to achieve an exemplary program designation.

Conclusion

At a time when online courses and programs are proliferating internationally, competition for an institution’s online students is intense. Obtaining an Exemplary Program endorsement is a way for institutions to efficiently and effectively review their programs and have them peer reviewed. The process of review and the endorsement itself enhance a program and position it to compete more effectively with the other online course and program options that are available to students. As MTSU Online celebrates 21 years, the OLC endorsement provides a tangible and well-respected standard of excellence that can be used to promote the quality of the university’s online program.

As this case study demonstrates, the process of obtaining the Exemplary Administration of Online Programs endorsement was not easy. Many faculty, staff, and administrators worked hard to ensure that our institution met the criteria. Our institution also benefitted from a long history of faculty buy-in and administrative support for online teaching and programs. Other institutions are encouraged to consider utilizing the Quality Scorecard (or a similar certification program), primarily because of the ways that the process can help an online program to continue improving its quality and support.

 

Acknowledgement

The authors wish to thank Barbara Draude, Faye Johnson, and Dianna Rust for their comments and suggestions on an earlier version of this manuscript. Readers who are interested in more specific information about the process and feedback may contact the authors.

 


References

Adams, C. L., Rust, D. Z., & Brinthaupt, T. M. (2011). Evolution of a peer review and evaluation program for online course development. In J. E. Miller & J. E. Groccia (Eds.), To improve the academy: Resources for faculty, instructional, and organizational development (Vol. 29, pp. 173-186). San Francisco, CA: John Wiley & Sons.

Baldwin, S. J., & Trespalacios, J. (2017). Evaluation instruments and good practices in online education. Online Learning, 21(2). doi:10.24059/olj.v21i2.913

Brinthaupt, T. M., Fisher, L. S., Gardner, J. G., Raffo, D. M., & Woodard, J. B. (2011). What the best online teachers should do. Journal of Online Learning and Teaching, 7(4), 515-524. Retrieved from http://jolt.merlot.org/vol7no4/brinthaupt_1211.htm

Deming, D. J., Goldin, C., Katz, L. F., & Yuchtman, N. (2015). Can online learning bend the higher education cost curve? The American Economic Review, 105(5), 496-501. doi:10.1257/aer.p20151024

Dittmar, E., & McCracken, H. (2012). Promoting continuous quality improvement in online education: The META model. Journal of Asynchronous Learning Networks, 16(2), 163-176.

Friedman, J. (2016, November 11). Accreditation of online degree programs: Frequently asked questions. US News and World Report. Retrieved from https://www.usnews.com/education/online-education/articles/2016-11-11/accreditation-of-online-degree-programs-frequently-asked-questions

Keil, S., & Brown, A. (2014). Distance education policy standards: A review of current regional and national accrediting organizations in the United States. Online Journal of Distance Learning Administration, 17(3). Retrieved from http://www.westga.edu/~distance/ojdla/fall173/keil_brown173.html

Kolowich, S. (2009). Recession may drive more adult students to take online courses. The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/article/Recession-May-Drive-More-Adult/21971

Mast, L. J., & Gambescia, S. F. (2015). Assessing online education and accreditation for healthcare management programs. Journal of Health Administration Education, 32(4), 427-467.

Moore, J. C. (2012). A synthesis of Sloan-C effective practices, December 2011. Journal of Asynchronous Learning Networks, 16(1), 91-115.

Moore, J. C., & Shelton, K. (2013). Social and student engagement and support: The Sloan-C Quality Scorecard for the Administration of Online Programs. Journal of Asynchronous Learning Networks, 17(1), 53-72.

Rust, D. Z., Brinthaupt, T. M., & Adams, C. (2017). Using technology to enhance student and faculty success in online courses. In S. Mukerji & P. Tripathi (Eds.), Handbook of research on technology-centric strategies for higher education administration (pp. 195- 209). Hershey, PA: IGI Global.

Shelton, K., & Pedersen, K. (2015). Benchmarking quality in online learning programs in higher education. In Proceedings of Global Learn Berlin 2015: Global Conference on Learning and Technology (pp. 280-295). Berlin, Germany: Association for the Advancement of Computing in Education (AACE).

Simunich, B. (2015). Speaking personally—With Ron Legon. American Journal of Distance Education, 29(3), 220-226. http://dx.doi.org/10.1080/08923647.2015.1059624

Tannehill, D. B., Serapiglia, C. P., & Guiler, J. (2018). Administrative or faculty control of online course development and teaching: A comparison of three institutions. Information Systems Education Journal, 16(3), 26-34.

Truman, B. E. (2004). UCF’s exemplary faculty support: An institutionalized ecosystem. Journal of Asynchronous Learning Networks, 8(3), 89-96.

Varvel, V. E., Lindeman, M., & Stovall, I. K. (2003). The Illinois online network is making the virtual classroom a reality: Study of an exemplary faculty development program. Journal of Asynchronous Learning Networks, 7(2), 81-95.

Online Journal of Distance Learning Administration, Volume XXII, Number 2, Summer 2019
University of West Georgia, Distance Education Center