Moving from Lecture Capture to Blended: Evaluating the Impact on Students and Faculty

Concurrent Session 9

Brief Abstract

This session will report results of the evaluation of a major effort to redesign the University of Central Florida's core College of Business Administration courses from lecture capture to a blended, active learning format. The impact on students, faculty, and the academy will be discussed.

Presenters

Patsy Moskal is the Director of the Digital Learning Impact Evaluation in the Research Initiative for Teaching Effectiveness at the University of Central Florida (UCF), where she evaluates the impact of technology-enhanced learning and serves as the liaison for faculty scholarship of teaching and learning. In 2011, Dr. Moskal was named an OLC Fellow in recognition of her groundbreaking work in the assessment of the impact and efficacy of online and blended learning. She has written and co-authored numerous works on blended and online learning and is a frequent presenter on these topics. Her co-authored book, Conducting Research in Online and Blended Learning: New Pedagogical Frontiers (with Dziuban, Picciano, and Graham), was published in August 2015. She currently serves on the OLC Board of Directors.

Extended Abstract

Background

The College of Business Administration (CBA) at the University of Central Florida (UCF) is conducting ongoing research to examine the effectiveness of a new course format called Reduced Seat Time, Active Learning (RA). This blended learning format uses online learning as the primary instructional mode, with 20% of the instructional time in the semester delivered in a live, active-learning format.

UCF's CBA is one of the largest in the country, with over 8,000 students and select core courses required of all students. To accommodate this enrollment, the college has relied on a lecture capture modality for course delivery. This instructional method streams recorded lectures to students online; because the CBA recorded the live lectures, students could still attend the face-to-face sessions if they wished. However, faculty found that no more than 10% attended at any given time, typically on the first day of classes or before difficult exams. The vast majority of students watched the recorded lectures online at a later time.

Lecture capture provided little opportunity for interaction among students or between students and faculty, and it made active learning group work challenging. The RA format is replacing the lecture capture format, giving students more opportunities to interact with each other and with the instructor, as well as the potential for applied, active learning activities completed in the face-to-face sessions. Limiting each course to five face-to-face sessions per semester keeps the format feasible at the college's scale, given the classroom space available on campus.

Focus of the Present Research

This novel instructional change was supported by a number of university units, including the Center for Distributed Learning, which supports the online and blended learning modalities, and the Faculty Center for Teaching and Learning, which provides professional development support and guidance for active learning.

UCF has a long history of evaluating the impact of instruction on students, faculty, and the academy through the Research Initiative for Teaching Effectiveness (RITE), which has conducted evaluation work on online, blended, and adaptive learning (Moskal, 2017; Dziuban, Howlin, Johnson, & Moskal, 2017; Dziuban, Moskal, Cavanagh, & Watts, 2012; Moskal et al., 2006). RITE is supporting the evaluation of the RA modality with input from CBA faculty and administrators.

Impact of Course Redesign Effort on Students

In addition to comparing student success (A, B, C grades) and withdrawal rates with past semesters for the 8 redesigned courses, a student survey was developed in spring 2018 to measure students' use of, and reactions to, the course videos, the online components, and the face-to-face, active learning class sessions. Likert items assessed students' frequency of use and perceived usefulness of the various instructional resources in the online portion of their class (e.g., textbooks, adaptive learning, videos, quizzes, games, simulations, etc.). Videos were rated on perceived quality, ability to stimulate interest, length, and usefulness. Students were also asked about their interaction with the course elements to capture their perceived interaction and experiences with the online components as well as the group work conducted in face-to-face sessions. Demographic information, including travel time and employment, was collected to provide additional context for variables that might affect students' course engagement. In addition to the Likert items, students were asked what they liked most, liked least, and what changes they would recommend to improve the course videos, online components, and active learning sessions. These open-ended responses will be content analyzed and aggregated into common themes.
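
As an illustration only, the short sketch below shows how success (A, B, C) and withdrawal rates of the kind described above could be tabulated from a grade roster. The column names, grade coding, and use of pandas are assumptions made for this example, not a description of the study's actual analysis pipeline.

    # Illustrative sketch only: per-course success and withdrawal rates
    # from a hypothetical grade roster ("W" marks a withdrawal).
    import pandas as pd

    roster = pd.DataFrame({
        "course": ["ECO2013", "ECO2013", "ECO2013", "MAR3023", "MAR3023"],
        "grade":  ["A", "C", "W", "B", "F"],
    })

    success_grades = {"A", "B", "C"}

    # Share of A/B/C grades and share of withdrawals in each course.
    summary = roster.groupby("course")["grade"].agg(
        success_rate=lambda g: g.isin(success_grades).mean(),
        withdrawal_rate=lambda g: (g == "W").mean(),
    )
    print(summary)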

Impact of Course Redesign Effort on Faculty

A similar survey was developed to capture faculty reactions to the new course format, with the intent of measuring experiences and variables that might be useful to future faculty who will be redesigning their course materials. Faculty were also asked to provide feedback regarding the support they received and the support they feel is needed to provide a quality course experience to their students in the new format. They were asked for specifics regarding their use of group active learning activities and encouraged to provide details about their perceived successes and challenges in creating a meaningful face-to-face class session. Open-ended questions again captured their feedback on what worked well, what did not, and any changes they will make in future course iterations. These open-ended responses will be content analyzed, aggregated into common themes, and provided to faculty currently scheduled to redesign future courses.
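
As a purely illustrative sketch of the theme aggregation step, the snippet below tallies hypothetical codes assigned to open-ended responses; the theme labels and the one-code-per-response simplification are invented for this example.

    # Illustrative sketch only: counting hypothetical theme codes assigned
    # during content analysis of open-ended responses.
    from collections import Counter

    coded_responses = [
        "video_length", "group_work_logistics", "video_length",
        "technical_issues", "group_work_logistics", "video_length",
    ]

    # Tally how often each theme appears, most common first.
    for theme, count in Counter(coded_responses).most_common():
        print(f"{theme}: {count}")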

The student and faculty surveys were developed using Qualtrics survey software (www.qualtrics.com). Faculty were encouraged, but not required, to give extra credit for students who participated in the survey; one course instructor did not do so. In spring 2018, eight faculty taught courses in the RA modality with 6,338 students. Overall, 3,232 student survey responses were received, yielding a respectable response rate of 51%. Seven of the 8 faculty completed the faculty survey, a response rate of 88%.
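
For clarity, the response rates above are simple proportions of respondents to those surveyed; the short snippet below reproduces that arithmetic from the figures reported in this paragraph.

    # Figures from this paragraph: 3,232 of 6,338 students and 7 of 8 faculty responded.
    student_responses, students_enrolled = 3232, 6338
    faculty_responses, faculty_teaching = 7, 8

    print(f"Student response rate: {student_responses / students_enrolled:.0%}")  # ~51%
    print(f"Faculty response rate: {faculty_responses / faculty_teaching:.0%}")   # ~88%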

In addition to the initiative-specific surveys, UCF administers a Student Perception of Instruction (SPI) form each semester, in an online format, for all courses regardless of modality. These student ratings are high stakes, as administrators use them in faculty promotion, tenure, and award decisions. The SPI data are being examined for any impact the course redesign may have had on students' ratings of these courses. Because CBA core courses are historically taught by the same faculty each semester, prior semesters will be compared with the current iteration to determine whether faculty experienced gains or losses in their student ratings.
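
As a hypothetical illustration of such a before-and-after comparison, the sketch below pairs each instructor's prior-semester mean SPI rating with the rating from the redesigned semester. The rating values, the per-instructor pairing, and the choice of a paired t-test are all assumptions for illustration, not the study's specified method.

    # Illustrative sketch only: hypothetical mean SPI ratings (5-point scale)
    # for the same eight instructors before and after the redesign.
    from statistics import mean
    from scipy import stats

    prior_semesters = [4.1, 3.8, 4.3, 3.9, 4.0, 4.2, 3.7, 4.4]   # lecture capture
    ra_semester     = [4.2, 4.0, 4.1, 4.1, 3.9, 4.4, 3.8, 4.5]   # redesigned (RA)

    # Paired comparison of per-instructor ratings across the two formats.
    t_stat, p_value = stats.ttest_rel(ra_semester, prior_semesters)
    print(f"Mean change: {mean(ra_semester) - mean(prior_semesters):+.2f}")
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")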

Pending Results

Survey analysis is ongoing at the time of this submission, with results to be available by the end of summer 2018, in time for the Accelerate conference. We are awaiting student success and withdrawal rates from institutional records and Student Perception of Instruction data from computer services. Preliminary survey results indicate that 67% of students rated the courses as good or excellent. Students and faculty provided detailed open-ended feedback, which creates a challenge for analysis but yields rich perception data.

Benefit to OLC attendees

Major changes in a college can be challenging for students and faculty. Moving from a fully online, lecture capture modality to a blended, active learning format was a major initiative for UCF's College of Business Administration. We have found that ongoing evaluation can provide rich data to help improve the experience for students and faculty. Critical to this evaluation is providing the results of this research to administrators, support units, and faculty so that issues can be addressed and future iterations informed (Moskal, Dziuban, & Hartman, 2013). We expect rich data from our large-scale evaluation analyses that can provide valuable information for those considering or embarking on similar blended learning course redesign efforts.

References

Dziuban, C., Howlin, C., Johnson, C., & Moskal, P. (2017, December 18). An adaptive learning partnership. EDUCAUSE Review.

Dziuban, C., Moskal, P., Cavanagh, T., & Watts, A. (2012). Analytics that inform the university: Using data you already have. Journal of Asynchronous Learning Networks, 16(3), 21-38.       

Moskal, P. (2017). Evaluating the outcomes and impact of hybrid courses. In K. E. Linder & D. L. Fontaine (Eds.), New directions for teaching and learning: Hybrid teaching and learning. San Francisco, CA: Jossey-Bass.

Moskal, P., Dziuban, C. D., Upchurch, R., Hartman, J., & Truman, B. (2006). Assessing online learning: What one university learned about student success, persistence, and satisfaction. Peer Review, 8(4).

Moskal, P., Dziuban, C., & Hartman, J. (2013). Blended learning: A dangerous idea? Internet and Higher Education, 18, 15-23.