Students' Expectations vs. Experiences of Course Design in Online Writing Courses

Concurrent Session 8
Streamed Session Research


Brief Abstract

Understanding students' expectations and experiences of course design in online courses is an important part of assessing an online program and planning interventions. This session presents results from an online writing program assessment conducted in fall 2018, followed by a discussion of actionable responses to the data collected.

Presenters

Catrina Mitchum is a Lecturer in the Department of English at the University of Arizona. She earned her PhD in Composition/Rhetoric and Digital Studies from Old Dominion University. In 2018, she and her collaborators were awarded the CCCC Research Initiative Grant. Her research interests are in retention and course design in online writing classes. She has published scholarly work in The Journal of Teaching and Learning with Technology, MediaCommons, and Enculturation. She teaches first-year writing and professional and technical writing courses online.

Extended Abstract

In the fall semester of 2018, the writing program at the University of Arizona had officially been online for three years. The program uses a flexible pre-designed course system: the pre-designed course is required the first time an instructor teaches online and can then be adapted and adopted as needed. To assess the impact of the program, the Online Writing Program Administrator (OWPA) decided to survey students to determine their backgrounds and expectations and to compare those with how they experienced the online writing environment. Student expectations can affect student motivation, attitude, and behaviors (Campbell & Mislevy, 2012; Roberts & Styron, 2006; Bean & Metzner, 1985; Ames & Archer, 1988), which in turn can influence overall student retention (Pleitz et al., 2015; Bean & Metzner, 1985; Friedman & Mandel, 2011; Moore et al., 2003).


These surveys were based on a multi-institutional research project funded by the Conference on College Composition and Communication. All of the surveys were built into the pre-designed courses (PDCs) as a required part of the course for a small number of homework points. The initial surveys asked students what they expected from their online classes: for example, how much interaction they expected with the instructor and classmates, what they expected regarding pace and due dates, and which course elements they expected to encounter. The surveys also asked about students' backgrounds, including computer skills, current work situations, and linguistic background. The initial surveys also collected demographic information such as first-generation college student status, gender, current credit hours, classification, and ethnicity. To track student experiences, the end-of-term surveys asked questions in similar categories, such as how much interaction took place and of what type, and which computer skills were required. Final grades (including withdrawals) were also collected.


To provide immediate feedback to students, some questions were followed by “Tips for Success.” These tips ranged from “This isn’t a self-paced course; please see your D2L course for a schedule” to a detailed list of commonly used online terminology. Some of the descriptive aggregate results were sent to the instructors of those courses, along with suggestions for how to act on the information.


In this presentation, we’ll discuss the results and analysis of questions that asked students about their expectations for course design and engagement in an online environment. These assessment results will contribute to scholarship on students’ preferences for different course elements. For example, Ausburn (2004) surveyed 67 adult blended learners and found “course announcements and reminders from instructor, course information documents, information about assignments and instructions for completing them, and course instructional/content documents and materials (handouts, PowerPoint slides, Internet site)” to be the most important features in an online course (p. 330). Similarly, Brinkerhoff and Koroghlanian (2007) found that students valued detailed instructions for setting up course technologies, a face-to-face orientation meeting, telephone technical support, weekly synchronous meetings, and a general Q&A area/function within the course (p. 388).


The presentation portion of the panel will walk through the data analysis while asking the audience to:

  • consider what the results might suggest about the courses that are being assessed, 

  • reflect on how that information might be used,

  • consider how online programs might work to create a system of assessment and feedback,  

  • reflect on possible interventions or how this information can be used to improve the student experience, and 

  • ask questions.

During the five-minute reflection and freewrite, the presenters will ask attendees to think about the types of course design expectations they believe their students may have, and how those assumptions compare and contrast with our data. Presenters will also prompt attendees to list the types of questions they might ask their own students about their course design expectations. Presenters will use Sli.do, which allows attendees to ask and upvote their favorite ideas, suggestions, and questions to be discussed during both the presentation and the Q&A. Presenters will share copies of study protocols and instruments as well as a list of relevant references.


References

Ames, C., & Archer, J. (1988). Achievement Goals in the Classroom: Students’ Learning Strategies and Motivation Processes. Journal of Educational Psychology, 80(3), 260–267.

Ausburn, L. J. (2004). Course design elements most valued by adult learners in blended educational environments: An American perspective. Educational Media International, 41(4), 327–337.

Bean, J. P., & Metzner, B. S. (1985). A Conceptual Model of Nontraditional Undergraduate Student Attrition. Review of Educational Research, 55(4), 485–540.

Brinkerhoff, J., & Koroghlanian, C. M. (2007). Online students’ expectations: Enhancing the fit between online students and course design. Journal of Educational Computing Research, 36(4), 383–393.

Campbell, C. M., & Mislevy, J. L. (2012). Student Perceptions Matter: Early Signs of Undergraduate Student Retention/Attrition. Journal of College Student Retention: Research, Theory and Practice, 14(4), 467–493. https://doi.org/10.2190/CS.14.4.c

Friedman, B. A., & Mandel, R. G. (2011). Motivation Predictors of College Student Academic Performance and Retention. Journal of College Student Retention: Research, Theory and Practice, 13(1), 1–15. https://doi.org/10.2190/CS.13.1.a

Moore, K., Bartkovich, J., Fetzner, M., & Ison, S. (2003). Success in Cyberspace: Student Retention in Online Courses. Journal of Applied Research in the Community College, 10(2), 107–118.

Pleitz, J. D., MacDougall, A. E., Terry, R. A., Buckley, M. R., & Campbell, N. J. (2015). Great Expectations: Examining the Discrepancy Between Expectations and Experiences on College Student Retention. Journal of College Student Retention: Research, Theory & Practice, 17(1), 88–104. https://doi.org/10.1177/1521025115571252

Roberts, J., & Styron, R. (2006). Student satisfaction and persistence: factors vital to student retention. Research in Higher Education Journal, 6, 1–18.