What Students Want: Taking Instructional Design in Graduate Online Courses Beyond Best Practices, Principles, and Standards.

Concurrent Session 1


Brief Abstract

When our team started developing an online course template for a graduate professional program, we discovered that evidence-based principles, published standards, and best practices were not enough: students wanted more. We set out to find new ways to make our course design even more usable and more useful.

Presenters

Jerzy "George" Jura is the Director of Academic Technology at the U of Wisconsin-Madison School of Nursing (SoN) where he leads a team of academic technology staff in helping faculty identify, explore, and implement technology-enhanced teaching & learning solutions that best support and complement the School's undergraduate and graduate programs. The driving force behind his work is the realization that even the most sophisticated technology is useless, if people who could benefit from it don't know how to use it effectively; consequently, his passion is making information about technology easier to find and understand, and making technology easier to use. His primary long-term interests include evidence-based instructional design and course development, the use of student-response systems to increase student engagement in large undergraduate courses, and helping faculty develop generative learning activities that promote understanding. His immediate present focus is on leading the UW-Madison School of Nursing academic technology team in the planning and design of courses for two new graduate (Doctor of Nursing Practice) programs to be launched in the fall of 2021. In the past, he has lead campus-wide technology training initiatives for students and University staff at U of Wisconsin-Whitewater, as its first Technology Advancement and Training Coordinator, including managing the UW-W TechQuest project, an innovative online technology training for incoming students. Before focusing entirely on instructional technology, George developed broad teaching expertise as a faculty member at Iowa State U. (Ames, IA), and Lawrence U. (Appleton, WI), where he taught numerous technology-enhanced courses on Spanish Mediaeval & Renaissance Literature; Spanish Art; Film and Narrative Theory; and Linguistics. He shared his ideas on teaching with technology with audiences at numerous forums, including presentations and workshops at several EDUCAUSE National, and Midwest-Regional Conferences; ELI Annual Meetings; National Forums on Active Learning Classrooms; Distance Learning Conferences (Madison, WI); and Modern Language Association annual conferences.

Extended Abstract

The gist of it: When our team started developing an online course template for a graduate professional program, we observed and talked to students, informally following the general principles of an approach known as “design thinking”: a structured combination of observation and interviewing that pays attention to what people do and how they do it, and asks them to clarify why, especially when a behavior seems surprising or illogical to the observers (with the understanding that it is usually the observers/interviewers who are missing an important point of a process that is otherwise logical or obvious to the end users being interviewed). In this approach, the design team then analyzes and interprets these interactions and feedback, and implements changes that remove “friction points” for end users (in our case, students or faculty).

In the process, we discovered that the (1) best practices, (2) evidence-based principles, and (3) published standards (I will refer to all of these generally as “guidelines”) that our team strove to promote and implement as far as feasible in our online courses were not enough: students wanted more. More importantly, we discovered that some features of online courses that seemed important to us, and that were carefully implemented, often at significant effort, to comply with guidelines 1-3 above, were perceived as unimportant by students. Infrequently, but surprisingly, some of these elements were even perceived as “getting in the way” of learning, and were not appreciated. Logically, we set out to discover new, sometimes simple, and sometimes surprising ways which, as students reassured us, would make our online courses more usable and more useful. This presentation is an overview of that journey, and it focuses on practical, usable, and easily implemented elements of online courses which, while hardly mentioned among guidelines 1-3 above, seem to improve (subjective) student experience in online courses.

This presentation begins with a brief overview of our understanding of best practices, principles, and standards, especially as they relate to Kirkpatrick’s four levels of training evaluation. We look at typical, popular examples of these types of guidelines and explore their general merits. Next, we offer examples of guidelines that we implemented but that our students found, surprisingly to us, not to be useful. We explore the specific examples, our solutions and necessary compromises, and the reasons for this apparent contradiction. In the process, we stress the importance of taking into account the local contextual factors that shape each course or program of study and that help decide which guidelines (1-3) should be implemented as part of the design of a specific course or program.

We then proceed to explore “guidelines” that are not often discussed or mentioned (and even less often implemented in actual courses), but which our students perceive to be useful or even essential. Often, these are small technical elements of course presentation, such as using consistent file names (built around a specific naming scheme) for files that students download during the course, or providing page counts (and estimated reading times) for reading assignments and run times for multimedia materials students need to listen to or watch. At other times, they include the realization that student perceptions are positively affected by shifting the language used to describe course materials or assignments from a normative-administrative tone to an informative/explicative one (for example, students responded positively to changing labels from “required” to “essential (for basic understanding of content)”). We also discuss a third type of student-requested “guidelines,” which we call “functional”: these focus on characteristics of course study materials and specific features of electronic files, such as the degree to which content can be manipulated or interacted with (for example, a convenient way to perform a “YouTube-style, 10-second rewind” in videos, the ability to view or listen to multimedia at a viewer- or listener-selected speed, or the ability to highlight and annotate reading materials in the course). We conclude this exploration by discussing a chart, based on a student survey, of the relative usefulness of each of these “new guidelines” as perceived by students, with the intent of providing the audience with a list of design elements (ranked by degree of student preference) that they can implement in their own practice.
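By way of illustration, here is a minimal Python sketch of how a design team might generate consistent, information-rich file names and reading-time estimates of the kind described above. The course code, naming scheme, and the 200-words-per-minute reading speed are our illustrative assumptions, not the presenters' actual conventions.

```python
import math

# Assumed average reading speed; adjust for the audience and material.
WORDS_PER_MINUTE = 200

def course_file_name(course: str, module: int, kind: str, title: str, pages: int) -> str:
    """Build a consistent, information-rich file name,
    e.g. 'N701_M03_reading_pain-assessment_12pp.pdf' (hypothetical scheme)."""
    slug = title.lower().replace(" ", "-")
    return f"{course}_M{module:02d}_{kind}_{slug}_{pages}pp.pdf"

def estimated_reading_minutes(word_count: int, wpm: int = WORDS_PER_MINUTE) -> int:
    """Round the estimate up to whole minutes so students can plan their time."""
    return math.ceil(word_count / wpm)

print(course_file_name("N701", 3, "reading", "Pain Assessment", 12))
# -> N701_M03_reading_pain-assessment_12pp.pdf
print(f"Estimated reading time: {estimated_reading_minutes(4800)} min")
# -> Estimated reading time: 24 min
```

Embedding the page count directly in the file name keeps that information visible to students even after a file has been downloaded out of its course context, and rounding reading-time estimates up to whole minutes errs on the side of realistic planning.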

Finally, we propose that the best way to understand, evaluate, and, in the future, possibly anticipate some of these “surprise” design elements, which are absent from most guidelines but perceived by students as important, is to expand the traditional understanding of graduate online courses in terms of “andragogy” to understand them also in terms of “heutagogy”: a term that roughly corresponds to the concept of self-determined learning, as proposed by Hase and Kenyon in their 2013 “Self-determined Learning: Heutagogy in Action” (Bloomsbury), and that derives from a re-configuration of “concepts such as constructivism, capability, andragogy and complexity theory” supported by neuroscience research.

The presentation will include several interactive elements (using the student-response software Socrative) that will offer the audience the opportunity to test their familiarity with existing, popular “best practices” and consortium standards for online courses; another activity will ask participants to anticipate which of the “student-generated” guidelines for online courses were ranked high or low, and then to compare their assumptions with the actual numeric/chart results based on the student survey.