What Students Want: Taking Instructional Design in Graduate Online Courses Beyond Best Practices, Principles, and Standards.
Concurrent Session 1
When our team started developing an online course template for a graduate professional program, we discovered that evidence-based principles, published standards, and best practices were not enough: students wanted more. We set out to discover new ways to make our course design even more usable and more useful.
The gist of it: When our team started developing an online course template for a graduate professional program, we observed and talked to students, informally following the general principles of an approach known as "design thinking": a structured combination of observation and interviewing that pays attention to what people do and how they do it, and asks for clarification of why, especially when a behavior seems surprising or illogical to the observers (with the understanding that it is usually the observers/interviewers who are missing an important point of the process, which is otherwise logical or obvious to the end-users being interviewed). This phase is usually followed by the design team analyzing and interpreting these interactions and feedback, and then implementing changes that remove "friction points" for end-users (in our case, students or faculty).
In the process, we discovered that many (1) best practices, (2) evidence-based principles, and (3) published standards (I will refer to all of these, generally, as "guidelines") - many of which our team strove to promote and implement as much as feasible in our online courses - were not enough: students wanted more. More importantly, we discovered that some features of online courses that seemed important to us, and were carefully implemented - often through significant effort - to comply with 1-3 above, were perceived as unimportant by students. Infrequently, but surprisingly, some of these elements were even perceived as "getting in the way" of learning, and were not appreciated. Logically, we set out to discover new, sometimes simple, and sometimes surprising ways which - as students reassured us - would make our online courses more usable and more useful. This presentation is an overview of that journey, and it focuses on practical, usable, and easily implemented elements of online courses which - while hardly mentioned among 1-3 above - seem to improve (subjective) student experience in online courses.
This presentation begins with a brief overview of our understanding of best practices, principles, and standards, especially in how they relate to Kirkpatrick's levels of performance evaluation. We look at typical, popular examples of these types of guidelines and explore their general merits. Next, we offer examples of guidelines that we implemented, and that our students found - surprisingly for us - not to be useful. We explore the specific examples, our solutions as well as necessary compromises, and the reasons for this apparent contradiction. In the process, we stress the importance of taking into account local contextual factors that affect the specific nature of each course or program of study, and that help decide which elements (1-3) should be implemented as part of the design of a specific course or program.
We then proceed to explore "guidelines" that are rarely discussed or mentioned (and even less often implemented in actual courses), but which our students perceive as useful or even essential. Often, these are small technical elements of course presentation, such as using consistent file names (with specific information included in the naming scheme) for files that students download during the course, or providing page counts (and estimated reading times) for reading assignments and run times for multimedia materials students need to listen to or watch. At other times, they include the realization that student perceptions are positively affected by shifting the wording used to describe course materials or assignments from a normative-administrative tone to an informative/explicative one (for example, the positive perception of changing labels from "required" to "essential (for basic understanding of content)"). We also discuss a third type of student-requested "guidelines" that focus on characteristics of course study materials and specific features of electronic files that we call "functional": the degree to which content can be manipulated or interacted with (for example, a convenient way to perform a "YouTube-style" 10-second rewind in videos, the ability to view or listen to multimedia at viewer/listener-selected speed, or the ability to highlight and annotate reading materials in the course). We conclude this exploration by discussing a student-survey-based chart with relative measures of the usefulness of each of these "new guidelines," as perceived by students, with the intent of providing the audience with a list (ranked by degree of student preference) of design elements that can be implemented in their own practice.
Finally, we propose that the best way to understand, evaluate, and - in the future - possibly anticipate some of these "surprise" design elements that are absent from most guidelines, but are perceived by students as important, is to expand the traditional understanding of graduate online courses in terms of "andragogy" to an understanding also in terms of "heutagogy": a term that roughly corresponds to the concept of self-determined learning, as proposed and understood by Hase and Kenyon in their 2013 "Self-determined Learning: Heutagogy in Action" (Bloomsbury), and that derives from a re-configuration of "concepts such as constructivism, capability, andragogy and complexity theory" supported by neuroscience research.
The presentation will include several interactive elements (using the student-response software Socrative) that will offer the audience the opportunity to test their familiarity with existing, popular "best practices" and consortium standards for online courses; another activity will ask participants to anticipate which of the "student-generated" guidelines for online courses were ranked high or low, and then to compare their personal assumptions with the actual numeric/chart results based on the student survey.