Taking Matters into Our Own Hands: Instructional Design with Research in Mind

Concurrent Session 3
Research Equity and Inclusion

Brief Abstract

In this interactive session, we share how and why instructional designers must take educational research into their own hands. Working through a real-world case study, we engage participants in thinking about how to design, instrument, and evaluate a learning activity in order to drive evidence-informed changes that measurably improve a learning experience.

Presenters

Melissa Johnson wants to live in a world filled with accessible and engaging online experiences, beach sand that miraculously disappears when you get to your car, and cold brew coffee that's ready in 15 minutes, not 15 hours. As a head instructional designer with more than 13 years' experience, she's been fearless in conceptualizing and shaping content for a 21st-century online audience. From 34 highly customized training modules for a huge-social-media-platform-that-cannot-be-named to transformational learning experiences (yes, that's a thing) for the Gender Intelligence Group, The Ariel Group, and Great on the Job, Melissa takes online course content to the next level. When asked how she does it, she'll tell you, "I run fast with scissors." When she's not looking for the next great thing in online learning, you can find her tirelessly practicing to be the next Vivian Maier, indulging in as much live music as she can, or shamelessly catching up on General Hospital.

Extended Abstract

As instructional designers, we spend copious amounts of time and energy trying to build effective learning experiences, yet at the end of the day we rarely evaluate in any meaningful way whether a learning experience succeeded, or use robust evidence to identify opportunities to improve it. Important questions, such as “Are all students succeeding equitably?” and “Does student behavior match the assumptions built into the learning design?”, typically go unexplored and unanswered. And despite instructional designers’ commitment to popular design methodologies and increasing efforts to incorporate evidence-informed findings from the learning sciences, the truth is that each learning setting is so contextualized and unique that it is impossible to know whether even the best-designed learning experience is truly effective. To make matters worse, current approaches to educational research are too slow, too artificial, and too abstract to adequately guide the design of the real-world learning experiences that instructional designers are tasked with constructing.

What is an instructional designer to do?

Fortunately, the increasing availability of educational data, made possible by the widespread use of digital and online learning platforms, now allows instructional designers to take research into their own hands. Instructional designers can collect meaningful evidence of student learning and gain powerful insight into student behavior, perceptions, and attitudes. Rather than simply design learning experiences according to industry “best practices” and hope for the best, instructional designers can engage in applied education research, and we believe that unleashing them to do so is critical to improving the quality of online education.

If this sounds daunting, it is! There are many challenges: knowing what data is important to collect, knowing what tools and resources exist to collect that data, and understanding how to analyze data patterns and translate them into ideas for improvement grounded in learning theory. But if instructional designers don’t lead this work, who will?

In this interactive session, you will join a learning engineering team to help design, instrument, and evaluate a real-world learning activity from an actual online course. Led by an instructional design manager with over a decade of experience and a senior learning data scientist, you will be pushed to step outside traditional instructional design methodologies and think like a learning engineer and educational researcher. You will walk through the key steps our team takes to design an effective learning activity, instrument it to collect meaningful data, and then evaluate the results to drive data-informed improvements. Along the way, we offer commentary and insights from both the instructional design and data science perspectives, sharing the challenges we’ve faced and the lessons we’ve learned, and empowering you to take the evaluation of the learning experiences you design into your own hands.