
“My course isn’t going to be ready in time…can I get a few more days?”

We heard this more often than we’d like from participants in our instructor-led course development Seminar. Despite our best efforts—weekly meetings with instructional designers, hands-on support from educational media consultants, and a well-designed online resource hub—many instructors struggled to finish building their courses on time. Something had to change. What we discovered challenged our assumptions about support, ownership, and what instructors really need.

The Challenge: Building Better Online Courses—On Time

When we first launched our course development Seminar, we had high hopes. The format was fully online, with weekly Zoom meetings and a companion course site. A small team of instructional designers and media specialists supported instructors as they built their courses. We covered the essentials (learning objectives, assessments, engagement strategies) and offered robust media support. Even so, only about half of the instructors finished building their courses on time, and during our first review, completed courses were often missing key components from the nationally recognized rubric we use.

Clearly, something wasn’t working.

Iteration in Action: What We Tried

Over several semesters, we experimented. We moved to in-person sessions, extended the synchronous portion of the Seminar from 5 to 6 weeks, and scaled back media production. We focused more on rubric standards and offered self-recording options like a lightboard room and desk-casting studio. These changes helped—slightly. But the core issues remained: media bottlenecks; inconsistent evidence of rubric components in first drafts of completed courses; and missed deadlines.

This past semester, we hit on a model that finally clicked.

What Worked: A New Model for Course Development

1. A Seminar Structure Tightly Aligned With Rubric Standards

We redesigned the Seminar around the rubric we use, dedicating each week to key scoring components:

  • Weeks 1–2: Learning Objectives
    Instructors drafted and refined course-level objectives and one set of module-level objectives, with feedback from peers and instructional designers. This front-loaded work gave them a clear blueprint for building the rest of the course.
  • Week 3: Learning Activities
    We introduced tools like workload calculators (a minimal sketch of the underlying arithmetic follows this list) and brought in experienced online instructors to share what worked for them, and what didn’t, in asynchronous environments.
  • Week 4: Assessments
    Without access to online proctoring, we explored alternatives to traditional multiple-choice exams and emphasized frequent, low-stakes assessments and instructor feedback.
  • Week 5: Community & Instructor Presence
    We focused on strategies for building classroom community, even without synchronous sessions (and what to do without Flip). Instructors created plans for regular and substantive interaction using tools already available in our digital learning environment.
  • Week 6: Authoring & Polish
    We showcased examples of well-organized courses, discussed visual design, and emphasized accessibility. Instructors finalized one module in their courses to set a pattern for the rest.
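
For readers curious about what a workload calculator actually does, here is a minimal sketch in Python. The per-task rates and the function are our own illustrative assumptions, not the figures or interface of any particular tool.

```python
# Minimal sketch of the arithmetic behind a course workload calculator.
# The rates below are illustrative assumptions, not values from any
# published estimator; adjust them to your own context.

READING_PAGES_PER_HOUR = 20   # assumed pace for dense academic reading
WRITING_HOURS_PER_PAGE = 1.5  # assumed pace for drafted and revised prose

def weekly_workload_hours(reading_pages: float,
                          writing_pages: float,
                          video_minutes: float,
                          other_hours: float = 0.0) -> float:
    """Estimate total student hours for one week or module."""
    reading = reading_pages / READING_PAGES_PER_HOUR
    writing = writing_pages * WRITING_HOURS_PER_PAGE
    video = video_minutes / 60
    return reading + writing + video + other_hours

# Example: 60 pages of reading, a 2-page reflection, 45 minutes of
# lecture video, and a 1-hour discussion activity.
print(f"{weekly_workload_hours(60, 2, 45, 1.0):.1f} hours")  # 7.8 hours
```

The exact rates matter less than the exercise itself: making implicit time expectations explicit before a module is built.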


2. A Strong Follow-Up Phase

After 6 in-person sessions, we continued with 6 more weeks of weekly check-ins and consultations. Instructors committed 10–15 hours per week to course development, supported by an instructional designer and a media consultant.

3. Modeling Best Practices in the Online Companion Course

The online portion of the Seminar modeled each rubric standard, with callouts highlighting key features (see Figure 1). This meta-approach helped instructors experience good design as learners and provided clear examples of the concepts we’d reviewed in person.

A screenshot of a callout pop-up from a video playback tool. The pop-up explains that each video’s duration is displayed so learners can decide whether to watch it immediately or save it for later, and that all videos are hosted on caption-capable services so learners can use accessibility tools.

Figure 1. Example of a callout in the online companion course.

The Results: Quality, Completion, and Confidence
  • 100% of instructors completed their courses on time.
  • 85% of planned media was ready for students by the time we reviewed courses (about 3 weeks before course launch), with all media in place by the first day of classes.
  • Rubric scores averaged over 90%, with minimal revisions needed.


This was our best outcome yet—and it didn’t require more support. In fact, we reduced some of it.

Counterintuitive Insight: Less Support, More Ownership

One of our biggest surprises? Reducing support actually improved outcomes.

Previously, we tried to “do it for them”—handling many media production and design tasks ourselves. But this shifted ownership away from instructors. When designers hit content roadblocks, progress stalled.

Now, we empower instructors to take the lead. Our “Media Moment” sessions teach them how to plan, record, and embed videos themselves. Next semester, we’ll add early consultations to ensure video quality remains high and instructors plan ahead for accessibility.

Centering the Student Experience

Another game-changer was our “Student Spotlight.” A recent graduate—now a part-time instructional designer in our unit—joined the Seminar to share insights from the learner perspective. Her feedback helped instructors focus on what really matters to students.

We’re now exploring ways to pilot a learner experience survey on behalf of volunteer instructors to gather even more student-driven insights.

Takeaways for Other Institutions

No matter your structure or resources, here are a few strategies that might work for you:

  • Align your seminar with a clear framework (start with a validated quality assurance rubric or scorecard) and build synchronous sessions around it.
  • Model best practices in your own course design. If you don’t already have a template course, build one that would meet the success criteria for the evaluation rubric you’re using.
  • Empower instructors to own their content—especially media—but provide frequent modeling and support so quality remains high.
  • Bring in student voices to keep the learner experience front and center. We think this works best if the students have taken online courses at your institution.
  • Follow up with structured support to keep instructors on track with interim deadlines.
  • Manage up by keeping lines of communication open with deans, department chairs, or others who oversee course listings to cultivate their buy-in and support for course development and quality assurance processes.
 
Final Thoughts

Course development is never one-size-fits-all. But with the right structure, support, and mindset, it can be both scalable and sustainable. Our latest iteration proves that small shifts—grounded in reflection and feedback—can lead to big results.

Diana Theisinger is the Digital & Professional Learning Program Manager at William & Mary’s Studio for Teaching & Learning Innovation. She oversees online course development processes across the institution, including Academy, William & Mary’s platform for continuing and professional education opportunities. Outside of work, you can probably find her hiking the Appalachian Trail.
