Learning to Fly: Evaluating Changes in Teaching Practices in Graduate Teaching Assistants through Online Professional Development

Concurrent Session 8

Brief Abstract

Learn about our online faculty development program for Graduate Teaching Assistants who are embarking on their teaching careers, and about our evaluation of the program’s impact on their perceptions of online teaching practices. We will share details of the program and our evaluation approach, along with preliminary results from our case study.

Extended Abstract


The Department of Teaching and Learning with Technology at Texas Woman’s University offers numerous professional development opportunities to faculty, adjuncts, and graduate assistants each year. One online faculty development program is designed to fuel excitement and motivate Graduate Teaching Assistants (GTAs) about pedagogy and online teaching practices as they begin their teaching careers. We are currently running an exploratory case study on our PILOT (Practices for Instruction and Learning for Online Teaching) program to understand the ways it promotes changes in GTAs’ teaching practices. In our presentation, we will discuss the PILOT program, share details about its inner workings, and present our preliminary findings from the exploratory case study.


Participants in the PILOT program are GTAs who will be teaching online or hybrid courses for the first time. They come from a variety of academic disciplines and backgrounds. PILOT participants complete a four-week online course during the summer in which they receive content, instruction, and support regarding online pedagogy and online teaching practices. By the end of the program, each participant is required to create one online module for their course. Participants can create a module that they will use in an upcoming course or a “dream” module on the content of their choice. During the program, the GTAs have an instructional designer mentor and a GTA peer mentor from the previous year’s PILOT cohort, who provide feedback and support as they develop content and activities. Topics covered in PILOT include accessibility, collaboration, feedback and engagement, rubric development, instructor presence, community building, and learner-to-learner and instructor-to-learner interactions.

The intent of PILOT is to offer an abundance of support to GTAs as they begin to think about teaching online. GTAs often have questions and concerns about their first online teaching experience. Unfortunately, the support offered to GTAs varies greatly across programs and departments. PILOT is an attempt to centralize this support and provide an avenue for GTAs to ask questions in a supportive environment. In addition to this support, the program gives GTAs the opportunity to build online materials in an environment with expert assistance.

The PILOT program begins with a Preface Module that provides information about our learning management system and gives participants a crash course in navigating and creating content. Following the Preface Module, participants begin setting up a structure for their final modules. They are required to create a module folder in a shared Blackboard shell, create at least one measurable course objective, and add content (a link, an infographic, a document, an embedded video, etc.) with an explanation of why they chose that content. Participants are also asked to complete a discussion board post introducing themselves and discussing their experiences with teaching and with Blackboard, as well as their perceptions of what makes a successful online student.

During Week 2, participants continue building their modules by adding an interactive assignment, such as a discussion board, wiki, blog, or third-party interactive tool. After participants have created their interactive element, a fellow PILOT participant visits their module, engages with the activity, and then records their experience in an interactivity log. This activity gives participants an opportunity to look at one another’s content and reflect on how it works. It also gives them valuable experience in seeing a course through the student lens. In the Week 2 discussion board, participants discuss ways to build community within their online courses.

In Week 3, we continue this pattern of interaction between participants. Each PILOT member creates an assessment, typically a test, survey, or assignment, at the beginning of the week. Next, they complete a colleague’s assessment. Then, they provide feedback both to the person who completed their assessment (role-playing as though they are interacting with a student) and to the person whose assessment they completed. Finally, each participant writes a journal post reflecting on the experience. This is often the most intense week of the program because it requires multiple interactions, but it allows participants to apply practical skills and to experience assessments from both the instructor’s and the student’s perspective. The reflective journal and the feedback on their assessment also give participants an opportunity to think about how they could improve their assessment.

Finally, in Week 4, each participant is asked to apply the feedback they have received from the instructional designer, the GTA mentor, and their PILOT cohort to perfect their final module. In the Week 4 discussion board, participants reflect on their modules and discuss their successes as well as items they want to revise. At the end of Week 4, participants write one final reflection journal expressing their thoughts on the program and their module creation experience. The course also contains several optional discussion boards that allow participants to exchange resources, tools, and ideas as they work on their modules.


From previous implementations of PILOT, we have anecdotal evidence that certain activities and assignments work well for our audience. Evaluations indicate that online discussions with peers from various disciplines, along with designing and building an online module with interactive activities, are perceived as useful for participants’ development. This evidence also suggests that GTAs who go through the program change how they approach and develop their teaching practices, but we want to investigate more deeply the “why and how” of that change. This led us to develop a case study approach to understand why GTAs perceive certain activities as useful. We believe this will allow us to obtain the information needed to enhance, modify, and improve the PILOT program.

This case study uses Transformative Learning Theory as a conceptual framework to guide and focus the analysis. We feel this approach will allow us to better address the overarching research question of how PILOT changes understanding of online pedagogical, design, and management strategies. In the case study tradition, several data sources will be utilized to gain a well-developed sense of how GTAs perceive their experiences in PILOT. Data sources will include pre- and post-program reflections, interviews, and the final modules. The discussion forums and the pre- and post-program reflections will be analyzed to see whether GTAs used different language and examples when responding to the same prompts. The interviews will be held at the end of the program and will consist of questions about the GTAs’ experiences during the PILOT program. The interviews will be transcribed verbatim and coded for themes. A member check will allow those interviewed to provide input and feedback on how well we understood and reported their experiences. The final module that each participant creates will be analyzed as well. These divergent data sources will then be triangulated to show where they converge and report similar perceptions about experiences in PILOT.

This study will also illuminate participants’ perceptions of the usefulness of our training content and activities. Ultimately, the data analysis will deepen our understanding of GTAs’ perceptions of the program and give us an opportunity to reflect on ways to improve our course content and instructional practices. We plan to continue this line of research over several administrations of the program, examining whether GTAs’ perceptions remain similar or shift over time. This will allow us to make timely modifications to PILOT and to explore ideas for future research.

What to Expect:

We will share the structure and elements of PILOT and present the preliminary findings of the case study. Preliminary results may include analysis of the discussion boards and reflections, quotes and themes from the interviews, and information from the document review of the final modules. Conference attendees will be encouraged to share feedback about professional development programs they have developed or attended and whether those programs created opportunities for meaningful change in teaching and learning practices. Throughout the presentation, we will also ask the audience to provide their recommendations for the classroom scenarios given to PILOT participants, allowing the audience to join in the reflective process. Handouts and presentation slides will be available as well. Handouts will include lists and details of assignments and activities by week, full instructions for the final module creation activity, and the presentation slides.

Session information:

This session will focus on:

  • The GTA development program design

  • The research project design and methods

  • Preliminary findings of the project

  • Recommendations for future research

By the end of the session, attendees will be able to:

  • Explain the GTA development program

  • Explain the rationale for a GTA development program

  • Discuss preliminary findings of the study

Interaction elements:

  • Interactive questions where you can discuss your personal experience with professional development.

  • Interactive classroom-based scenarios from the PILOT program where you will provide your responses via polling.