Online and Face-to-Face Candidate Learning Outcomes as Measured by edTPA

Concurrent Session 3

Brief Abstract

Using edTPA data, this presentation provides data-informed evidence of the equivalency of online teacher candidates’ performance compared to candidates completing the same face-to-face degree.  The two modes of program delivery produce differing outcomes, suggesting underlying program-structure inequities associated with epistemological conflicts.

Presenters

Tina Heafner is a Professor in the Department of Middle, Secondary, and K-12 Education at the University of North Carolina at Charlotte. Her administrative responsibilities include directing the College of Education Prospect for Success, the M.Ed. in Secondary Education, and the Minor in Secondary Education. Tina's research interests explore effective practices in social studies education such as professional development schools, technology integration, content literacy development, and service learning. Other research interests include policy and curriculum issues in social studies and content-based online teaching and learning. Publications include seven co-authored books and four edited books. She has published numerous articles in peer-reviewed journals such as Teachers College Record, Educational Researcher, Educational Policy, Peabody Journal of Education: Issues of Leadership, Policy, and Organizations, Kappa Delta Phi, Theory and Research in Social Education, Journal of Technology and Teacher Education, Teacher Education and Practice, The High School Journal, and the Journal of Digital Learning in Teacher Education.

Extended Abstract

Despite the pervasiveness of online learning in higher education, this delivery mode has yet to achieve the quality status of face-to-face learning.  The perception of inferiority is especially prevalent in teacher preparation.  The longstanding belief that face-to-face training is the only viable option continues to dominate teacher education. Yet few studies challenge these epistemological stances, owing to limited program outcome data comparing online and face-to-face degrees. Central to determining the effectiveness of technology and the value of technology-mediated learning is program quality.  edTPA, a widely accepted national measure of teacher readiness, is a valid and reliable instrument documenting candidate learning outcomes associated with program preparation.  Using edTPA data, this presentation provides data-informed evidence of the equivalency of online teacher candidates’ performance compared to candidates completing the same face-to-face degree.  The two modes of program delivery produce differing outcomes, suggesting underlying program-structure inequities associated with epistemological conflicts.

The research questions to be addressed in this presentation are:

  1. Are there differences in performance on edTPA summative and mean scores between teacher candidates completing an online licensure program and those completing a F2F licensure program?
  2. Are there differences in performance on edTPA mean rubric scores for each of the 15 rubrics between teacher candidates completing an online licensure program and those completing a F2F licensure program?

Procedures

edTPA data were collected over one year for all program completers enrolled in a graduate certificate in teaching program.  This program uses two modes of delivery: online and face-to-face. Using edTPA rubrics, we conducted a one-way ANOVA to examine group differences in overall teacher performance for candidates completing a 100% online licensure program compared to candidates completing a traditional, face-to-face licensure program.  Additionally, we conducted univariate analyses for each of the fifteen rubrics that measure teacher effectiveness in designing, delivering, and assessing instruction in relation to student learning outcomes. Our analysis included effect size calculations using Cohen’s d for summative and mean scores as well as for each individual rubric score.  To further explore group comparisons, we included descriptive statistics for all edTPA performance scores.
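The core comparison described above can be sketched in code. The following is a minimal illustration, not the authors' analysis script: a two-group one-way ANOVA F statistic and a pooled-SD Cohen's d, applied to hypothetical summative scores (the score lists are invented for demonstration only).

```python
# Illustrative sketch of the analysis procedure: a one-way ANOVA comparing
# summative edTPA scores for two delivery modes, plus a Cohen's d effect size.
# All scores below are hypothetical, not the study's data.
from statistics import mean, stdev
from math import sqrt

def one_way_anova_f(group_a, group_b):
    """F statistic for a one-way ANOVA with two groups (equivalent to t^2)."""
    n_a, n_b = len(group_a), len(group_b)
    grand = mean(group_a + group_b)
    # Between-group sum of squares (df = 1 for two groups)
    ss_between = (n_a * (mean(group_a) - grand) ** 2 +
                  n_b * (mean(group_b) - grand) ** 2)
    # Within-group sum of squares (df = n_a + n_b - 2)
    ss_within = (sum((x - mean(group_a)) ** 2 for x in group_a) +
                 sum((x - mean(group_b)) ** 2 for x in group_b))
    return (ss_between / 1) / (ss_within / (n_a + n_b - 2))

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = sqrt(((n_a - 1) * stdev(group_a) ** 2 +
                      (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical summative edTPA scores for each delivery mode
online = [40, 43, 38, 45, 41, 44, 39, 42]
f2f = [41, 44, 37, 46, 40, 45, 38, 43]
print(f"F = {one_way_anova_f(online, f2f):.3f}, d = {cohens_d(online, f2f):.3f}")
```

In practice this would be run with a statistics package (e.g., SPSS or scipy.stats.f_oneway), but the hand computation makes the between- and within-group variance partition explicit.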

Results

Results indicate no statistical difference between mean online and face-to-face teacher candidate performance on edTPA.  There were no statistically significant differences between groups in summative edTPA scores as determined by one-way ANOVA (F(1,82) = .050, p = .824) or in average rubric mean scores (F(1,82) = .007, p = .933). Online teacher preparation produced learning outcomes on edTPA similar to those of face-to-face coursework.  Calculations of Cohen’s d revealed moderate effect sizes. Candidates who completed the 100% online program scored slightly lower (M = 41.88, SD = 8.32) but had less variance compared to the face-to-face program completers (M = 42.31, SD = 9.21).  However, both mean scores were within the upper limits of the nationally recommended professional performance standard range for edTPA scores (37-42).

To examine more closely how candidates performed, we conducted a one-way ANOVA for each of the fifteen edTPA rubric mean scores.  Results are reported in Table 3.  Effect size is the proportion of variance explained by each factor and is reported as η2 in the subsequent table.  Statistical significance was found on three rubrics: learning environment, engaging students in learning, and providing feedback to guide learning.  For rubrics 6 and 7, face-to-face candidates scored significantly higher, whereas online candidates performed significantly higher on rubric 12.  On two of the three edTPA tasks, candidates completing the online degree program scored higher than candidates in the F2F degree.  Candidates in the online program demonstrated greater classroom readiness on Task 1 Planning (µ = 3.05 vs. µ = 2.96) and on Task 3 Assessment (µ = 2.69 vs. µ = 2.55), while face-to-face candidates scored higher on average for Task 2 (online µ = 2.67 vs. F2F µ = 2.90).  Online candidates had higher mean scores on 53% (n = 8) of the fifteen rubrics comprising the three edTPA tasks.  Face-to-face candidates received higher mean scores on 27% (n = 4) of the individual rubrics.  Candidates in both programs performed equally on three (20%) of the fifteen rubrics.
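The η² effect size reported for these rubric-level ANOVAs is the share of total score variance attributable to delivery mode. A minimal sketch of that computation, using invented rubric scores purely for illustration:

```python
# Hypothetical sketch of eta-squared (η²) for a two-group one-way ANOVA:
# the proportion of total variance in rubric scores explained by delivery
# mode. The rubric score lists below are illustrative, not the study's data.
from statistics import mean

def eta_squared(group_a, group_b):
    """η² = SS_between / SS_total for a two-group one-way ANOVA."""
    grand = mean(group_a + group_b)
    ss_between = (len(group_a) * (mean(group_a) - grand) ** 2 +
                  len(group_b) * (mean(group_b) - grand) ** 2)
    ss_total = sum((x - grand) ** 2 for x in group_a + group_b)
    return ss_between / ss_total

# Illustrative rubric scores (1-5 scale) for each delivery mode
online_rubric = [3, 4, 3, 3, 4, 2, 3]
f2f_rubric = [3, 3, 4, 4, 3, 4, 4]
print(f"eta^2 = {eta_squared(online_rubric, f2f_rubric):.3f}")
```

Conventional benchmarks treat η² near .01, .06, and .14 as small, medium, and large, which is how rubric-level effects like those in Table 3 are typically read.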

Interpretations

Based on these results, we conclude that online learning is an equally effective platform for preparing teacher candidates when candidate outcomes are measured using edTPA.  Given that edTPA is a widely used measure of quality teaching skills, we contend that online teacher preparation is a viable option for preparing future teachers.  Findings from this study contribute to the limited research suggesting that online learning is equivalent to, and in some cases more effective than, F2F learning (Hartshorne, Heafner, & Petty, 2011; Heafner, Petty, & Hartshorne, 2011; Heafner & Plaisance, 2012, 2014; Means et al., 2013).

Moreover, there are differences in the skill preparation provided by online and face-to-face models of delivery.  Feedback and assessment function differently when mediated by technology.  The fact that online candidates significantly outperformed their face-to-face peers in providing feedback to support learning could indicate the utility of online assessment tools and the transference of these resources to classroom applications. Furthermore, online candidates outperformed face-to-face candidates on Task 1 Planning.  These results suggest that instructional planning skills are effectively supported in an online learning context.  Perhaps the resistance to online learning (Benton, 2009) as a platform for training teachers is more an epistemological conflict than an actual barrier posed by the learning environment.  Online candidates learned and effectively executed the majority of the teaching skills measured on edTPA in comparison to their face-to-face peers.

In contrast, the gaps in online candidates’ learning associated with Task 2 and their statistically lower performance on rubrics 5 and 6 suggest that the online program delivery was not comparable in supporting candidates’ skills in delivering instruction.  These results could not be explained by the summative data but were interpreted in the context of differences in program structures.  Online methods courses are available only during the summer session, limiting pedagogical instruction to five weeks, whereas face-to-face methods courses are offered in the fall and spring semesters and span 16-17 weeks.  In addition, the summer methods course rarely includes relevant clinical experiences.  Because K-12 schools are not open during the summer, field-based learning was restricted to enrichment opportunities (e.g., summer camps) that were not necessarily related to content areas (e.g., an art/theater camp rather than a science classroom).  Furthermore, the office responsible for procuring all clinical placements for face-to-face candidates does not support online candidates in garnering school assignments during the summer or at any other time of the year.

These structural differences are evidence of inherent inequalities in learning opportunities for online candidates and might explain why a difference was present in Task 2 and why the mean scores for online candidates were slightly below those of face-to-face candidates despite scoring equal or higher on three-quarters of the edTPA rubrics.  These differences bring to light the possibility that epistemological views of perceived inequalities of online programs (Benton, 2009) manifest in inequitable program differences.  Even though online candidates may have been disadvantaged in course and clinical offerings, there were no statistical differences in overall mean and summative edTPA scores.
These outcomes confirm the value of online learning as a viable option for teacher preparation (Hartshorne, Heafner, & Petty, 2011; Heafner, Petty, & Hartshorne, 2011; Heafner & Plaisance, 2012, 2014; Means et al., 2013).  We recommend that teacher educators seriously consider the viability of online teaching degrees and examine closely the nuanced contexts of online learning to advance specific skills, such as assessment.