Engaging Students and Improving Online Learning: Not Your Typical Earth Science Class

Concurrent Session 9


Brief Abstract

Engaging students in online classes is critical for student success.  However, research on student interaction with content in a digital setting and its effects on engagement is lacking.  We examine the effects of using technology to promote student interaction with content in an introductory online Earth Science class. 

Presenters

I am a non-tenure-track instructor in the Department of Earth Sciences at Montana State University in Bozeman, Montana. I received my M.S. in Earth Sciences in 2013, studying snow, avalanches, weather, and climatology, and am currently working toward a PhD in Adult and Higher Education focusing on online geoscience education. I have a passion for teaching students from a wide range of backgrounds online, in the classroom, and in the field. I have taught face-to-face and online classes in Introductory Earth System Sciences, Introduction to Snow Science, Oceanography, Earth Materials, Weather and Climate, Climate Change, and Snow Dynamics and Accumulation.


Extended Abstract

Engagement in online learning is critical to student success (Meyer, 2014), yet success rates in online classes are markedly lower than in face-to-face classes. Estimated withdrawal rates are 10-20% higher than for face-to-face classes (Herbert, 2006), and other estimates indicate that between 40% and 80% of online students drop classes before completion (Smith, 2010). Nevertheless, more and more students are enrolling in online classes because of convenience, location, or family and work schedules, and competition for admission to brick-and-mortar institutions is increasing as a growing population of university-aged students meets a static supply of physical institutions (Online Learning Consortium, 2015).

As enrollments increase, quality online education becomes essential. Research on online engagement has centered mainly on engagement resulting from social interactions: studies exist on engagement through discussions, peer review, collaboration, and the technologies and methods used to create interactions and gather data (Beldarrain, 2006; Moreillon, 2015; Thomas, West, & Borup, 2017). Information is limited, though, on how interaction with content and course materials affects engagement. Because this body of knowledge is small, the methods and technologies used to foster student engagement with content remain poorly understood. The present study seeks to fill this gap in the literature.

In face-to-face classes, student participation is expected, yet in online instruction, student engagement and motivation are often lacking or look quite different (You, 2016). Students who do not engage with the content or the learning process are at risk of dropping out, and students who are not engaged are not expected to perform as well as engaged students on learning assessments. Engagement in online classes is difficult to measure directly, but we hope to identify the learning management system (LMS) variables related to engagement with content that have the largest effect on student success and to test those variables for differences across groups.

Research Questions

1) What are the variables that relate to student interaction with content?

2) Which of the variables relating to student interaction with content have the largest effect on student learning outcomes?

3) Is there a significant difference between groups in the variables with the largest effects on learning outcomes?

Engagement is a vital component of online learning in higher education and is linked to student success (Delen & Liew, 2016; Meyer, 2014). Various studies suggest that higher levels of engagement positively affect higher-order thinking, student learning, retention, and student satisfaction (Meyer, 2014). Here, engagement is the “time and effort students devote to their academic experiences” (Kuh, 2003); however, this definition continues to evolve. More specifically, “active engagement” (Morris, Finnegan, & Wu, 2005) has been identified as a significant component of the online learning process and can be measured by analyzing students’ online course involvement. Several studies have examined results from the National Survey of Student Engagement (NSSE) and found mixed levels of engagement based on major, grade point average, and age (Meyer, 2014). After controlling for these characteristics, Chen, Lambert, and Guidry (2010) found a positive correlation between technology use and student engagement.

Current trends in online education include a greater demand for teaching techniques that effectively utilize new technologies (Howell, Williams, & Lindsay, 2003). Garrison, Anderson, and Archer’s (1999) Community of Inquiry model presents a theoretical framework for online learning in which teaching presence, social presence, and cognitive presence combine to create an enriching educational experience. Shea, Li, and Pickett (2006) broke teaching presence into two components: instructional design and organization, and directed facilitation. Many studies focus on the effects of directed facilitation in online instruction by examining student-instructor and student-student interaction (Beldarrain, 2006; Childers & Berner, 2000; Thomas et al., 2017). Only a handful of studies examine the effect of student-content interaction on student engagement online. Castano-Munoz, Duart, and Sancho-Vinuesa (2014) examined hands-on learning activities in online classes and proposed that these activities could lead to greater understanding of course material, increased online study time, and better student retention in online classes. Czerkawski and Lyman (2016) introduced their E-Learning Engagement Design (ELED), which highlights the importance of instructor facilitation through teaching presence, and of course design and organization, which includes presenting content in a way that promotes student engagement and ultimately student retention. Prior studies have also examined the use of technologies promoting social interaction in the online classroom (Beldarrain, 2006; Branon, 2001; Moreillon, 2015). These studies, however, do not address using technology to encourage student interaction with content.

Many studies have assessed student engagement using surveys and questionnaires (Ma, Han, Yang, & Cheng, 2015), which can be subjective and prone to voluntary response bias. Learning analytics, the collection and use of online learner data to enhance the online learning experience (Siemens & Long, 2011), can in principle provide a quantitative measure of student engagement. You (2016) used learning analytics to isolate the factors most closely tied to online engagement. She examined indicators of regular study and time management and identified four related to self-regulated learning; in order of greatest significance, these were regular study, late submissions, number of sessions, and verification of reading the syllabus. Ma et al. (2015) used learning analytics to assess the effect of the instructor on student engagement and concluded that designing high-quality teaching materials that include hands-on activities and immediate feedback is essential to effective learner-content interaction. Their analysis used both frequency of participation and time spent in an online class as gauges of student engagement. Further research using learning analytics related to student-content engagement is needed to provide a quantitative measure of learner engagement in online classes.
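To make the kind of indicators You (2016) describes concrete, the sketch below computes three of them from a toy LMS activity export using pandas. The column names (student_id, login_time, due, submitted_at) and the gap-based regularity proxy are illustrative assumptions, not the measures used in any particular LMS or in this study.

```python
# Sketch: deriving engagement indicators from hypothetical LMS log data.
import pandas as pd

# Hypothetical LMS activity log: one row per login session.
sessions = pd.DataFrame({
    "student_id": ["s1", "s1", "s1", "s2"],
    "login_time": pd.to_datetime(
        ["2018-09-03", "2018-09-05", "2018-09-12", "2018-09-20"]),
})

# Hypothetical submission records with due dates.
submissions = pd.DataFrame({
    "student_id": ["s1", "s2"],
    "due": pd.to_datetime(["2018-09-10", "2018-09-10"]),
    "submitted_at": pd.to_datetime(["2018-09-09", "2018-09-15"]),
})

# Indicator 1: number of sessions per student.
n_sessions = sessions.groupby("student_id").size().rename("n_sessions")

# Indicator 2: count of late submissions per student.
submissions["late"] = submissions["submitted_at"] > submissions["due"]
n_late = submissions.groupby("student_id")["late"].sum().rename("n_late")

# Indicator 3: regularity of study, proxied by the standard deviation of
# gaps (in days) between consecutive logins; lower means more regular.
def login_gap_std(times):
    gaps = times.sort_values().diff().dt.days.dropna()
    return gaps.std(ddof=0) if len(gaps) > 1 else float("nan")

regularity = (sessions.groupby("student_id")["login_time"]
              .apply(login_gap_std).rename("gap_std"))

indicators = pd.concat([n_sessions, n_late, regularity], axis=1)
print(indicators)
```

In a real analysis, a table like `indicators` (one row per student) would be the input to the correlational and group-comparison steps described later in the abstract.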

This online learning study was conducted in an introductory online Earth Science course over three semesters (Fall 2017, Spring 2018, and Fall 2018) at a mid-sized university in the western United States. The sample population included all students in this online course. For this comparative study, the researcher used a quasi-experimental, post-test-only control group design, separating students into two groups by matching participants on year in college and major to minimize effects of aptitude and higher-order thinking skills. Students majoring in a STEM field and students with more college experience should perform better on learning assessments, so distributing these students equally among groups controls variables that could otherwise affect results. All students had access to the standard learning materials for the course: lecture slides, recorded lectures, and the course textbook. In addition to the standard course materials, the experimental group used interactive online lessons for half of the chapters of the text, with more interactive lessons created for earlier chapters than for later ones. The researcher designed the lessons to cover the same topics addressed in the standard course materials using the same images, vocabulary, and examples, so the main difference between the course materials used by the two groups was the addition of student interaction with course content in the experimental group. The interactive lessons were created using SoftChalk, an e-learning authoring tool chosen for its compatibility with the learning management system used in the class and for its relatively shallow learning curve. While SoftChalk was used in this study, similar e-learning authoring software could also be used to produce engaging interactive content for online use.
The researcher incorporated activities like drag-and-drop, matching, informational hotspots, photo albums, flashcards, “did you know?” pop-ups, and end-of-chapter quizzes to encourage active learning and retention, to provide instant feedback, and to make the lessons compelling. These lessons focused on student-content interaction and were intended to engage students in their learning experience. The technology used in this study is distinct from other technology used to increase online engagement, such as social media, synchronous chat tools, videos and animations, and collaboration software (Beldarrain, 2006; Branon, 2001; Moreillon, 2015).
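The matched-assignment step described above can be sketched in a few lines: within each year-in-college by major stratum, students are alternately assigned to the control and experimental groups so both groups end up with similar compositions. The roster below is made up for illustration; the actual assignment procedure in the study may have differed in detail.

```python
# Sketch: stratified assignment of students to two matched groups.
import random

# Hypothetical roster: (student_id, year_in_college, is_stem_major).
roster = [
    ("s1", 1, True), ("s2", 1, True), ("s3", 3, False),
    ("s4", 3, False), ("s5", 2, True), ("s6", 2, True),
]

def matched_assignment(students, seed=0):
    """Alternate group assignment within each (year, major) stratum."""
    rng = random.Random(seed)
    strata = {}
    for sid, year, stem in students:
        strata.setdefault((year, stem), []).append(sid)
    control, experimental = [], []
    for members in strata.values():
        rng.shuffle(members)  # random order within each stratum
        for i, sid in enumerate(members):
            (control if i % 2 == 0 else experimental).append(sid)
    return control, experimental

control, experimental = matched_assignment(roster)
print(control, experimental)
```

Because each stratum is split alternately after shuffling, every year/major combination contributes (near-)equally to both groups, which is the point of matching here.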

Due to the above factors, the results of this study can only be discussed with respect to students taking an online introductory Earth Science class at a similarly sized university in the western United States. The sample size is small, and while the size of the true population is unknown, the sample is most likely unrepresentative of the total population. Following the Fall 2018 semester, the researcher plans to conduct a factor analysis using learning analytics to identify the variables related to engagement with content and to test those variables for significant differences between groups.
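The planned analysis (factor analysis followed by a between-group test) can be sketched on synthetic data: reduce several correlated LMS variables to a latent factor score per student, then test whether the scores differ between control and experimental groups. The variable names, group sizes, and effect size below are all invented for illustration, not the study's actual data.

```python
# Sketch: factor analysis of LMS variables plus a between-group t-test.
import numpy as np
from scipy import stats
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n = 60  # 30 students per group, control first

# Synthetic LMS variables (sessions, page views, minutes on lessons),
# all driven by one latent "engagement" factor that is higher, by
# construction, in the experimental group.
engagement = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.5, 1, 30)])
X = np.column_stack([
    engagement + rng.normal(0, 0.3, n),  # number of sessions
    engagement + rng.normal(0, 0.3, n),  # content page views
    engagement + rng.normal(0, 0.3, n),  # minutes on interactive lessons
])

fa = FactorAnalysis(n_components=1, random_state=0)
scores = fa.fit_transform(X).ravel()  # one factor score per student

# Test whether factor scores differ between the two groups.
t, p = stats.ttest_ind(scores[:30], scores[30:])
print(f"t = {t:.2f}, p = {p:.4f}")
```

With real data, the number of factors would be chosen from the data (e.g., via eigenvalues or fit statistics) rather than fixed at one, and the sign of the factor scores is arbitrary, so only the magnitude of the group difference is interpretable.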

The researcher plans to present this study as a case study showing course methods and preliminary findings, using a slideshow that includes a demonstration of the interactive online lessons along with an opportunity for participants to sample the lessons. In addition, the researcher plans to provide a handout highlighting the methods used and preliminary findings. This presentation will show participants how technology can be incorporated into an online STEM class to increase active participation, and it will provide new knowledge about the benefits of engaging with content in an online class (Introductory Earth Science) that is usually experienced as a traditional face-to-face offering.

References


Beldarrain, Y. (2006). Distance education trends: Integrating new technologies to foster student interaction and collaboration. Distance Education, 27(2), 139-153. doi:10.1080/01587910600789498

Branon, R. F., & Essex, C. (2001). Synchronous and asynchronous communication tools in distance education. TechTrends, 45(1), 36. doi:10.1007/BF02763377

Castano-Munoz, J., Duart, J., & Sancho-Vinuesa, T. (2014). The Internet in face-to-face higher education: Can interactive learning improve academic achievement? British Journal of Educational Technology, 45(1), 149-159. doi:10.1111/bjet.12007

Chen, P.-S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54, 1222-1232.

Childers, J. L., & Berner, R. T. (2000). General Education Issues, Distance Education Practices: Building Community and Classroom Interaction through the Integration of Curriculum, Instructional Design, and Technology. The Journal of General Education, 49(1), 53-65. doi:10.1353/jge.2000.0001

Online Learning Consortium. (2015). Keeping pace with the changing face of online learning [Infographic]. Retrieved from http://onlinelearningconsortium.org/wp-content/uploads/2015/03/Infographic_V9.pdf

Czerkawski, B., & Lyman, E. (2016). An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments. Tech Trends, 60(6), 532-539. doi:10.1007/s11528-016-0110-z

Delen, E., & Liew, J. (2016). The Use of Interactive Environments to Promote Self-Regulation in Online Learning: A Literature Review. European Journal of Contemporary Education, 15(1), 24-33. doi:10.13187/ejced.2016.15.24

Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87-105.

Herbert, M. (2006). Staying the course: A study in online student satisfaction and retention. Online Journal of Distance Learning Administration, 9(4). Retrieved from http://www.westga.edu/~distance/ojdla/winter94/herbert94.htm

Howell, S. L., Williams, P. B., & Lindsay, N. K. (2003). Thirty-two trends affecting distance education: An informed foundation for strategic planning. Online Journal of Distance Learning Administration, 6(3), 1-18.

Kuh, G. D. (2003). What We're learning about student engagement from NSSE. Change, 35, 24-31.

Lee, Y., Choi, J., & Kim, T. (2013). Discriminating factors between completers of and dropouts from online courses. British Journal of Educational Technology, 44(2), 328-337. doi:10.1111/j.1467-8535.2012.01306.x

Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. The Internet and Higher Education, 24, 26-34. doi:10.1016/j.iheduc.2014.09.005

Meyer, K. A. (2014). Student engagement online: What works and why. ASHE Higher Education Report, 40(6). Hoboken, NJ: John Wiley & Sons.

Moreillon, J. (2015). Increasing Interactivity in the Online Learning Environment: Using Digital Tools to Support Students in Socially Constructed Meaning-Making. Tech Trends, 59(3), 41-47.

Morris, L. V., Finnegan, C., & Wu, S.-S. (2005). Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education, 8, 221-231.

Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. Internet and Higher Education, 9(3), 175-190.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE review, 46(5), 30.

Smith, B. (2010). E-learning technologies: A comparative study of adult learners enrolled in blended and online campuses engaging in a virtual classroom (Doctoral dissertation). Capella University, Ann Arbor, MI. (3413143)

Thomas, R. A., West, R. E., & Borup, J. E. (2017). An analysis of instructor social presence in online text and asynchronous video feedback comments. The Internet and Higher Education, 33, 61-73.

You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. The Internet and Higher Education, 29, 23-30. doi:10.1016/j.iheduc.2015.11.003