Towards Increased Student Retention in STEM Disciplines: 5 Years of Rigorous Research and Improvement of Interactive Learning Material

Concurrent Session 9

Brief Abstract

Our goal: Increase retention in traditionally low-retention STEM disciplines via high-quality learning material. Our finding: Students learn more when text quantity is reduced and replaced with active learning tools designed to build student confidence. Developing such material is hard, but we believe students should be highly supported. Join us!

Presenters

Nikitha Sambamurthy is an engineering content author and researcher at zyBooks. She has a B.S. in electrical engineering and Ph.D. in engineering education from Purdue University.

Extended Abstract

In this engaging presentation, we describe the unique blend of interactive learning material that we have refined over five years of aggressive improvement, as well as the published research conducted on and with this material. Our work is supported by the Small Business Innovation Research (SBIR) programs at the National Science Foundation and US Department of Education.

Our unique style of interactive learning material has substantially less text than standard eBooks and online homework systems. We replace text with numerous embedded question sets designed for learning rather than quizzing, numerous animations of key concepts, and built-in tools. In a randomized controlled experiment comparing one lesson of our material vs. static material, students overall learned 16% more with our interactive material [1]. More strikingly, the least-prepared quartile of students (often first-generation college and disadvantaged students) nearly caught up with the rest of the class, improving 64% more with the interactive material. Further, in a cross-semester retrospective analysis of courses that switched from static material to interactive material with the class otherwise unchanged (e.g., same experienced instructor, same semester of the year), exam scores improved by 14% and project scores improved by 7% [2].

We believe these significant improvements are due to the nature of our approach to learning design: less text, more animations, and learning questions. Explaining the same concept with less text is more difficult and time-consuming to author than using normal wording. In a randomized controlled experiment, students were randomly assigned a lesson with either normal text or less text [4]. Students assigned less text learned 118% more. Perhaps equally interesting, students voluntarily spent the same amount of time on each lesson, even though the normal-text version had six times as many words.

Another key element is our frequent use of animations, each consisting of a sequence of moving objects that visually show the dynamic relations between concepts. Animations are particularly powerful for showing processes, replacing a large multi-step figure (e.g., Figures 1a, 1b, 1c, etc.) with an incremental reveal of each step, which is easier to digest and less intimidating for students. A third key element is learning questions, which change reading from a monologue to a dialogue: students are given hints and explanations tailored to their specific answers. For example, each option of a multiple-choice question has a specific explanation of why that option is correct or incorrect. Such questions are part of the learning process, so students have unlimited attempts. They are asked immediately after a concept is introduced, enabling early correction of common mistakes, which has been shown to significantly improve learning. Other types of learning questions include short answer, true/false, and definition matching.
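To make the mechanism concrete, a multiple-choice learning question with per-option explanations could be represented roughly as below. This is a minimal illustrative sketch, not the actual system; all class and field names here are hypothetical.

```python
# Hypothetical sketch of a multiple-choice learning question with
# per-option feedback. Names and structure are illustrative only.
from dataclasses import dataclass

@dataclass
class Option:
    text: str
    correct: bool
    explanation: str  # feedback shown when the student picks this option

@dataclass
class LearningQuestion:
    prompt: str
    options: list

    def answer(self, index):
        """Return (correct?, explanation) for the chosen option.
        Attempts are unlimited: a wrong choice simply returns feedback."""
        opt = self.options[index]
        return opt.correct, opt.explanation

q = LearningQuestion(
    prompt="Which loop runs its body at least once?",
    options=[
        Option("while loop", False,
               "A while loop may run zero times if its condition is initially false."),
        Option("do-while loop", True,
               "Correct: a do-while checks its condition after the body runs."),
    ],
)

correct, feedback = q.answer(0)  # wrong choice: returns targeted feedback
print(correct, feedback)
```

The point of the structure is that every option, right or wrong, carries its own explanation, so an incorrect attempt teaches rather than merely grades.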

We call animations and learning questions "participation activities", as such activities are part of the core learning (in place of reading in a traditional textbook). In a controlled study across 10 universities, several thousand students completed 84% of participation activities when given the option to earn participation points [3].

"Challenge activities" (homework) consist of either one difficult coding prompt (coding challenge) or several progressively more difficult levels of short-answer questions (progression builder). Hints and solutions are not provided before a student submits an answer. An incorrect answer yields feedback, and the student is prompted to attempt the problem again (coding challenge) or to try a new problem of similar difficulty (progression builder). In the aforementioned study, 75% of students completed the challenge activities when given the option to earn participation points [3]. A few instructors voiced concern that students would just "cheat the system" by clicking until they completed the activities. Across several thousand students, less than 3% cheated the system [3][5], even when no penalty was incurred for doing so.

Such participation and challenge activities form the basis of our interactive textbooks. Interactive textbooks are presented as a sequence of sections and chapters, like a traditional textbook. Because each section focuses on one specific concept, the sections are modular and can be rearranged to fit various instructor syllabi (also very friendly to competency-based education). Student completion of participation and challenge activities is recorded. Instructors can view interaction data at the class level or the individual-student level for every book section or chapter; a student can view her/his own data. Data are provided as percentages calculated from the number of questions completed in a given participation or challenge activity.
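The completion-percentage reporting described above amounts to a simple calculation over per-question records. The sketch below assumes a hypothetical record format (a list of question entries with a `completed` flag); it is illustrative only, not the actual data model.

```python
# Hypothetical sketch: computing a completion percentage for one
# activity from per-question records. Field names are illustrative.
def completion_percent(records):
    """Percentage of questions completed in an activity's records."""
    if not records:
        return 0.0
    done = sum(1 for r in records if r["completed"])
    return round(100.0 * done / len(records), 1)

# Example: a participation activity with four questions, three completed.
section_1_3 = [
    {"question": "1.3.1", "completed": True},
    {"question": "1.3.2", "completed": True},
    {"question": "1.3.3", "completed": False},
    {"question": "1.3.4", "completed": True},
]

print(completion_percent(section_1_3))  # 75.0
```

An instructor-facing report would then simply aggregate such percentages across students, sections, or chapters.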

Such data enables an instructor to assign points for completing the activities, which solves the read-before-class problem. In fact, one instructor said, "I assigned [interactive textbook] readings due every two weeks. I noticed a difference. The students would ask more involved questions. They noticed that the reading did the basics and that lecture reviewed the basics, but went more in depth -- covering the exceptions and unusual cases." Another instructor wrote, "students read before class. Following on that, the questions students asked in lecture seemed to be less about syntax and more about design and high-level concepts" [2].

Our session will engage the audience through multiple experienced speakers who encourage participation via numerous strategies, such as asking the audience to predict the results of a study before the actual results are shown. The Q&A will open the floor for the audience to share reflections and questions.

We believe students deserve high-quality learning material at a low cost. We believe instructors deserve learning material and activity reports that support their pedagogical goals. Through our rigorous authoring and continual improvement process, we think we are on the right track. Join us!


References

[1] A. Edgcomb, F. Vahid. Effectiveness of Online Textbooks vs. Interactive Web-Native Content, Proceedings of ASEE Annual Conference, 2014.

[2] A. Edgcomb, F. Vahid, R. Lysecky, A. Knoesen, R. Amirtharajah, and M.L. Dorf. Student Performance Improvement using Interactive Textbooks: A Three-University Cross-Semester Analysis, Proceedings of ASEE Annual Conference, 2015.

[3] A. Edgcomb, F. Vahid, R. Lysecky, and S. Lysecky. Getting students to earnestly do reading, studying, and homework in an introductory programming class, ACM SIGCSE Technical Symposium on Computer Science Education, 2017.

[4] A. Edgcomb, F. Vahid, and R. Lysecky. Students Learn More with Less Text that Covers the Same Core Topics, Frontiers in Education Conference (FIE), IEEE, 2015.

[5] J. Yuen, A. Edgcomb, and F. Vahid. Will Students Earnestly Attempt Learning Questions if Answers are Viewable?, Proceedings of ASEE Annual Conference, 2016.