Diagnostic Assessment and Achievement of College Skills

Concurrent Session 1


Brief Abstract

The Diagnostic Assessment and Achievement of College Skills (DAACS) is a suite of open source, online assessments and supports (both technological and social) designed to optimize student learning. A study conducted at two online colleges suggests that DAACS is associated with increased student performance and more accurate predictions of student success.

Presenters

Dr. Bryer is currently Executive Director at Excelsior College and Principal Investigator of a $3 million FIPSE First in the World grant to develop and test the Diagnostic Assessment and Achievement of College Skills (DAACS). He is also an Adjunct Associate Professor at the City University of New York teaching statistics in the Master's of Data Science program as well as a Research Associate for New York State's implementation of Positive Behavior Intervention Supports (PBIS). He earned his doctorate in Educational Psychology and Methodology from the University at Albany in 2014 after working as a Software Engineer for 10 years. Jason's research interests include quasi-experimental designs with propensity score analysis, data systems to support formative assessment, and the use of open source software for conducting reproducible research. When not crunching numbers, Jason and his wife are wedding photographers and proud parents to three boys.


Extended Abstract

Institutions of higher education often base assessments of student readiness for college on placement exams in reading, writing, and mathematics (Bailey & Cho, 2010; Belfield & Crosta, 2012). These types of assessments are helpful in identifying students who may be at-risk academically and for facilitating placement of students in remedial or basic courses. However, placement exams have limitations. They do not generate individualized feedback regarding students’ academic strengths and weaknesses, nor do they empower students to access resources that can ease the transition into college. Additionally, typical placement-based assessments fail to provide much, if any, information about the quality of other academic competencies needed to succeed in college, including and especially self-regulated learning, which is a malleable skill linked to student success (Zimmerman, Moylan, Hudesman, White, & Flugman, 2011; Zimmerman & Schunk, 2011).

To address the shortcomings of traditional placement exams, a no-stakes assessment and feedback system called the Diagnostic Assessment and Achievement of College Skills (DAACS) was created. The primary goals of DAACS, which consists of reliable and valid measures of core academic (i.e., reading, mathematics, and writing) and self-regulated learning (SRL) skills, are to generate information that newly enrolled college students can use to enhance their awareness of personal strengths and areas for improvement, and to translate their results into actionable steps for optimizing success in college, such as accessing technological and social supports. By assessing and providing actionable feedback about three core academic areas and the SRL skills that often underlie student success, it was hypothesized that DAACS would help students improve their skills in managing the challenges of higher education and, ultimately, have more positive achievement-related outcomes (Johnson, Rochkind, Ott, & DuPont, 2011). A second goal of DAACS, as an institution-wide intervention, is to improve predictive models of student success, which can in turn inform interventions tailored to student needs.

Structure of the DAACS System

DAACS is a suite of open source, online assessments and supports (both technological and social) designed to optimize student learning (see demo.daacs.net). It has four major components, as illustrated in Figure 1, designed to change behavior that drives academic success by providing contextualized feedback and resources (e.g., recommendations for strategy use, open educational resources) to build students’ self-awareness, strategic skills, and motivation. Consistent with the formative feedback literature (e.g., Hattie & Timperley, 2007; Shute, 2008; Wiliam & Thompson, 2007), DAACS feedback can increase student awareness of the discrepancy between their current and desired skill levels, and provide suggestions about how to improve. As a result, students have a greater likelihood of enhancing their performance. The following sections describe the four components of DAACS in more detail.

Component 1: Diagnostic Assessments

DAACS includes three disciplinary diagnostic assessments (reading, mathematics, and writing) and one general diagnostic assessment (self-regulated learning). Students take the DAACS assessments soon after enrollment but before they begin taking classes, generally during new student orientation. Unlike traditional placement exams, which provide only a pass/fail score and are used to place students into remedial courses, the four DAACS assessments provide students with information about their strengths and areas of weakness prior to starting their academic career so they can work on these areas while taking credit-bearing courses.

Component 1.1: SRL survey. The SRL survey consists of 47 Likert-type items adapted from established SRL measures (Cleary, 2006; Driscoll, 2007; Dugan & Andrade, 2011; Dweck, 2006; Schraw & Dennison, 1994). The 47 items cover three domains: metacognition, motivation, and learning strategies. The SRL survey has excellent psychometric qualities, suggesting that the inferences drawn from the survey scores are valid and reliable (Lui et al., 2018).

Component 1.2: Writing assessment. The writing assessment asks students to summarize their SRL survey results, identify specific strategies for improving their SRL, and commit to using them. Thus, the writing assessment not only assesses newly enrolled students’ writing skills, but also engages them in reflecting on and planning to develop their skills in SRL.

LightSide, an open source, automated essay scoring program, was trained to score the writing assessments in terms of nine criteria related to effective college-level writing (Yagelski, 2015) and provide students with feedback within one minute. Initial exact percent agreement between LightSide and human raters ranged from 47.17% to 73.45%, with an average of 66.34%. Adjacent score agreement was at least 95% for all nine criteria.
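The agreement statistics above can be illustrated with a minimal sketch. The scores and helper below are hypothetical (they are not DAACS data or LightSide code); exact agreement counts identical automated and human scores, while adjacent agreement counts scores within one rubric level of each other.

```python
def agreement_rates(auto_scores, human_scores):
    """Return (exact, adjacent) percent agreement for paired ordinal scores."""
    assert len(auto_scores) == len(human_scores) and auto_scores
    n = len(auto_scores)
    # Exact agreement: the two raters assigned the identical score.
    exact = sum(a == h for a, h in zip(auto_scores, human_scores))
    # Adjacent agreement: scores differ by at most one rubric level.
    adjacent = sum(abs(a - h) <= 1 for a, h in zip(auto_scores, human_scores))
    return 100.0 * exact / n, 100.0 * adjacent / n

# Invented example scores on a 1-3 rubric for eight essays.
auto = [2, 3, 1, 2, 3, 2, 1, 3]
human = [2, 2, 1, 3, 3, 2, 2, 3]
exact_pct, adjacent_pct = agreement_rates(auto, human)  # 62.5, 100.0
```

As in the reported results, adjacent agreement is necessarily at least as high as exact agreement, since every exact match is also an adjacent match.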

Components 1.3-1.4: Mathematics and reading assessments. The mathematics and reading assessments are computer-adaptive tests with 18 to 24 multiple-choice items adapted from state-mandated high school English language arts and mathematics exams that are useful for identifying college readiness (Han, 2003; Jirka & Hambleton, 2005; Massachusetts Department of Elementary and Secondary Education, 2017; New York State Education Department, 2014a, 2014b). The DAACS reading and mathematics assessments have acceptable psychometric properties, including Cronbach’s alphas of 0.67 and 0.69, respectively.
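For readers unfamiliar with the reliability statistic cited above, Cronbach's alpha can be computed from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below is illustrative only (the item data are invented, not DAACS item responses):

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for item_scores: a list of items,
    each a list of per-student scores (e.g., 0/1 for multiple choice)."""
    k = len(item_scores)
    # Population variance of each item across students.
    item_vars = [statistics.pvariance(item) for item in item_scores]
    # Each student's total score across all items.
    totals = [sum(student) for student in zip(*item_scores)]
    total_var = statistics.pvariance(totals)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Two identical items yield perfect internal consistency.
perfect = cronbach_alpha([[1, 0, 1, 1], [1, 0, 1, 1]])  # 1.0
```

Values in the high 0.6 range, as reported for the DAACS reading and mathematics assessments, are generally considered acceptable for short, low-stakes diagnostic instruments.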

Component 2: Feedback and Resources

Two aspects of DAACS are designed to promote self-directed learning: 1) the immediate feedback students receive upon completing the diagnostic assessments, and 2) links to Open Educational Resources (OERs) related to individual students’ results.

Component 2.1: Immediate feedback. The feedback and resources provided to students by DAACS are an especially powerful and unique aspect of its design. Moments after completing the four assessments, students receive not only their scores but also feedback about their performance, along with links to, and encouragement to use, relevant open educational resources to improve their skills.

Component 2.2: Open Educational Resources. Two new OERs were created for DAACS: the SRL Lab (srl.daacs.net) and the Reading Comprehension OWL (owl.excelsior.edu/orc). A library of pre-existing math-related OERs has also been curated and continues to be updated. Institution-specific resources, such as the Online Writing Lab, are also linked to relevant feedback.

Component 3: Academic Advising with DAACS Results

Students in postsecondary education are typically assigned an academic advisor who assists them in course planning and helps them solve problems. DAACS was designed to be an advising tool that enables advisors to access information about students’ academic strengths and weaknesses and use that information to focus advising conversations and help students set actionable goals that can lead to college success. For the purposes of the study reported here, academic advisors were provided with at least six hours of training, links to their students’ DAACS results, and a guide to prioritizing students’ needs and connecting them to useful resources.

Component 4: Predictive Modeling

As institutions serve more students with fewer resources, it is critical to be able to identify academically at-risk students early in their programs and provide robust academic and motivational supports. To date, the inclusion of DAACS data in predictive models of student success in their first term increases the accuracy of models by as much as 6% over baseline models currently used by the two collaborating institutions. This information can be utilized by institutions to prioritize outreach to students and/or monitor students as they begin coursework.

Research Study

A large randomized controlled trial was conducted with undergraduate students at two private, nonprofit, online colleges (n=21,381) enrolling between April 15, 2017 and December 31, 2017. Students were randomized into one of two versions of the orientation course for newly enrolled students: with DAACS (treatment; n=10,634) and without DAACS (control; n=10,747). As incoming students, all students were required to complete orientation before beginning courses. Students in the treatment condition (1) took all four DAACS assessments, (2) received individualized feedback, (3) received suggestions based on their SRL results (see Table 2), and (4) were assigned to one of approximately 350 academic advisors trained to use DAACS information during their advising.

Results

Chi-square tests of independence were performed to test the relationship between treatment assignment and on-time progress. The relationship was statistically significant at Institution A (χ² = 4.14, p = 0.04) but not at Institution B (χ² = 3.38, p > 0.05). There was no statistically significant difference in course success rate (measured as the ratio of credits earned to credits attempted) at either Institution A (t(3504) = -1.42, p > 0.05) or Institution B (t(12495) = 1.13, p > 0.05).
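A chi-square test of independence like those reported here can be sketched from a 2x2 contingency table of treatment assignment by outcome. The counts below are invented for illustration and are not the study's data:

```python
def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 table of observed counts,
    computed against expected counts under independence."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Rows: treatment, control; columns: on-time, not on-time (made-up counts).
table = [[30, 70], [50, 50]]
stat = chi_square_2x2(table)  # ~8.33, exceeds the 3.84 cutoff at p = .05, df = 1
```

With one degree of freedom, a statistic above 3.84 corresponds to p < 0.05, which is why the reported χ² of 4.14 is significant while 3.38 is not.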

Post-hoc correlational analyses were conducted to examine the relationships between student success and the use of the three main DAACS treatment components detailed in Figure 1. Students who utilized DAACS feedback were more likely to complete their first six months of coursework on-time (Institution A: χ² = 3.9, p < 0.05; Institution B: χ² = 51.9, p < 0.01) and were more successful in earning credits (Institution A: t(2590) = 8.4, p < 0.01; Institution B: t(10378) = 20.6, p < 0.01).

Discussion

This proposal is part of a large-scale, field-based experiment involving the development, implementation, and evaluation of DAACS. Given that DAACS is an open source assessment-to-feedback system and integrates academic and SRL skills, it has the potential to enhance the academic and regulatory skills of a large number of college students without expensive and time-consuming remediation courses. This proposal represents an initial attempt to provide empirical evidence for the effectiveness of DAACS in improving critical success outcomes.