Development of the Student Online Learning Readiness (SOLR) Instrument: Focused on Social, Communication, and Technical Competencies

Concurrent Session 2


Brief Abstract

The Student Online Learning Readiness (SOLR) instrument developed and validated in this study can serve as a guide for measuring student competencies in online learning and for identifying which components should be included in orientations or support programs to enhance those competencies.

Presenters

Assistant Professor of Instructional Design and Technology at the School of Continuing and Professional Studies (SCPS), University of Virginia

Extended Abstract

1. Introduction

       Researchers have made ongoing efforts to measure student readiness in online learning (Dray, Lowenthal, Miszkiewicz, Ruiz-Primo, & Marczynski, 2011; McVay, 2001; Parnell & Carraher, 2002; Smith, 2005; Watkins, Leigh, & Triner, 2004), and a number of student readiness instruments have been used in higher education (Bernard, Brauer, Abrami, & Surkes, 2004; Dray & Miszkiewicz, 2007; Kerr, Rynearson, & Kerr, 2006; Mattice & Dixon, 1999; McVay, 2001; Parnell & Carraher, 2003; Watkins et al., 2004). In addition, previous research has supported positive relationships between student readiness and academic achievement in online learning (Bernard et al., 2004; Dray et al., 2011; Kerr et al., 2006). The importance of adequate social and academic support has been highlighted as a way to enhance students’ sense of belonging in online learning, both for more meaningful learning experiences and for higher retention rates (Ali & Leeds, 2009; Atchley, Wingenbach, & Akers, 2012). For these reasons, the purpose of this study is to develop a more specific instrument designed to measure student readiness in online learning, with a focus on social, communication, and technical competencies. A new instrument to measure distance learners’ online learning readiness is significant for the future of the field and will provide useful, practical suggestions for administrators and educators in higher education as well as for distance learners themselves.

The specific research questions addressed in this study are:

1. Which set of items should be included in the final instrument, based on analyses of the psychometric properties of the developed measures of social competencies, communication competencies, and technical competencies?

2. What reliability and validity evidence supports the developed instrument for measuring social competencies, communication competencies, and technical competencies?

 

2. The Student Online Learning Readiness (SOLR) Model

       The theoretical framework for the Student Online Learning Readiness (SOLR) Model is derived from Tinto’s (1975) Student Integration Model (SIM). Across his studies, Tinto demonstrated the significance of students’ social integration for increasing student retention in higher education by enhancing academic integration and helping students form learning communities (Tinto, 1975; Tinto, 1998; Tinto, 2000; Tinto, 2005; Tinto, 2006; Tinto, 2008). According to Tinto, the main component of social integration is the quality of students’ interactions with instructors and classmates (Tinto, 1975; Tinto, 2000; Tinto, 2005; Tinto, 2006). He also stressed the positive effect of social support on student retention (Tinto, 1975; Tinto, 1998; Tinto, 2000; Tinto, 2005; Tinto, 2008). Tinto’s (1975) Student Integration Model was grounded in the traditional face-to-face classroom setting, but its principles remain applicable to learners in online environments. For this reason, to extend Tinto’s social integration to the online learning environment, Yu (2014) proposed the Student Online Learning Readiness (SOLR) Model as a new conceptual model for student retention in online learning, as shown in Figure 1.

 

3. Methods

3. 1. Research Context

       A survey was created and administered using the Qualtrics program, and the survey links were distributed through Blackboard Learn in the spring 2014 and spring 2015 semesters. For exploratory factor analysis (EFA) and reliability analysis, twelve online courses at a large Midwestern university were selected across program areas. For confirmatory factor analysis (CFA), twenty-six online courses offered by the Extended Campus at the same university were selected. The response rate was 20.61%. The survey links were posted in the announcement folder of each online course. The Extended Campus offers online courses across the university system and supports online course development by providing one-on-one instructional design consulting. Data were checked for duplicate responses by comparing participating students’ names and email addresses, and duplicate responses were removed.

 

3. 2. Data Collection

3. 2. 1. Student Online Learning Readiness (SOLR) Survey

       The SOLR survey instrument (Yu, 2014) was administered to students to gather data using an online survey program. From the review of literature, 22 self-reported items were selected for exploratory factor analysis (EFA) and reliability analysis, and 20 items were retained for confirmatory factor analysis after the EFA. All SOLR survey items were measured on a 5-point Likert scale (1 = Disagree, 2 = Tend to disagree, 3 = Neutral, 4 = Tend to agree, 5 = Agree).

 

3. 3. Data Analysis

3. 3. 1. Statistical evidence of validity with Exploratory Factor Analysis (EFA).

       Exploratory factor analysis (EFA) is a statistical method that increases the reliability of a scale by identifying inappropriate items that can then be removed. It also identifies the dimensionality of constructs by examining relationships between items and factors when information about the dimensionality is limited (Netemeyer, Bearden, & Sharma, 2003). For this reason, EFA is performed in the early stages of developing a new or revised instrument (Wetzel, 2011). Before performing EFA, the measurement appropriateness of the 22 survey items was evaluated using descriptive statistics. After the normality of the distribution was confirmed, the exploratory factor analysis was conducted using the Statistical Package for the Social Sciences (SPSS, version 22).
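The study ran its EFA in SPSS; as a minimal sketch of the same step in Python, the example below (using scikit-learn rather than SPSS, with simulated responses standing in for the 22 Likert-scale items) fits a four-factor model with varimax rotation and extracts the item-by-factor loading matrix:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students, n_items, n_factors = 300, 22, 4

# Simulated data (hypothetical, standing in for the 22 survey items):
# latent factor scores plus a sparse true loading pattern and noise.
scores = rng.normal(size=(n_students, n_factors))
true_loadings = np.zeros((n_items, n_factors))
for j in range(n_items):
    true_loadings[j, j % n_factors] = 0.8
X = scores @ true_loadings.T + 0.5 * rng.normal(size=(n_students, n_items))

# Fit a 4-factor model with varimax rotation, as in a typical EFA.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(X)
loadings = fa.components_.T  # rows = items, columns = factors
print(loadings.shape)        # (22, 4)
```

In practice, items with low or cross-loading patterns in this matrix would be candidates for removal, which is how the study arrived at its final 20 items.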

 

3. 3. 2. Reliability analysis.

       The reliability of an instrument or questionnaire is concerned with the consistency, stability, and dependability of the scores (McMillan, 2007). For this reason, the internal consistency was tested using Cronbach’s alpha for each competency in SPSS.
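Cronbach's alpha itself is straightforward to compute from a respondents-by-items score matrix; the following is a minimal sketch in plain NumPy (rather than SPSS, which the study used):

```python
import numpy as np

def cronbach_alpha(items) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Two perfectly correlated items give the maximum internal consistency.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Alpha near 1 indicates high internal consistency; the study reports α > .823 for every factor of the final instrument.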

 

3. 3. 3. Confirmatory Factor Analysis (CFA).

       Confirmatory factor analysis (CFA) was conducted to verify the Student Online Learning Readiness (SOLR) instrument using Linear Structural Relations (LISREL, version 8.8). The main purpose of running CFA is to examine the relationships among latent and manifest variables as supported by logic or theory (Schreiber, Stage, King, Nora, & Barlow, 2006). Multiple goodness-of-fit indices have been developed, such as the comparative fit index (CFI), the goodness-of-fit index (GFI), and the root mean square error of approximation (RMSEA). CFI and GFI range from 0 to 1, and values closer to 1 indicate a closer correspondence between the observed and model-implied variances and covariances (Schreiber et al., 2006). According to previous research, a CFI above .95, a GFI above .90 (Hu & Bentler, 1999), and an RMSEA below .05 (Browne & Cudeck, 1993) each indicate excellent model fit. Hu and Bentler (1999) also recommended reporting the incremental fit index (IFI) to identify the degree of model fit.
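Of these indices, the RMSEA can be computed directly from the model chi-square, its degrees of freedom, and the sample size. The sketch below uses one common formulation (dividing by N − 1; some software divides by N) with hypothetical values, not the study's reported statistics:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation from the model chi-square.

    One common formulation: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    """
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical example: chi-square of 180 with df = 164 and N = 347.
print(round(rmsea(180.0, 164, 347), 3))

# When chi-square does not exceed df, RMSEA is 0 (perfect approximate fit).
print(rmsea(164.0, 164, 347))  # 0.0
```

Values below .05 would fall in the "excellent fit" range cited from Browne and Cudeck (1993).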

 

4. Results

       As a result of the exploratory factor analysis (EFA), a four-factor structure of the student readiness instrument explained 66.69% of the variance in the pattern of relationships among the items. All four factors had high reliabilities (all Cronbach’s α > .823). Twenty items remained in the final questionnaire after deleting two items that cross-loaded on multiple factors (social competencies with classmates: 5 items; social competencies with instructor: 5 items; communication competencies: 4 items; and technical competencies: 6 items). As a result, the four-factor structure of the Student Online Learning Readiness (SOLR) instrument was confirmed through this study.

       The confirmatory factor analysis (CFA) results indicated that the hypothesized 20-item structure of the SOLR instrument was verified as a good fit for the data (χ2 (164, N=347)=1959.94, p<.001, IFI=.81, CFI=.81, GFI=.55, RMSEA=.016). The variance/covariance matrix for the 20 items is presented in Table 3. The obtained t values were significant at p<.001 because they ranged between 9.31 and 17.09, which is greater than 3.29 (Hatcher, 1994). As shown in Figure 2, the completely standardized loadings ranged between 0.47 and 0.77. Finally, the results of the CFA confirmed the fit between the proposed model and the observed data.

 

5. Conclusion

       This study provides several implications for research. First, the effect of social integration in Tinto’s (1975) Student Integration Model (SIM) has been verified in the previous literature in the United States; this study thus contributes to extending the SIM research area to online learning environments. Second, the results confirm that the Student Online Learning Readiness (SOLR) instrument can be used to measure students’ level of readiness before they take an online course. Third, this study provides a reliable instrument for researchers and practitioners in higher education to measure their students' social, communication, and technical competencies in online learning.

       This study also provides two suggestions for practice. First, it offers a way to consider which psychometric properties should be measured to better understand students’ social readiness in online learning. It is true that technological factors such as computer skills, Internet connection, and the ability to navigate a Learning Management System (LMS) have an impact, because they are core components of the online learning environment. However, technological skills alone will not guarantee an improved learning experience. Although the online learning environment differs from the traditional face-to-face classroom, instructors and students still play the main roles in the learning process of an online course. For these reasons, educators and administrators in higher education need to pay more attention to distance learners’ competencies in online learning (e.g., social competencies, communication competencies, and technical competencies).

       Second, this study offers suggestions regarding the kinds of support distance learners need to succeed in online learning. To improve the lower retention rates in online learning, institutional supports such as freshman orientation before taking an online course are significant (Ali & Leeds, 2009; Cho, 2012; Lee & Choi, 2011). The Student Online Learning Readiness (SOLR) instrument developed and validated in this study can serve as a guide for measuring student competencies in online learning and for deciding which components should be included in orientations or support programs to enhance those competencies.