Learning Effectiveness

LEARNING EFFECTIVENESS means that learners who complete an online program receive an education that represents the distinctive quality of the institution. The goal is that online learning be equivalent to or better than learning through the institution's other delivery modes, in particular its traditional face-to-face, classroom-based instruction. The course or program is designed to be at least equivalent in quality to face-to-face courses offered at the same institution; if there is no comparable face-to-face course, the institution's normative benchmark applies. The learning resources in online courses generally include the same ones found in the institution's traditional face-to-face courses: learning media (books, notes, software, CD-ROMs, and so on); faculty who teach the class and are available outside of class; and learners who interact with the faculty and with each other. Because of technology, online courses are usually enhanced by resources available over the Internet and/or designed for computer presentation. Metrics demonstrate that the quality of learning online is at least as good as that provided through the institution's traditional programs, as measured by several means: faculty perception; outcomes assessments; career, scholastic, and professional achievement surveys and records; feedback from employers; and institutionally sustained, evidence-based, participatory inquiry into how well online programs achieve learning objectives. Online learning generally parallels the quality of face-to-face learning, with equivalent content, standards, and support services. Online curricula are subject to, and thereby benefit from, the same practices, processes, and criteria that the institution applies to traditional forms of instruction.

Author Information
Author(s): 
Dr. Cheryl Schauer-Crabb
Institution(s) or Organization(s) Where EP Occurred: 
Elon University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

In response to increased student demand in 2001 for courses that would supplement the standard curriculum on Elon University’s residential campus, the registrar and disciplines across campus discussed how best to serve students online during the summer hiatus. The Summer College @Elon pilot program began in 2002 with 12 courses and has grown to 65. Since its inception, courses have been offered only during the summer and only to Elon undergraduates. As the pilot flourished and became a pillar of Elon’s summer course offerings, the training program evolved to meet faculty needs.

This cost-effective, replicable and scalable multidimensional model for training faculty to teach online incorporates consultations, conversations, a self-paced Moodle course, peer mentoring, and technology support. The focus of our multidimensional model is on acquiring skills critical to teaching online. Estimated cost for a cohort of ten faculty is $1,000; this includes eight lunch meetings and individual coffee consultations.

Faculty members are supported by instructional designers, instructional technologists, multimedia developers, videographers, and e-learning specialists. The support team creates video and multimedia tools for course engagement and assessment. In addition, they provide guidance and train faculty in new technologies for course integration (e.g., screencasting, WordPress, audio and video editing, PowerPoint).

Additionally, equipment is reserved for online faculty specifically for course enhancement (e.g., webcams, microphones, portable flip video kits, tablets, styli); other mechanisms are in place on campus for equipment purchase. This supportive infrastructure is a critical element of the training efforts by Elon University’s Teaching and Learning Technologies.

Seven mechanisms are used to ensure effectiveness:
1. assessment in the Moodle course
2. individual faculty meetings to discuss the training program and student assessment results
3. focus groups
4. blog readership
5. the number of support calls the Technology Help Desk received
6. course offerings
7. faculty retention, with a 99% rate of return to the program

Description of the Effective Practice
Description of the Effective Practice: 

In response to increased student demand in 2001 for courses that would supplement the standard curriculum on Elon University’s residential campus, the registrar and disciplines across campus discussed how best to serve students online during the summer hiatus. The Summer College @Elon pilot program began in 2002 with 12 courses and had grown to 65 courses by 2014. Since its inception, courses have been offered only during the summer and only to Elon undergraduates. As the pilot flourished and became a pillar of Elon’s summer course offerings, the training program evolved from two instructional designers consulting individually with each faculty member to a multidimensional process.

This cost-effective, replicable, and scalable multidimensional model for training faculty to teach online incorporates consultations, conversations, a self-paced Moodle course, peer mentoring, and technology support. Faculty members who teach online for Elon University, regardless of previous online teaching experience, complete a four-month blended training program and are required to have taught the course previously in a face-to-face environment. The focus of our multidimensional model for training is on acquiring skills critical to teaching online through individual guidance, discussion, application activities, pedagogical exploration, and technology mastery. There are five segments:

1. consultations
2. lunch conversations
3. a self-paced asynchronous Moodle course
4. peer mentoring
5. technology support

The pedagogy of teaching online and Moodle training are combined into this comprehensive program. A description of each segment follows.

Consultations
The role of the instructional designer is critical to the multidimensional model of online training. The close working relationship between the instructional designer and faculty members begins nine months before the course is taught and reinforces the fact that support is available when needed.

Faculty interact with the instructional designer regularly and receive personalized feedback and suggestions regarding student evaluations, course modifications, feedback on assignments and modules, and the organizational framework of the course. Individual consultations also help to ensure consistency between courses and across programs since there is no standardization for courses on campus. The Quality Matters course design standards serve as a baseline for consultations and course modification.

Faculty are encouraged to keep a journal noting their thoughts regarding revisions for the following iterations of the course, hurdles, and how assignments or topics could be modified for future consultations with the instructional designer. This serves as a springboard for conversations and anecdotal evidence of transformative teaching.

Lunch conversations are modeled after Vanderbilt’s Center for Teaching Course Design Working Groups:
http://cft.vanderbilt.edu/guides-sub-pages/course-design/. They draw on Grant Wiggins and Jay McTighe’s book, Understanding by Design (2005). Backwards design and student-centered learning environments offer a powerful framework: courses are designed around the “enduring understandings” students should take away, and the evidence of that understanding is then designed by working backwards.

Sessions occur from February to May, a timeframe suggested by faculty as most convenient. Conversations explore course development, design, and assessment in depth, and compare traditional face-to-face teaching with online pedagogy. These collaborative cross-discipline conversations provide faculty with continuing opportunities to learn from each other. The sessions are kept consistent with regard to content, requirements, deliverables, and activities from year to year to ensure consistent outcomes.

Additionally, the topics covered have a practical, applied emphasis: syllabus construction, effective quiz question development, best practices in design, implementation, assessment, activities modification, integration of web and library resources, classroom and time management, virtual guest speakers, effective faculty and student communication, multimedia integration, and rubrics. Each lunch conversation provides opportunities for the new-to-online faculty to showcase portions of their course under modification. There is no standardization of courses on campus, but each online course meets benchmark standards in technology, interaction, and assessment.

To celebrate faculty completing their first online course, the last lunch meeting showcases their deliverables. The online community is invited to the event, as are deans and departmental chairs.

Moodle complements the overall training experience with application and experimentation assignments. In this self-paced asynchronous site available February through May, in-depth discussions are facilitated on topics covered in the face-to-face conversations. The cohort contributes to dynamic discussions where they reflect on the relevance and application of the course material to their own teaching situations, and key issues about teaching and learning online based upon current literature. Teaching Online: A Practical Guide, by Ko and Rossen (2010) is one springboard for discussion in the Moodle forums.

The site has modules for novice, intermediate, and advanced Moodle users to progress through, culminating in an assessment. The course illustrates, demonstrates, and discusses advanced teaching strategies, challenges, best practices, current research, and trends. Included are videos and examples, featuring our own faculty, that show how to use specific techniques in online teaching.

Throughout the final weeks of the training course, faculty construct their courses in Moodle: writing discussion questions, constructing activities aligned with the learning outcomes to assess student achievement, embellishing with screencasts and videos, integrating multimedia developed by the training team, and finding online tools and resources to repurpose. Completion of the Moodle training course assures faculty they have built a solid foundation for their own online course. Additionally, faculty have experienced learning online, contributed to discussions, viewed grades, submitted assignments, completed tests, and navigated the site successfully.

Peer mentoring and review parallel the pedagogical philosophies of online teaching and learning communities. Peer review of teaching is a widely accepted mechanism for promoting and assuring quality academic work.

The mentor selection process is subjective; the instructional designer identifies and invites two faculty members to serve as mentors to the online community with each new cohort. Invitations are extended to faculty based upon the online student feedback survey results (which are discussed individually in consultations with the faculty), creativeness of technology usage, and success of the course. Mentor responsibilities include:

• contribute to the Moodle discussions
• share resources and open online course for others to explore
• facilitate one lunch meeting
• serve as a point of contact for questions, advice about teaching, pedagogy, and Moodle
• integrate a new technology tool into the course
• review online courses and provide feedback

Peer mentoring is the conduit between the lunch conversations and the Moodle course. Faculty are paired with an online mentor with whom they can confer and conduct course review. This component of the training program affords faculty the opportunity to think carefully about the best way to transform their face-to-face strategies and practices to the online environment.

Technology support
Faculty are supported by instructional designers, instructional technologists, multimedia developers, videographers, and e-learning specialists. This supportive infrastructure is a critical element of the training efforts by Elon University’s Teaching and Learning Technologies. The support team creates video and multimedia tools for course engagement and assessment.

In addition, the support team trains faculty in new technologies for course integration outside of the five-part training program, including screencasting, WordPress, audio and video editing, PowerPoint, and other tools as requested. Additionally, equipment is reserved for online faculty specifically for course modification (e.g., webcams, microphones, portable flip video kits, tablets, styli); other mechanisms are in place on campus for equipment purchase.

Faculty and students are supported through email, Moodle, and 24/7 assistance from the Technology Help Desk should any technology issues arise during the summer online program. Faculty and student inquiries have decreased significantly over the past seven years: this year there were fewer than 8 support calls placed to the Technology Help Desk, compared with more than 40 seven years ago.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

At Elon the multidimensional model has been successful in preparing faculty for their first online teaching experience. The training provides faculty with a solid understanding of online education while simultaneously pairing them with experts to assist with course modification and the adjustment to the online environment. The continuous technology support ensures that faculty receive just-in-time assistance as well as group training. The online courses meet the same benchmarks and expectations as the residential courses. A well-trained and supported online faculty is an important component of online education and encourages faculty success.

Seven mechanisms are used to ensure effectiveness:
1. assessment in Moodle
2. individual faculty consultations to discuss student assessment results
3. focus groups
4. blog readership
5. number of support calls the Technology Help Desk received
6. course offerings
7. faculty retention

Assessment in Moodle
Each section in the self-paced Moodle course is evaluated and faculty score high on the assignments and quizzes. Evaluation questions assess course content, technology, design, specific exercises from the lunch sessions, and assignments.

Individual faculty consultations to discuss student assessment results
At the conclusion of summer college, the online facilitator meets with each faculty member to assess their overall experience online, the information in their journal, the effectiveness of the training, and the feedback from the student assessment survey. Feedback from students and faculty has been positive and consistent for the past seven years.

Focus groups
Faculty overwhelmingly indicated during the focus groups that learning to organize and section their content into varied modes of delivery, having the ability to practice new strategies with technology support readily available, and having access to mentors contributed to overall teaching effectiveness and higher levels of confidence in the virtual environment. Interestingly, faculty also reported that the training prepared them “exceptionally well” for the experience and that it changed the way they organize and facilitate their face-to-face courses.

Blog readership
Elon relies on technology early adopters to provide inspiration among the faculty. The Technology Blog (blogs.elon.edu/technology/author/ccrabb/) features innovative faculty experimenting with technology, discovered through individual consultations and interviews, as well as information about teaching online drawn from current research and trends.

Each lunch meeting is written up as a post for future online faculty and for those interested in adding online components to their face-to-face courses. Blog analytics and retweets indicate significant internal and external readership following these posts.

Course offerings
The true testament to the program’s effectiveness and success is the continued expansion of summer college and the influx of new faculty applying to teach. In 2002 the pilot program had 12 online courses; in 2014, 65 were offered.

Faculty retention
The effectiveness of the training program is indicated by the 99% return rate of instructors to the summer online program. Faculty members with full-time rank are eligible to teach in the program.

How does this practice relate to pillars?: 

Elon’s training program relates directly to Learning Effectiveness, specifically faculty development and course design.

Faculty with varying technology skills and comfort levels need a supportive infrastructure as they learn to teach online and acquire enhanced pedagogical skills for online learning environments. Teaching and Learning Technologies provides extensive technical and supportive resources, along with a continuous consultation and evaluation process conducted by an instructional designer, to ensure mastery of the skills required for effective online teaching. The same instructional designer facilitates continuing conversation between and among faculty and academic support personnel.

A key component of the asynchronous learning is to help faculty understand what it means to be an online learner. The online discussions are built around planning, communication, evaluation, and course management. As the faculty progress through the Moodle course, they create components for their own courses and receive feedback from peers, mentors, and course facilitators. Through shared ideas, interaction with peers, mentoring, and discussions, faculty benefit from a sense of community and increased confidence.

Equipment necessary to implement Effective Practice: 

The room used for meetings is part of the Teaching and Learning Technologies department. It is outfitted with whiteboards, a projector, laptop hook-up cables, audio, internet, and a Smartboard. Furniture is flexible, allowing for multiple room configurations, and lighting is adjustable for videography. Software used was free or already available (e.g., Screencast-O-Matic, Audacity, PowerPoint).

Estimate the probable costs associated with this practice: 

Training faculty in annual cohorts of approximately 10 is scalable and replicable. An initial investment of time and resources must be made to design and create content for the training program. Costs will vary depending on the availability of existing training content, staff experience creating content, and the scope of the training course. All members of our team subsumed their online support efforts within their existing job responsibilities.

Criteria to teach online include that the course has previously been taught in a face-to-face environment and that the faculty member holds full-time rank. This significantly decreases the resources needed for course design.

All lunch conversations address course development and modification. The estimated cost for eight lunch meetings, focus group conversations, and individual consultations over coffee is $1,000.

References, supporting documents: 

Boettcher, J. V., & Conrad, R. M. (2010). The Online Teaching Survival Guide: Simple and Practical Pedagogical Tips (1st ed.). San Francisco: Jossey-Bass.
Ko, S., & Rossen, S. (2010). Teaching Online: A Practical Guide (3rd ed.). New York: Routledge.
Quality Matters. (2010). Quality Matters Program. Retrieved from https://www.qualitymatters.org/rubric
Wiggins, G., & McTighe, J. (2005). Understanding by Design (2nd ed.). Alexandria, VA: ASCD.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. Cheryl Schauer-Crabb
Email this contact: 
ccrabb@elon.edu
Author Information
Author(s): 
Bevin Clare
Institution(s) or Organization(s) Where EP Occurred: 
Maryland University of Integrative Health
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

A peer-case-based learning course was converted to serve a blended learning clinical program. The course objectives drive a peer-to-peer learning experience, with faculty serving as “guides” in the process rather than as the experts. This delivery emphasizes the impact of student-generated ideas and critical thinking and minimizes the common assumption that there is a “right” way to approach individualized client care. Peer-led experiences are the focus of this course, and the resulting learning environment is intended to serve both students and faculty.

Description of the Effective Practice
Description of the Effective Practice: 

A peer-to-peer, case-based learning course was converted to serve a blended learning clinical program. The conversion of the program containing the team-taught case-based learning class was conducted between 2013 and 2014. Prior to conversion, this on-campus case-based learning course had been offered for close to a decade with minimal adaptation. Upon conversion of our clinical program to a blended program, this course underwent significant alteration with contributions from faculty and students previously involved in the course.

The course was held online, as student clinicians were also onsite for their clinical internships. It ran as a 12-week course, with eight faculty members each leading specific weeks. Each week, a detailed biomedical case was provided by the faculty member and a “lead student” was assigned. The lead student reviewed the case prior to the start of the week and assigned a detailed question about the case to each of her peers. By the middle of each week, each student posted a thorough response to the forum, using adequate biomedical references, as well as a response to a peer’s posting. In practice, the forums became busy areas of conversation rather than the expected single response.

Concurrently, in a separate forum, the lead student also posted their perceived assessment of the case as well as their clinical care goals (both long and short term). Each non-lead student then posted their own highly specific recommendations for the client, including dietary, lifestyle, and therapeutic medicinal prescriptions. Peer critique and response postings were also required.

Faculty contributed to both forums, moderating the biomedical question responses and adding to, or correcting if needed, the student responses. They also commented on the goals and the plans and ultimately contributed their own strategies to the mix.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Please see the attached supportive documentation for evidence as well as the following summary.

Overview
The conversion of the program containing the team-taught case-based learning class was conducted between 2013 and 2014. Prior to conversion, this on-campus case-based learning course had been offered for close to a decade with minimal adaptation. Upon conversion of our clinical program to a blended program, this course underwent significant alteration with contributions from faculty and students previously involved in the course.
A survey of the students in the newly designed course was conducted, as was a survey of the faculty team (many of whom were new to any type of alternative delivery), who were able to compare, from their perspective, the new adaptation to the old format.

Student Survey Results
Nine of the twelve students participating in this course responded to the survey. Overall, all students reported learning “Significantly” or “Immensely” from 1) their peers, 2) their own research, and 3) writing answers to the peer-generated questions. No students reported learning “Minimally” or “Nothing” in any of the categories listed, although two students reported learning only “Somewhat” from the faculty (as this is a peer-driven course, this is partially expected and is consistent with previous feedback).
Being assigned a question by their peers was rated an effective way to learn by 100% of the responders, and assigning questions to their peers was seen as an effective way to learn by 89% of them. Central to peer-based learning, 95% of students reported reading all or most of their peers’ submissions in all categories.
Overall, 100% of the students felt this was an “effective way to explore case studies.”

Faculty Survey
Six faculty who team-teach this course were surveyed on their perception of the course, often in relation to their prior experience with the F2F version. All faculty had previously taught in this course, generally for many years. For many of them, it was their first experience with alternative methods of delivery.

All faculty agreed that students were learning “Immensely” or “Significantly” from 1) their peers, 2) their own research, and 3) writing their goals and plans. Answering peer-based questions was more controversial from the faculty perception (although not from the student perception).

Faculty were also surveyed about their own learning in the course, and 100% of them reported that they themselves learned from the answers students gave to the peer-generated questions. More than half of them also reported learning from the students’ questions, the goals and plans, and their own research and preparation.
In a direct comparison of the online and F2F versions of the course, faculty overwhelmingly reported that the online version was similar or better in all categories (except student engagement, which was scored, on average, similar to the in-class environment), particularly for biomedical understanding and peer-to-peer learning.

Overall, 100% of faculty felt that this was an effective way to explore case studies and 83% of them felt this was an overall “improved educational experience over the F2F version”.

How does this practice relate to pillars?: 

This practice relates to pillars in three areas.

Most importantly, the learning effectiveness of the peer-to-peer, case-based learning was critical. In comparison to the F2F environment, students were more equally engaged and had adequate time to bring in significant outside resources. In the surveys conducted, students and faculty overwhelmingly reported high scores for learning effectiveness.

Faculty satisfaction in this course was reflected by the surprisingly high number of faculty reporting significant learning beyond their more standard preparation for the course. In the survey, 100% of faculty reported that they learned from the replies to the peer-generated questions.

Lastly, student satisfaction was significant in this course, with 100% of students feeling this was an effective way to engage in case-based, peer-to-peer learning.

Equipment necessary to implement Effective Practice: 

An LMS or comparable online discussion forum.

Estimate the probable costs associated with this practice: 

N/A

References, supporting documents: 

Please see the attached document for the student and faculty surveys.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Bevin Clare
Email this contact: 
bclare@muih.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Lee Woodham Digiovanni, Ed.D.
Author(s): 
Elke Leeds, Ph.D.
Institution(s) or Organization(s) Where EP Occurred: 
Kennesaw State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

This student-connectivity-based instructional practice anchors the use of readily available technology to manage the learning experience and create multiple opportunities for student engagement. Through the purposeful selection of familiar communication technology applications, a student-to-instructor and student-to-student engagement framework was created to increase the level of student participation and engagement while avoiding student frustration with learning new technology. This practice was piloted in a two-course Master’s-level Educational Research sequence during the Summer and Fall semesters of 2012. It was confirmed in the same course sequence during the Summer and Fall semesters of 2013. Student feedback in module and course evaluations provided substantial support for the use of familiar communication/technology applications and satisfaction with course learning effectiveness.

Description of the Effective Practice
Description of the Effective Practice: 

Instructional technology offers many opportunities to enhance the online learning environment. With each new technological adoption, however, a learning curve is encountered by students and instructors that may offset the perceived benefit and impact the touted usefulness. Students new to online learning may be especially challenged as new technology is coupled with a new learning environment. Based on Knowlton’s (2000) framework and Puziferro & Shelton’s (2008) model for developing online courses, and supported by Bagozzi, Davis, & Warshaw’s (1992) Technology Acceptance Model, an interactional communication practice was devised combining the use of multiple communication/technology applications with online teaching effectiveness principles. Aimed at adult learners, the practice was envisioned to counteract technology learning frustration by intentionally and deliberately employing simple, existing technology as the most effective way to support the learning experience. A reduction in technological frustration was intended to increase the levels and frequency of engagement and thus reduce the need for student and instructor management of technology. The focus shifts to instructor-student communication, student satisfaction, learning effectiveness, and faculty satisfaction with online teaching and learning. By choosing to leverage familiar technologies, the instructor was able to elevate ease of use while demonstrating usefulness through improved and more frequent instructor-student communication and engagement.

According to Heuer & King (2004), the online learning instructor encompasses multiple roles - planner, model, coach, facilitator, and communicator:

"To compensate for the reduced sensory cues, the instructor must be adept at communicating with students, employing the full capabilities of the technology, and responding to individualized learning styles and motivation. The asynchronous nature of online instruction that permits anytime and anywhere computer-mediated communication calls on the instructor to develop strategies to manage 24/7 communications, maintain momentum of the dialogue over time, and foster communities of learners (Heuer & King, 2004, p. 8)."

This shift in instructor perspective means that one must become very intentional about interacting with students as well as providing opportunities for student contact and in turn being responsive to students as they have needs or concerns. The innovation in this practice is found not from a radical new technology piece, but in meeting students where they are, utilizing technology that they are already accustomed to using.

Through a combination of a syllabus that communicates expectations, course design that uses a similar structure throughout all modules to ensure learner comfort, and regular communication, students are scaffolded through the use of technologies as a means to support learning and communication in the course. Each of the communication elements in the student connectivity practice is described below. Together, they provide a comprehensive solution that addresses the communication and engagement needs of online learners while maintaining the focus on learning.

Twitter as a research tool - To help students get a sense of current conversations in the education community, students were shown through an instructor video how to use Twitter as a search engine to discover conversations and blog posts related to an area of interest for them. Students may have been familiar with Twitter as a way to communicate or receive information, but often were not familiar with leveraging it for professional learning.

Online office hours – Regular office hours achieved through Skype (although Google Hangout, or an online classroom could be used) allowed students to contact the professor with questions or concerns. Students were reminded of these office hours on a regular basis. Whenever the instructor was online attending to class matters, Skype was on, signaling to students that they were welcome to ask questions or check in.

Grading conferences – With pivotal assignments, scheduling grading conferences with students allowed an opportunity for the instructor to review work with the student, address any questions or concerns from either the instructor or the student, and provide direction to allow the student to move forward. These grading conferences were facilitated through a sign-up on Google Drive and carried out through Skype or phone, but could additionally be facilitated through other internet applications.

Learning Module Evaluations – Built into the online course were module evaluation surveys utilizing Google Forms to gather information about the student experience as the course was being taught. This evaluation allowed an anonymous look into the student experience. Posting regular updates about things learned from the evaluations, as well as making changes to the flow of the course as it unfolded based on feedback, allowed students to know that their voice counts in the quality of the instruction provided.

Text messages – Providing students with the instructor’s cell phone number allowed students to contact the instructor when they needed an answer sooner than email might allow. As online courses are frequently structured with the expectation that students are online at varying times based on their schedules, the same can be expected of instructors who facilitate online courses. Over the course of this study, students proved to be very respectful and only called or sent text messages when the instructor was not available online and they truly needed information to move forward.

Emails – The simple policy of checking and responding to messages in the online course environment several times each workday was employed to help ensure that any questions that might impede students’ learning progress were quickly addressed.

Discussion boards – Not only did students post on discussion boards, the instructor regularly posted as well, using this tool to let students know that they were on track and performing well, and/or to pose additional questions to change the focus of the discussion. Further, as common open-source technology tools were used in ways less familiar to students (for example, Twitter as a research tool), students had the opportunity to discuss what they learned and reflect on ways these tools could be used in their own practice.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

A combination of module and course evaluations were used to evaluate the effectiveness of the practice and the corresponding satisfaction with course. The module evaluations served a two-fold purpose: to determine the effectiveness of the practice and to inform the need for course modification on a rolling basis.

In the 2012 pilot, 81% of students provided module evaluations in the first semester, and 55% of students provided information during the second semester. A similar response rate was experienced in the 2013 study, with 85% of students providing module evaluations in the first semester and 60% providing module evaluations during the second. While both sets of module evaluations showed a decreased level of student feedback, quantitative data provided by students remained consistent in the high degree of satisfaction regarding the efficacy of the instructional materials and assignments. Qualitative data additionally pointed to efficacy of using open source and familiar technology:

* I am amazed at how valuable Twitter can be for educational purposes. I plan on using it regularly as a tool for finding ideas to use in my teaching.

* What I “took away” from this module was the effective use of Twitter as a research tool. I have never thought of using Twitter for research until this course. There were so many different hot topics about education with so many varying opinions. It was interesting to read what people are passionate about in education.

* Thanks to the phone conference I felt very confident as I was submitting both of my requests.

* I REALLY liked having the Skype date. Very helpful!!

* I felt that it was extremely helpful to have the conference call to make sure we are prepared and could ask questions.

Based on the very favorable results of the module evaluations in the pilot semesters, very few course modifications were made other than a few regarding course organization. Students repeatedly noted in the qualitative data gathered by course and module evaluations that the level of communication and feedback given by the instructor was one of the most important parts of their experience. Students additionally reported that the use of familiar technology was very effective in their learning experience and comfort. Several students offered additional feedback on the student connectivity practice:

* She welcomes us to call, text, or Skype with questions or concerns as we work through our projects, data, or research.

* These various tools for communication make the online experience much more personal.

* ...courses are geared to be friendly to the working professional. She always responds to your questions in a timely manner and is willing to work with her students.

Additionally, quantitative results from the course evaluations were compared for the two iterations of this study. The two-course sequence was taught by the same instructor over the same period of time using a consistent instructional delivery method. Course evaluations during the pilot study showed students rating instructor effectiveness at 3.69/4.0 (65% of students responding) in Summer 2012 and 3.9/4.0 in Fall 2012 (56% of students responding). When the practice was repeated in Summer and Fall 2013, students rated instructor effectiveness at 3.78/4.0 (53% of students responding) and 3.86/4.0 (56% of students responding) in the two sections of the first course, and 4.0/4.0 in Fall 2013 (30% of students responding). These data suggest that overall student satisfaction was high, in part because of the course design and the intentional use of familiar, open-source technology applications for course communication.

The results of the two-semester pilot strongly influenced the repeated use of the practice and led to its mainstream use across all course sections. Because the instructor modeled these tools, documented student satisfaction and performance in the degree program, and shared the course design and teaching practice, other instructors have sought this instructor's guidance in setting up their courses and have adjusted their own teaching by adopting the technology/communication framework. Most of the courses in this degree program now use much of this framework as a means of supporting students. Further, these practices have been highlighted in online course development workshops offered on campus, which use this instructor's courses as a model of effective online course design, and in TalonTips, a publication from Kennesaw State University recognizing excellence in Technology Enhanced Teaching and Learning (http://www.kennesaw.edu/dlc/TalonTips/Volume/TalonTipsVol5Issue1.pdf).

How does this practice relate to pillars?: 

This practice creates a modern approach by going ‘back to the basics’ of technology acceptance. The instructor leverages easy-to-use, already familiar open source communication applications to create a dynamic and highly engaged learning environment. The technology is readily accessible and the practice easily replicated. Student learning effectiveness is enhanced when the focus is placed directly on learning rather than on the tools used to facilitate it. Student satisfaction at the course and module level was high, and the practice can be deployed in any discipline. Faculty satisfaction in this study was also very high, because attention stayed on learning rather than on difficulties with the technology itself or on course-organization issues that raise questions about the course rather than its content. The practice was scaled to all online faculty by sharing it through workshops and a faculty publication, thus identifying and implementing effective practices.

Equipment necessary to implement Effective Practice: 

As these applications are typically open source technologies, there would be no additional equipment needed other than what is typically used to teach online courses.

Estimate the probable costs associated with this practice: 

As these applications are typically open source technologies, there would be no additional expense incurred other than normal operating expenses to teach online courses.

References, supporting documents: 

Anderson, T., Rourke, L., Garrison, D.R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5 (2).

Bagozzi, R.P., Davis, F.D., & Warshaw, P.R. (1992). Development and test of a theory of technological learning and usage. Human Relations, 45(7), 660-686.

Digiovanni, L.W. (2013). Connecting with online students. Talon Tips 5(1), 3. Retrieved from http://www.kennesaw.edu/dlc/TalonTips/Volume/TalonTipsVol5Issue1.pdf

Heuer, B. P., & King, K. P. (2004). Leading the band: The role of the instructor in online
learning for educators. The Journal of Interactive Online Learning, 3(1), 1-11. Retrieved from http://olms.cte.jhu.edu/olms/data/resource/5952/The%20Role%20of%20the%20...

Knowlton, D. S. (2000), A Theoretical Framework for the Online Classroom: A Defense and Delineation of a Student-Centered Pedagogy. New Directions for Teaching and Learning, 2000: 5–14. doi: 10.1002/tl.841

Kuyath, S. J., & Witner, S. J. (2006). Distance education communications: The social presence and media richness of instant messaging. Journal of Asynchronous Learning Networks, 10(4), 67-81.

Lewis, C.C. & Abdul-Hamid, H. (2006). Implementing effective online teaching practice: Voices of exemplary faculty. Innovative Higher Education, 31(2), 83-98. doi: 10.1007/s10755-006-9010-z

Papachristos, D.D., Alafodimos, N.N., Arvanitis, K.K., Vassilakis, K.K., Kalogiannakis, M.M., Kikillias, P.P., & Zafeiri, E.E. (2010). An educational model for asynchronous E-learning: A case study in higher technology education. International Journal of Advanced Corporate Learning, 3(1), 32-26. doi:10.3991/ijac.v3i1.987

Puzziferro, M., & Shelton, K. (2008). A Model for Developing High-Quality Online Courses: Integrating a System Approach with Learning Theory. Journal of Asynchronous Learning Networks, 12(3-4). Retrieved from http://files.eric.ed.gov/fulltext/EJ837519.pdf

Contact(s) for this Effective Practice
Effective Practice Contact: 
Lee Woodham Digiovanni
Email this contact: 
ldigiova@kennesaw.edu
Effective Practice Contact 2: 
eleeds@kennesaw.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Wendy Cowan
Author(s): 
Mark Gale
Institution(s) or Organization(s) Where EP Occurred: 
Athens State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Research in online student retention suggests that both time and relationships play a critical role in student persistence. Providing courses online does address convenience as it relates to student time constraints, but once inside the online classroom, it’s imperative that instructors find creative ways to deliver instruction that leads to student engagement. Students become more engaged when relationships are formed – with both the instructor and peers. Virtual classroom sessions, while seeming to be one solution for forming relationships, conflict with the convenience of taking an online class. To counteract this inconvenience, instructors teaching online and blended sections of the same course decided to create a learning community that offered multiple times and dates for virtual class sessions. The results have led to increased satisfaction and engagement for both students and faculty.

Description of the Effective Practice
Description of the Effective Practice: 

At most universities, some courses are offered each semester in multiple sections taught by multiple instructors. For example, English 101 may appear on the schedule in 10, 20, or even more sections, some of them offered in an online format.
In the Athens State College of Education, we offer three courses taken by all College of Education majors – Foundations of Education I, Foundations of Education II, and Technology and Media for Educators. Each semester we offer at least 10 sections of each of these courses, with over half of them in a blended or online format.
In an effort to help establish positive relationships in these online courses, we implemented weekly virtual classroom sessions, which is not a new idea. But because we have multiple instructors teaching sections of the same course, we went a step further and created a community calendar where each instructor posts the date, time, topic, and entry URL for his/her virtual classroom sessions (see supporting documents). Instructors are encouraged to schedule their sessions at different times and days throughout each week so that students have many options for attending live sessions.
Instructors conduct and record their virtual sessions at the scheduled time each week. Archived sessions are made available inside courses for students who are unable to attend the live sessions. Upon completion of each session, instructors ask for the names of any “visiting” students. Visiting students’ names are then sent to the other instructors so that those students receive credit for attendance.
Inside each course, students are provided with a link to the community calendar and are informed that they may attend any session(s) offered. Students who are unable to attend are required to watch and summarize archived sessions.
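A minimal sketch of how the shared calendar and cross-section attendance crediting described above could be represented is shown below. The field names, instructor labels, student names, and URLs are hypothetical placeholders, not data from the actual implementation (which used a Google Docs calendar).

```python
# Hypothetical sketch of the community-calendar workflow: instructors post
# sessions to a common list, and visiting students' names are routed back to
# their home instructors after each recorded session so attendance is credited.
from collections import defaultdict

# One row per virtual session on the community calendar (placeholder values).
sessions = [
    {"instructor": "Instructor A", "when": "Mon 7:00 pm",
     "topic": "Classroom management", "url": "https://example.edu/roomA"},
    {"instructor": "Instructor B", "when": "Thu 12:00 pm",
     "topic": "Assessment basics", "url": "https://example.edu/roomB"},
]

# Maps each student to the instructor whose section they are enrolled in.
enrollment = {"Student 1": "Instructor A", "Student 2": "Instructor B"}

def credit_visitors(host, attendees):
    """Group attendees by home instructor so visiting students get attendance credit."""
    report = defaultdict(list)
    for name in attendees:
        home = enrollment.get(name, host)
        if home != host:  # a "visiting" student from another section
            report[home].append(name)
    return dict(report)

# Instructor A hosts a session; Student 2 visits from Instructor B's section.
print(credit_visitors("Instructor A", ["Student 1", "Student 2"]))
# -> {'Instructor B': ['Student 2']}
```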

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

While the initial goal was to improve relationships within the course, the results have far exceeded our imaginings. Students reported that they not only appreciate the availability of the live sessions, but have also stated that the sessions help them feel like they are in a “real classroom.” Students have also reported that they appreciate the ability to choose session times/dates that best meet their needs. This evidence of effectiveness was expected (See supporting documents).
Evidence of effectiveness that was not anticipated concerns instructors' perceptions of teaching and learning effectiveness and their overall satisfaction. Prior to initiating the across-section virtual classroom sessions, instructors completed a pretest measuring faculty satisfaction. Upon completion of the first semester of implementation, instructors completed the posttest. A review of the data indicates that faculty were more satisfied with their role in the course following the semester of weekly virtual classroom sessions (see supporting documents).

How does this practice relate to pillars?: 

Learning effectiveness: Instructors felt more empowered as a result of this effective practice. Data suggests that instructors perceived an increased contribution to student learning. One hundred percent of instructors surveyed felt that students had a valuable learning experience due to the instructor’s role in the class. When comparing the amount of content able to be taught between an online/blended course and a traditional course, most instructors (73%) reported that they could now teach the same or more content. Seventy six percent of the instructors reported that they were more satisfied with their online/blended course the semester following the virtual classroom implementation. Instructors (85.7%) believed that the virtual classroom sessions improved student success (See supporting documents).
Faculty satisfaction: Posttest data from the virtual classroom implementation suggest that faculty are pleased with teaching online/blended courses. On the pretest survey, 86.7% of instructors reported that they were very satisfied teaching an online/blended course. Following the virtual classroom implementation, 76.9% of instructors reported being more satisfied than they were the previous semester. Considering that most instructors had already reported being very satisfied, this is a telling finding regarding faculty satisfaction (see supporting documents).
Student satisfaction: Student surveys indicate that students are satisfied with the availability of the virtual classroom sessions. Approximately 55% of students reported that the virtual classroom sessions were beneficial (See supporting document).

Equipment necessary to implement Effective Practice: 

The tools we used to implement this effective practice were Blackboard Collaborate and Blackboard Wimba, which are virtual classroom platforms. Google Docs was used for the community calendars.

Estimate the probable costs associated with this practice: 

While neither Wimba nor Collaborate is free, there are other virtual classroom options that are free or low cost. For example, Google Hangouts could be used for virtual classroom sessions.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Wendy Cowan
Email this contact: 
wendy.cowan@athens.edu
Effective Practice Contact 2: 
Mark Gale
Email contact 2: 
Mark.Gale@athens.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Deborah A. Raines, PhD, EdS, RN, ANEF
Institution(s) or Organization(s) Where EP Occurred: 
University at Buffalo: SUNY
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Online technologies have removed the barriers of time, cost, and location from the learning activities and experiences available to students. Virtual field trips allow learners to engage with and learn about authentic artifacts and to explore places important to their discipline of study and consistent with their individual learning needs. During a virtual field trip, students can be guided through museums, historical documents, national monuments, and agencies or organizations specific to the course content. A virtual field trip can also involve attending an artistic performance or connecting with a leader in the field of study. As a nurse educator, my students and I have visited the National Institute of Nursing Research, the National Library of Medicine, the grave sites of famous nurses, the National Patient Safety Foundation, and many other places to enhance understanding and application of course content. These field trips bring a real-world perspective to concepts discussed in course textbooks as well as provide a national and global perspective on the material being studied. In addition, learners often find tools and resources that are useful in their academic studies as well as in their professional practice.

The opportunities for learning on a virtual field trip are limited only by the creativity of the leader (the faculty member) and the engagement of the traveler (the learner). Once the destination is selected, the virtual field trip needs to be planned: goals and objectives, a guide for exploration, and specific outcomes and souvenirs to be gathered by the traveler and then shared with the entire class. A discussion board, wiki, or blog can be used to bring everyone together to share their individual journeys.

Description of the Effective Practice
Description of the Effective Practice: 

The virtual field trip enables learners to visit and explore destinations relevant to course concepts and aligned with course learning objectives. It brings real-life experience to the application and understanding of those concepts. I frequently use a virtual field trip at the beginning of a course to provide a “big picture” of the content we will be studying. While each virtual field trip is different, the following are general steps in creating one.
• Identify destinations consistent with the objectives of the course and the learning needs of the students. Everyone may visit the same destination, or a choice of destinations may be presented to the learner. Whether there is a single destination or a choice is determined by the desired learning outcome.
• The leader/instructor must visit each destination during the development process to identify which areas, activities, or resources the visitor (learner) will be directed to explore.
• Script an introductory statement to engage and interest the student in the activity. Include a map with the destination highlighted or a statement of the importance of the destination to the field of study.
• Provide a general focus, stating what the learner needs to achieve during the trip.
• Give the URL of the destination.
• Provide a guided tour with step-by-step instructions to get the visitor to the portion of the destination website that will facilitate achievement of the learning objective.
• Give clear details of what visitors need to look for, collect, or observe while at the destination.
• Encourage learners to gather souvenirs, pictures, and other memories/interesting findings from their trip to share with the class.
• Create an activity where each student shares and discusses their trip and what they learned with other students in the class. A discussion board, wiki, blog, or VoiceThread are all mechanisms for this sharing activity.
• Make it fun – include graphics and color in the field trip announcement.
• Consider sharing your own trip to the destination as an example for the students as well as to demonstrate your involvement and participation in the field trip.
An example of a virtual field trip used in a recent course is included as an attachment below; a minimal planning-template sketch also follows.
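As a rough illustration of the planning steps above, the Python sketch below models a field trip announcement as a small template. All values are hypothetical placeholders, and the structure is one possible way to organize the listed elements, not the author's actual format.

```python
# A minimal planning-template sketch for a virtual field trip. Every value here
# is a hypothetical placeholder; only the element names follow the steps above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VirtualFieldTrip:
    destination: str
    url: str
    introduction: str                 # scripted statement that frames the visit
    focus: str                        # what the learner needs to achieve
    tour_steps: List[str] = field(default_factory=list)  # step-by-step directions
    collect: List[str] = field(default_factory=list)     # souvenirs/observations to bring back
    sharing_venue: str = "course discussion board"

    def announcement(self) -> str:
        steps = "\n".join(f"  {i}. {s}" for i, s in enumerate(self.tour_steps, 1))
        items = ", ".join(self.collect)
        return (f"Virtual field trip: {self.destination}\n{self.introduction}\n"
                f"Focus: {self.focus}\nStart here: {self.url}\n"
                f"Guided tour:\n{steps}\nBring back: {items}\n"
                f"Share your trip in the {self.sharing_venue}.")

trip = VirtualFieldTrip(
    destination="National Library of Medicine (example destination)",
    url="https://www.nlm.nih.gov",
    introduction="Why this destination matters to our course...",
    focus="Locate one resource you could use in your own practice.",
    tour_steps=["Open the exhibitions page",
                "Choose one exhibit related to nursing history"],
    collect=["one document or image", "one question for the class"],
)
print(trip.announcement())
```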

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Student response to the virtual field trip activity has been overwhelmingly positive. In the postings sharing experiences and learning from the virtual field trip, it is evident that many students really explored their destination and accessed documents or viewed items relevant to their individual nursing practice settings. In each group, a couple of students have actually reached out to ask questions or seek additional information from staff at the site being visited. This student-initiated activity is evidence of their engagement and of the effectiveness of the virtual field trip in encouraging them to seek knowledge to fulfill their learning needs. Students share photos from their destination and links to toolboxes or white papers, and some students even visit the gift shop and share virtual gifts with other class members.

Over the past two years, I have used a total of five virtual field trips in five different courses. As part of the end-of-course survey, students have been asked to rate the different learning activities used in the course. The aggregate rating for the virtual field trip activity, based on these five courses, is 94% excellent and 6% very good. A total of 148 students responded to the end-of-course surveys, and no one has rated the field trip activity as anything other than excellent or very good. Qualitative comments about the virtual field trip activity include remarks such as “fun”, “great opportunity to learn about places I will probably never have the chance to visit in person”, “I loved visiting the AHCQR”, “I got great resources that I now use at work during the virtual field trip”, and “consider adding more field trips…it was great!”

Finally, a group of colleagues and I conducted a program evaluation study to explore whether students’ attitudes and knowledge about patient safety changed as a result of the course, which included a virtual field trip. The research showed a significant increase in knowledge and attitude as measured before and after the course. A copy of the poster presenting the research findings, delivered at the AACN (American Association of Colleges of Nursing) conference in November 2013, is included as an attachment below.

How does this practice relate to pillars?: 

Learning effectiveness: The virtual field trip capitalizes on one of the greatest advantages of online learning technologies: the lack of barriers to exploration. Visiting locations outside the learner’s geographic home without the expense of plane tickets, hotels, and time away from family and work commitments is not feasible in the traditional classroom. Designing virtual field trips that acknowledge the unique characteristics and interests of each learner, and that grant the learner access to resources useful in the course and beyond, culminates in a learning activity that is effective in achieving course objectives, promotes disciplinary socialization, and is satisfying and enjoyable for the learner.

Student satisfaction: Student comments related to the virtual field trip activity are overwhelmingly positive. Students appreciate the change from reading text documents, viewing videos, and responding to discussion probes. They also enjoy the freedom to explore aspects of a destination specific to their individual interests and career goals. For example, at the National Institute of Nursing Research or the National Patient Safety Foundation, a nurse practicing in pediatrics will explore different documents and tools than a nurse who practices in hospice or in an adult cardiology setting. Over time, it has been observed that a number of students revisit the destination of their virtual field trip and reference documents or use tools from the site in assignments or projects later in the course.

Equipment necessary to implement Effective Practice: 

The virtual field trip can be designed and implemented with no special software or equipment. All that is needed is access to the internet, the web-address/URL of the place to be visited, a discussion board, wiki or blog where students can share their travel experience and findings and some creativity to structure a fun and educational activity.

Estimate the probable costs associated with this practice: 

Since most universities have access to the internet for students and faculty, the start-up costs are minimal. The major investment is the time of the faculty in developing the learning activity and exploring the proposed destination to assure that the learning objectives are achievable.

References, supporting documents: 

Supporting document attached:
--Virtual_Field_trip.PDF -- an example of a virtual field trip.
--AACN Poster2.PDF -- findings of a program evaluation study related to changes in attitude and knowledge following a course that included a virtual field trip.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Deborah A. Raines
Email this contact: 
draines@buffalo.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Rick Lumadue, PhD
Author(s): 
Rusty Waller, PhD
Institution(s) or Organization(s) Where EP Occurred: 
Texas A&M University-Commerce
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Programmatic student-learning outcomes of an online master’s degree program at a regional University in Texas were assessed in this study. An innovative use of emerging technology provided a platform for this study. The Astin Model provided the framework for the evaluation. This study has provided a model for conducting well-informed, instructional and programmatic assessments of student-learning outcomes. The results of this study demonstrated that emerging technology can provide a platform for students to both showcase and preserve their ability to meet programmatic student-learning outcomes.

Description of the Effective Practice
Description of the Effective Practice: 

This online master’s degree program is taught using a fully interactive online format in a primarily asynchronous delivery model. Asynchronous activities used in the program included: threaded discussion, video and audio presentations, written lecture linked to video and audio presentations embedded into the course management system, Voicethreads, faculty developed MERLOT web pages created using the MERLOT Content Builder, e-Textbooks, etc.
The Astin Model (1993) provided a framework for this assessment. In the Astin Model, quality education not only reaches established benchmarks but also is founded upon the ability to transition students from where they are to reach intended competencies. An innovative use of MERLOT Content Builder combined with emerging technology provided a means for assessing the seven student-learning outcomes in an online master’s program at a regional university in Texas.
Two full-time faculty and one adjunct faculty used rubrics to evaluate each of the programmatic student-learning outcomes by assessing a random sample of student assignments from courses.
The goal of this study was to help students reach the intended learning outcomes for metacognition, digital fluency, communication, cultural fluency, global fluency, servant leadership, and commitment to life-long learning. Definitions of these learning outcomes are provided here. Students will evidence metacognition by demonstrating the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading. Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations. Students will be able to communicate ideas and content to actively engage participants. Students will evidence understanding of generational and cultural learning styles. Students will develop instructional materials appropriate for a global perspective. Students will practice the principles of servant leadership as espoused by Robert Greenleaf in his work The Servant as Leader (Greenleaf, 2008). According to Greenleaf, “The servant-leader is servant first. It begins with the natural feeling that one wants to serve first. Then conscious choice brings one to aspire to lead.” Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.
Digital education presents many challenges. Barnett-Queen, Blair, and Merrick (2005) identified perceived strengths and weaknesses of online discussion groups and subsequent instructional activities. Programmatic assessment is required for all institutions accredited by the Council for Higher Education Accreditation or the US Department of Education. Walvoord (2003) indicated that good assessment should focus on maximizing student performance. The following questions rise to the forefront: (1) Have graduates mastered programmatic expectations? (2) What relationships exist between student performance and other factors? (3) How can faculty improve the program based upon the analysis of student performance? Walvoord further stresses the importance of direct assessment in determining student performance. Indirect measures may provide evidence of student-learning, but direct assessment is widely viewed as more valid and reliable.
Brandon, Young, Shavelson, Jones, Ayala, Ruiz-Primo, and Yin (2008) developed a model for embedded formative assessment. The model was collaborative and stressed embedded assessment. Their study stressed the difficulties associated with broad-based collaboration given the difficulties of formally identifying partners and spanning large geographic distances. Price and Randall (2008) demonstrated the importance of embedded direct assessment in lieu of indirect assessment. Their research revealed a lack of correlational fit between indirect and direct assessment of the same aspect of student-learning with the same course in a pre- and post-test design. They documented a difference between student perceived knowledge and actual knowledge. These findings further underscore the importance of direct assessment of student-learning. Walvoord’s (2003) findings further indicated the need for embedded direct assessment of student-learning owned and supported by those who will implement the change. Those implementing change would include program faculty and students.
Gardner (2007) found that education has long wrestled with defining and assessing life-long learning. Though loosely defined as the continued educational growth of the individual, lifelong learning is rapidly rising to the forefront of 21st century education to assume a more prominent place than that held in the 20th century. Brooner (2002) described the difficulty of assessing the intention to pursue learning beyond the completion of a program. Intention and subsequent performance are affected by many different factors including, but not limited to, normative beliefs and motivation. Educational programs have often been encouraged to avoid assessment of behavior beyond the point of graduation, as such behavior has been viewed as beyond the control of program educators (Walvoord, 2003). The question arises as to the importance of future behavior as an indicator of current learning.
Astin (1993) pointed out that educators are inclined to avoid assessment of the affective domain, viewing it as too value laden. Accordingly, the cognitive domain became the de facto assessment area, though affective assessment more closely paralleled the stated aims and goals of most institutions of higher education. The avoidance of assessment in the affective domain is well documented by Astin. The advent of social media tools coupled with e-portfolios offers some intriguing possibilities in regard to assessment in the affective behavioral domain. Astin pointed out that a change in the affective domain should translate into changed behavior.
Secolsky and Wentland (2010) found many advantages to portfolio assessment that transcend regular assessment practices by providing a glimpse into non-structured behavioral activities. Behavior beyond the classroom can be captured and documented within a properly designed portfolio. Behavior that has not been directly observed by the teacher can be measured in light of portfolio submissions via a broad collection of relevant and targeted information. Established performance criterion can be assessed to measure student-learning and determine specific areas for programmatic improvement. Though Secolsky and Wentland point out that reliability and validity concerns still exist with portfolio measurement, they concur that portfolio assessment potentially gauges authentic student performance outside the educational environment. With the development of a portfolio transportable beyond program enrollment and across the life experience the opportunity exists to assess the impact of the instructional experience upon real time student performance. Evaluation of life-long portfolios promises to provide meaningful insight into the real life impact of the educational experience. Astin (1993) viewed changed behavior over time as the real evidence of affective enlightenment.
An interesting finding from this study was the creative manner in which some of the students layered or nested other web 2.0 technologies into their MERLOT web pages. Examples of layering or nesting included embedded student developed Voicethread presentations, embedded open-ended discussion Voicethreads used to promote participation and feedback, embedded YouTube Videos, embedded Prezis and the like.
The integration of MERLOT GRAPE Camp peer review training into this Master Degree Program has provided an additional platform for further research to be conducted relative to the assessment of all seven of the programmatic learning outcomes of the program. For example, metacognition may be assessed as it relates to MERLOT’S peer-reviewers serving as content expert in assessing materials that pertain to one’s field. Communication may be assessed through interaction with peers and peer-reviews. Digital fluency is obviously what is required to contribute to MERLOT. Cultural Fluency may be demonstrated through peer reviewing submissions of MERLOT’s international community of partners. Global Fluency may be measured through the development and contribution of appropriate content for use in a global community of learners. Servant Leadership is the motto of MERLOT, “Give a Gift not a Burden!” (Gerry Hanley, 2010). Finally, the development of students into lifelong learners will help to establish the identity of the program. Student performance outside of the program is one of the best measures of student-learning and the MERLOT Content Builder along with MERLOT peer-reviews is a tremendous platform for measuring student-learning outcomes.
Lifelong learning may be assessed through current and former students’ contributions of materials to MERLOT and through their peer reviews of materials contributed to MERLOT. As a benefit of being a MERLOT partner, the dashboard report provides information on contributions made by members of the partner organization. Contributions and/or peer reviews completed by students who have graduated from the program will be recorded in the dashboard report, making it a tremendous tool for measuring commitment to lifelong learning. Ultimately, this study has demonstrated that the MERLOT platform combined with emerging technology is integral in assessing student-learning outcomes in an online master’s program at a regional university in Texas. Other online degree programs should seriously consider the MERLOT Content Builder’s potential to help them assess student-learning outcomes.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The Online Master of Science in Global eLearning equips specialists in education for practice in public education, private education, business, industry, and non-profit organizations. Learning and technology are intertwined as we develop the next generation of enhanced training, development, and teaching to engage learners with key components of instructional technology. Technology provides access to all forms of education and this program will teach educators how to implement technology across curricula and classrooms of all kinds. With a blend of theory and technical skills, this program will prepare teachers and corporate trainers alike.

Metacognition – Students will demonstrate the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
5 journal entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the fulltime bachelor’s and master’s faculty using the Global eLearning Metacognition rubric. Scores will be deemed acceptable at an average of 4.0 or higher on a 5 point scale in each of the areas of context & meaning, personal response, personal reflection, and interpretive skills.

The assessment was conducted by two fulltime and one external faculty on March 6, 2013. The external was added to strengthen the review. Results were as follows:

Context & Meaning 4.27
Personal Response 4.13
Personal Reflection 4.40
Interpretive Skills 4.47

All standards were met.
Though all standards were met, the faculty noted that the personal response section scored the lowest at 4.13. Accordingly, the course EDUC 595 Research Methodology was expanded to include more opportunities for students to provide self- and peer-evaluation feedback on projects and assignments. Two assessments were recommended for AY 2013-2014: one course will be assessed in the fall and one in the spring.
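As an illustration of the scoring rule used throughout these assessments, the sketch below averages three reviewers' scores for each rubric area and checks each average against the stated threshold. The individual reviewer scores are hypothetical (only the averaged results are reported above), and the layout is illustrative rather than the program's actual rubric workflow.

```python
# Illustrative only: average each rubric area across reviewers and compare the
# result with the acceptability threshold (4.0 on a 5-point scale, per the text).
# Individual reviewer scores below are invented to reproduce the reported means.
THRESHOLD = 4.0

# scores[area] = [reviewer 1, reviewer 2, reviewer 3]
scores = {
    "Context & Meaning":   [4.2, 4.4, 4.2],
    "Personal Response":   [4.0, 4.2, 4.2],
    "Personal Reflection": [4.4, 4.4, 4.4],
    "Interpretive Skills": [4.4, 4.6, 4.4],
}

for area, ratings in scores.items():
    avg = sum(ratings) / len(ratings)
    verdict = "met" if avg >= THRESHOLD else "NOT met"
    print(f"{area:<22}{avg:.2f}  standard {verdict}")
```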

Communication – Students will communicate ideas and content to actively engage participants.
5 student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the fulltime bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 42 on a 50 point scale in each of the five areas of purpose, organization, content, language, and voice & tone. The assessment was conducted by two fulltime and one external faculty on March 6, 2013. The external was added to strengthen the review. Results were as follows:

Purpose 45.33
Organization 46.67
Content 46.00
Language 44.00
Voice & Tone 44.67
Technology 45.33

All standards were met. Though all standards were met, the faculty noted that Language scored the lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 515 Intercultural Education to give students an opportunity to develop their language skills on a project designed to heighten sensitivity to language that might be offensive in other cultures.

Two assessments were recommended for AY 2013-2014.

Digital Fluency - Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
5 student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the fulltime bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 45 on a 50 point scale in the area of technology.

The assessment was conducted by two fulltime and one external faculty on March 6, 2013. The external was added to strengthen the review. Results were as follows:

Technology 45.33

The standard was met.
The faculty noted that the students tended to use more familiar software and avoid the utilization of emerging software. Accordingly, EDUC 510 Utilizing Effective Instructional Technology was modified to include requirements for the utilization of at least one Web 2.0 software program to complete an assignment.

The faculty will conduct two evaluations in AY 2013-2014.

Cultural Fluency – Students will evidence understanding of generational and cultural learning styles.

5 student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Cultural Fluency Rubric by the fulltime bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 3.0 on a 4 point scale in the areas of knowledge & comprehension, analysis & synthesis, and evaluation.

The assessment was conducted by two fulltime and one external faculty on March 6, 2013. The external was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 3.53
Analysis & Synthesis 3.07
Evaluation 3.67

The standard was met. The faculty noted that analysis and synthesis scored lowest. Accordingly the curriculum for EDUC 552 Global Fluency was expanded to include group projects on the education system of other cultures.

The faculty will also conduct two evaluations in AY 2013-2014.

Global Fluency – Students will develop instructional materials appropriate for a global perspective.

5 group project entries will be selected at random from a course offered in Summer 2012. These will be evaluated by the fulltime bachelor’s and master’s faculty using the Global eLearning Global Fluency Rubric. Scores will be deemed acceptable at an average of 2.8 or higher on a 4 point scale in each of the areas of knowledge & comprehension, application, and evaluation.

The assessment was conducted by two fulltime and one external faculty on July 22, 2013. The external was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 2.87
Application 3.00
Evaluation 2.87

The standards were met.

Faculty found student performance in this area to be adequate. Some challenges were noted in the use of stereotypes when identifying people from other cultures; for example, one student's comment relied on such a stereotype. EDUC 515 Intercultural Education will be expanded to include a project in which students interview someone from a different culture to discover differing worldviews and share their findings in a forum with classmates.

Servant Leadership – Students will practice the principles of servant leadership as espoused by Robert Greenleaf.

5 student group project self-assessment packets will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Servant Leadership Rubric by the fulltime bachelor’s and master’s faculty. Scores will be deemed acceptable with an average of 40 on a 50 point scale in each of the five areas of purpose, organization, content, language, and voice & tone.

The assessment was conducted by two fulltime and one external faculty on July 22, 2013. The external was added to strengthen the review. Results were as follows:

Servant Leadership 41.33
Strategic Insight & Agility 39.33
Building Effective Teams & Communities 44.00
Ethical Formation & Decision Making 43.33

The standard was NOT met for Strategic Insight & Agility.

Faculty noted problems with the effectiveness of feedback in the peer-evaluation assignment. Accordingly, the group peer assessment process has been expanded to include MERLOT GRAPE Camp, which provides training on conducting peer evaluations. All students will be required to complete MERLOT GRAPE Camp training. These changes will be enacted in all new course sections.

Commitment to Life-Long Learning – Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials. 5 portfolio entries will be selected at random from a course in Fall 2012. These will be evaluated by the fulltime bachelor’s and master’s faculty using the Global eLearning Commitment to Life-long Learning rubric. Scores will be deemed acceptable at an average of 3.0 or higher on a 4 point scale in each of the areas of production of educational materials, publications, presentations, including personal response, personal evaluation, and interpretive skills.
The assessment was conducted by two fulltime and one external faculty on July 22, 2013. The external was added to strengthen the review. Results were as follows:

MERLOT Web Pages 3.4
Presentations 3.8
Peer Evaluations 3.60

The standard was met. Though all standards were met, the faculty noted that MERLOT Web pages scored the lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 528 Intro. to Presentation Design to make the MERLOT Web page a requirement rather than an option.

Two assessments were recommended for AY 2013-2014.

How does this practice relate to pillars?: 

1) Leveraging MERLOT Content Builder with emerging technology to assess programmatic student learning outcomes is scalable because it encourages more online instructors and instructional designers to consider integrating this model to measure the effectiveness of assignments in meeting the goals for Institutional effectiveness planning.

2) Increases access by providing open access using MERLOT’S Content Builder combined with emerging technology to showcase learning outcomes for students and faculty to assess regardless of location as long as they have an internet connection.

3) Improves faculty satisfaction by providing faculty with open access to evaluate student assignments to assess programmatic student learning outcomes for Institutional effectiveness planning.
Since this model was used to complete a recent Institutional Effectiveness Plan for an online master’s degree program in preparation for a regional accreditation visit, other instructors can easily replicate it to evaluate their own programs.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by empirical data from assessments of random samples of student assignments.

5) Promotes student satisfaction by providing valuable opportunities for interaction with their instructor and other students. Students work together on group projects for both synchronous and asynchronous presentations. Students are also assigned group and individual projects to evaluate the work of their peers and provide feedback. Rubrics are embedded in the grade book of the LMS to evaluate student assignments. Also, an evaluation tool of the programmatic student-learning outcome that is tied to the assignment is also included in the grade book to assess the level of student understanding. Students regularly comment about how valuable these practices are to their learning experience.

Equipment necessary to implement Effective Practice: 

The only strictly necessary components are an internet connection and an LMS. In our program, students also used Camtasia, QuickTime, and Captivate to create videos for some of their individual projects. Group projects were completed using Google+ Hangouts, Skype, VoiceThread, and Adobe Connect. Students also created MERLOT web pages, MDL 2 courses, and digital portfolios.

Some of the tools we used have costs associated with them. Here is a list of some of them:

• Synchronous tools: Adobe Connect, Google Hangouts, Google chats, Skype
• Asynchronous tools: Voicethread, MERLOT Content Builder, Prezi, MERLOT GRAPE Camp, Peer Review Workshop and Discussion Forums in LMS
• Reflective tools: Journals, self-assessments, and digital portfolios

Estimate the probable costs associated with this practice: 

The only additional cost would be optional and would involve the use of some emerging technologies that are not open source. All other resources used in this project were open source and we did not incur additional costs using them. There was essentially no budget for this project.

References, supporting documents: 

Astin, A. (1993). Assessment for Excellence. Westport, CT: Oryx Press.

Barnett-Queen, T., Blair, R., & Merrick, M. (2005). Student perspectives of online discussions: Strengths and weaknesses. Journal of Technology in Human Services, 23(3/4), 229-244.

Brandon, P., Young, D., Shavelson, R., Jones, R., Ayala, C., Ruiz-Primo, M., & Yin, Y. (2008). Lessons learned from the process of curriculum developers’ and assessment developers’ collaboration on the development of embedded formative assessments. Applied Measurement in Education, 21, 390-402.

Gardner, P. (2007). The ‘life-long draught’: From learning to teaching and back. History of Education, 36(4-5), 465-482.

Greenleaf, R. A. (2008). The Servant as Leader. Westfield, IN: The Greenleaf Center for Servant Leadership.

Price, B., & Randall, C. (2008). Assessing learning outcomes in quantitative courses: Using embedded questions for direct assessment. Journal of Education for Business, 83(5), 288-294.

Secolsky, C., & Wentland, E. (2010). Differential effect of topic: Implications for portfolio assessment. Assessment Update, 22(1), Wilmington, DE: Wiley Periodicals.

Walvoord, B. (2003). Assessment in accelerated programs: A practical guide. New Directions for Adult & Continuing Education, 97, 39-50.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Rick Lumadue
Email this contact: 
proflumadue@gmail.com
Effective Practice Contact 2: 
Rusty Waller
Email contact 2: 
rusty.waller@tamuc.edu
Author Information
Author(s): 
Owen P. Hall, Jr.
Institution(s) or Organization(s) Where EP Occurred: 
Graziadio School of Business and Management
Institution(s) or Organization(s) Where EP Occurred: 
Pepperdine University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The role of MOOCs in management education is growing rapidly. These developments are part of the educational communities’ response to growing tuitions, budget constraints, limited class availability and globalization. This effective practice has been used to improve both the content and delivery of business statistics in a variety of management educational settings (e.g., undergraduate, graduate and executive programs).

Description of the Effective Practice
Description of the Effective Practice: 

Providing world-class management education in today’s global environment is an ongoing challenge. By any metric or standard, management education has reached a seminal point in its brief 100-plus-year history. In response, one approach being adopted by many business schools is to engage faculty and students in a virtual learning experience. These developments are part of the educational community’s reaction to rising tuition, budget constraints, limited class availability, and globalization. Basically, a MOOC, a recent development in distance learning, is a web-based course designed to provide “free” intellectual content on a very wide scale. A statistics fundamentals MOOC was developed and has been in use for several years in several capacities, including bootcamps and ongoing refresher courses.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

1) Students found themselves generally more engaged with broad bandwidth applications without technological interference or blockage
2) Students enjoyed the flexibility and convenience offered by the MOOC
3) Students noted the exam to be the most challenging activity; this is not an unexpected finding given the lack of control, i.e., the students had a relatively fixed amount of time to complete the exams
4) Faculty found that students who engaged with the MOOC tended to outperform those who did not access it
5) Faculty were able to monitor student learning patterns and achievements on a weekly basis

How does this practice relate to pillars?: 

Today, many students engaged in a program of management education require more flexibility because of workplace demands. MOOCs provide a vehicle for meeting these requirements (learning effectiveness). Two key challenges associated with these types of delivery systems, which are growing at a rate five times that of traditional residential programs, are consistency and compatibility. MOOCs are also highly scalable, which further enhances their contribution to management education.

Equipment necessary to implement Effective Practice: 

Web

Estimate the probable costs associated with this practice: 

Many MOOCs are presently free. The MOOC outlined in this effective practice is free.

References, supporting documents: 

Bernhard, W., et al. 2013. The MOOCs business model. Procedia - Social and Behavioral Sciences, 90, 200.
Breslow, L. 2013. Studying learning in the worldwide classroom: Research into edX’s first MOOC. Research & Practice in Assessment, 8(summer), 13.
Dellarocas, C.; Alstyne, M. 2013. Economic and business dimensions: Money models for MOOCs. Communications of the ACM, 56(8), 25.
Liyanagunawardena, T.; Adams, A.; Williams, S. 2013. MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distance Learning, 14(3), 202.
Mackness, J., et al. 2013. Learning in a small, task-oriented, connectivist MOOC: Pedagogical issues and implications for higher education. The International Review of Research in Open and Distance Learning, 14(4), 202.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Owen P. Hall, Jr.
Email this contact: 
ohall@pepperdine.edu
Author Information
Author(s): 
Stacy Southerland, PhD
Institution(s) or Organization(s) Where EP Occurred: 
University of Central Oklahoma
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

This effective practice was developed primarily to address an academic department’s need to provide tutoring for students in entry-level online and self-paced online Spanish courses comparable to the campus-based tutoring it had long provided for students enrolled in face-to-face sections of the same courses. These support services needed to offer the same any time, any place accessibility as the online courses in which the students were enrolled. Another goal of this effective practice was to create a space, a virtual learning community, where any student enrolled in an online or face-to-face section of the university’s beginning level Spanish courses could engage in collaborative learning and explore personalized, informal learning pathways to complement and support their formal, classroom-based learning experiences. Although faculty for the online Spanish courses held online office hours and provided access to supplemental instructional resources, they found that their students still reported a need for additional tutoring just as face-to-face students did. However, online students could rarely make use of on-campus tutoring, and the department did not offer online assistance even though online Elementary Spanish I and II offerings had increased from 1 section to 13 in a few short years and comprised approximately 35% of the department’s first-year course offerings as of fall 2013. Given that online students have the same or greater need for tutoring and fewer opportunities to engage in collaborative learning than their on-campus peers, it was clear that tutoring options for eLearning students were essential and that an effective strategy had to be developed to meet that need. At the same time, traditional students have come to expect and need the same flexibility and access to any time, any place learning support services as online students. Therefore, the idea of creating a community comprising every student enrolled in an Elementary I or II Spanish course at the university, whether online or face-to-face, was implemented with the goal of providing increased engagement for all learners and faculty in addition to online tutoring services. To meet this need, a “course” site to house diverse tutorial materials, discussion boards, and chat rooms was created using the university’s learning management system, and approximately 900 students were enrolled in it.

Description of the Effective Practice
Description of the Effective Practice: 

The need for online tutoring services for students enrolled in online Spanish courses was a demonstrated need that was not being met at the institution where this effective practice was developed. This practice was developed by a faculty member with a background in Spanish language instruction and curriculum development and in designing, teaching, and coordinating online Spanish courses. The focus of this project was not only to provide online tutoring that both online and face-to-face students could benefit from, but also to create a community of learning to promote increased student and faculty engagement and foster informal learning experiences that would complement and support the department’s formal Spanish curriculum. An online community, a “course”, was developed to offer diverse, reliable resources for students to choose from when exploring content relevant to them as they forged their own learning pathways. Among the features the center offered were synchronous tutoring with a department tutor or faculty member through chat or online rooms, discussion boards on a variety of course content topics, print-based and multimedia grammar tutorials, and links to Internet-based tutorials and sites for exploring the Hispanic world. Some faculty members contributed time to working with students in the site, which allowed students to benefit from the expertise and perspective of multiple faculty members. Students could also chat with each other, help each other with questions about course content, assignments, and technology, suggest useful internet sites, and more. In this virtual community, learning could meet students where they were in terms of their skill level and provide them with the information they needed when they needed it.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

This program was developed as a response to the need to provide tutoring to online and self-paced online students in Elementary Spanish I and II in a university setting. Second language acquisition is challenging for adult learners and the pace of university-level courses renders it even more so with the result being that students often report that Modern Language courses are among the most challenging they take. The online setting with its increased time commitment and need for effective supplemental study materials further increases the need for tutoring assistance. Online faculty reported the need for tutoring services for their students who often could not travel to campus for tutoring, which their course fees supported, and online students also expressed an interest in tutoring. The unofficial pilot of this practice--launched with limited promotion and no faculty compensation beyond being able to count it toward contributions in the area of teaching in faculty reviews--took place in the spring of 2013. Almost 900 students were enrolled in the community and approximately 15% of those students used its resources. The official pilot was approved for the summer and fall of 2013 and is currently underway with additional support and features. The faculty member that developed the site received teaching credit for managing it, increased participation was seen by campus-based and online faculty who volunteered as tutors, Intermediate Spanish I & II students were included in the community, and on-demand mobile tutoring was added to supplement regular tutor hours in the community’s chat and online rooms. The faculty member who developed the site is providing feedback on the time commitment required to maintain the project and on site participation in order to inform administration about the logistics of applying this effective practice in other departments as a way to improve retention, recruitment and learning outcomes. In addition, the practice was a central component of a portfolio that was recently recognized with an award for exceptional initiative in support of community and leadership awarded by the university president. An overview of the concept and considerations for its development and implementation have been shared at both university-wide and national venues such as university faculty enhancement sessions and a national eLearning conference hosted by the institution. The practice has also been submitted for an upcoming D2L Ignite regional conference. Presenting this effective practice will increase awareness of one of its primary advantages, which is the ability to replicate it with minimal required resources in other departments and institutions.

How does this practice relate to pillars?: 

This effective practice has Learning Effectiveness as it relates to community building and communications at its core, but also relates to the pillars of Student and Faculty Satisfaction, Scale, and Access. At the institution where it was developed, this practice created a virtual community for collaborative learning experiences and facilitated connections and interaction between online and on-campus students and faculty. Students and instructors could participate actively in tutoring and discussion boards or vicariously by browsing information shared publicly by others who posted and responded to questions and participated in chats that were archived for access at any time by members of the learning community. The content presented diverse resources for students to select from based on their learning style, background in the discipline, current study needs, and personal interests. Tutorials and other resources related to the formal curriculum were selected by a faculty expert with knowledge of that program. Students could then choose resources based on their individual needs to create personalized informal learning pathways to complement and support their formal learning experiences and improve their learning outcomes. Both students and faculty benefitted from increased opportunities for engagement with content and other learners, and all students benefitted from access to the online tutoring services. Faculty teaching experiences are also enhanced as they can draw on the contributions of a network of faculty to assist students. Many campus faculty lack the expertise to design and deliver online-based learning experiences even though their students would benefit from eLearning-enhanced learning and the advantages of flexibility and accessibility that eLearning concepts offer. With this effective practice, faculty could simply direct their students to the resource that others built, and they could also contribute to the site without the need for extensive knowledge of technology by posting replies to discussions or participating in chat rooms, concepts with which most faculty were familiar.
This practice also addresses the pillar of Scale because it was developed using systems already in place at the university, making it very cost-effective. Moreover, all components, resources, and training (for tutors) could be delivered or accessed online at one's convenience, reducing both the expense of facilities for housing tutor offices and the demands on faculty and staff time for training. Course fees to support tutoring services were already in place at the institution but had not previously been used to provide online tutoring services for this department.
By facilitating online, customized, informal learning experiences that support the formal course curriculum, this practice addresses Access through the anytime, anywhere, and timely availability of study resources. Resources were continually evaluated and new ones added to ensure that they reflected learner interests and needs; these improvements and additions were based on input from students and faculty about their needs in formal teaching and learning settings. The many resources housed in the virtual learning community addressed diverse learning styles and included both asynchronous and synchronous options, further expanding learner access. In addition, student surveys and comparisons of learning outcomes for students who make use of the site and those who do not could assist institutions in future efforts to assess student learning; a sketch of one such comparison follows.
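The outcome comparison mentioned above could be run in many ways; the following is a minimal sketch, assuming a hypothetical spreadsheet with each student's final course grade and a flag indicating whether the student used the tutoring community. The file name, column names, and statistical test (Welch's t-test) are illustrative assumptions, not part of the original practice.

```python
# Minimal sketch: compare final grades of students who used the tutoring
# community with those who did not. All file and column names are
# hypothetical; adapt them to whatever export your registrar or LMS provides.
import pandas as pd
from scipy import stats

# Hypothetical export: one row per student, with a 0-100 final grade and a
# yes/no flag for whether the student accessed the virtual learning community.
records = pd.read_csv("spanish_grades.csv")  # columns: student_id, final_grade, used_community

users = records.loc[records["used_community"] == "yes", "final_grade"]
non_users = records.loc[records["used_community"] == "no", "final_grade"]

# Welch's t-test (does not assume equal variances between the two groups).
t_stat, p_value = stats.ttest_ind(users, non_users, equal_var=False)

print(f"Community users: n={len(users)}, mean grade={users.mean():.1f}")
print(f"Non-users:       n={len(non_users)}, mean grade={non_users.mean():.1f}")
print(f"Welch's t-test:  t={t_stat:.2f}, p={p_value:.4f}")
```

A significant difference alone would not show that the community caused the improvement, since students who seek tutoring may differ from those who do not, but this kind of comparison, alongside the usage surveys, would give administrators concrete evidence when deciding whether to extend the practice to other departments.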

Equipment necessary to implement Effective Practice: 

This practice was implemented by creating a portal, much like an online course site, in the university's learning management system, Desire2Learn. Qualtrics, an online survey delivery system already in use at the university, was used to create student and faculty surveys to assess usage and effectiveness.
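For institutions replicating the practice, survey data can also be located programmatically rather than through the Qualtrics web interface. The sketch below simply lists the surveys available to an account via the Qualtrics REST API (v3); the data center ID and API token are placeholders, and the example is an illustration rather than part of the original implementation.

```python
# Minimal sketch: list the Qualtrics surveys available to an account so that
# the usage and effectiveness surveys for the tutoring community can be found
# programmatically. The data center ID and API token below are placeholders.
import requests

DATACENTER = "co1"            # placeholder; use your institution's Qualtrics data center
API_TOKEN = "YOUR_API_TOKEN"  # placeholder; generated in Qualtrics account settings

url = f"https://{DATACENTER}.qualtrics.com/API/v3/surveys"
response = requests.get(url, headers={"X-API-TOKEN": API_TOKEN})
response.raise_for_status()

for survey in response.json()["result"]["elements"]:
    print(survey["id"], "-", survey["name"])
```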

Estimate the probable costs associated with this practice: 

This effective practice can be developed and maintained at little or no cost beyond what many institutions already spend to meet their technology infrastructure needs. The primary potential expenses are the learning management system used as a portal for the center, the survey system used for assessment, and faculty and student tutor wages. Many institutions already have a learning management system that can serve as the portal, and many already license a survey system like the one used here to assess site effectiveness; alternatively, a learning management system's built-in survey tool or a free survey program could serve the same purposes at no additional cost.

Faculty and student tutor wages are therefore the main potential added expense, and even those can be reduced or eliminated through creative solutions. For example, the faculty member who conceived of this community of learning, designed its layout, and maintained and facilitated it during its first semester donated time to the project but was able to count the activity as evidence of required service to the department. Student tutor wages can be reduced or eliminated by replacing or supplementing student tutors with the faculty chat options that already form part of the community. Faculty engagement in discussions and chat, intended to encourage student-faculty interaction and to create a larger community of contributors, was piloted in a summer session and was found to be sufficient to cover the site's live tutoring needs; those faculty members can likewise count the volunteered time as service for review purposes. No-cost tutoring can also be provided through a practicum in which a student serves as a tutor and receives course credit rather than hourly wages. Faculty who supervise practicums are often not paid additional wages either, but they can use the activity as evidence of teaching effectiveness and engagement, student mentoring, or service.

When an institution does prefer to pay faculty and tutors, this institution found the expense, one hour of teaching credit for the faculty member and an hourly wage for the tutor, to be minimal and worth the investment in strategies that promote student success and transformative learning experiences. In addition, many institutions have provisions for using a portion of student fees to provide tutoring services.

References, supporting documents: 

•Slideshow tour of the virtual learning community: https://docs.google.com/file/d/0B0KLBJtR5kVNczF1d1phWXEtR2c/edit?usp=sha...

•Student site usage survey instrument (URL): https://uco.co1.qualtrics.com/SE/?SID=SV_51Dz93VweYW2cyp

Contact(s) for this Effective Practice
Effective Practice Contact: 
Stacy Southerland, PhD
Email this contact: 
ssoutherland@uco.edu