Faculty Satisfaction

FACULTY SATISFACTION means that instructors find the online teaching experience personally rewarding and professionally beneficial. Personal factors contributing to faculty satisfaction with the online experience include opportunities to extend interactive learning communities to new populations of students and to conduct and publish research related to online teaching and learning. Institutional factors related to faculty satisfaction fall into three categories: support, rewards, and institutional study/research. Faculty satisfaction is enhanced when the institution supports faculty members with a robust and well-maintained technical infrastructure, training in online instructional skills, and ongoing technical and administrative assistance. Faculty members also expect to be included in the governance and quality assurance of online programs, especially as these relate to curricular decisions and to the development of policies of particular importance to the online environment (such as intellectual property, copyright, royalties, and collaborative design and delivery). Faculty satisfaction is closely related to an institutional reward system that recognizes the rigor and value of online teaching. Satisfaction increases when workload assignments and assessments reflect the greater time commitment of developing and teaching online courses, and when online teaching is valued on par with face-to-face teaching in promotion and tenure decisions. A final institutional factor, crucial to recruiting, retaining, and expanding a dedicated online faculty, is a commitment to the ongoing study and enhancement of the online faculty experience.

Effective Practice Awards Submissions Due June 30

Submitted by janetmoore on May 27, 2010 - 2:06pm
New effective practices submitted by June 30 are eligible for awards to be presented at the July 21, 2010 Emerging Technologies for Online Learning Symposium Awards Presentation Luncheon.
Thousands visit Effective Practices for innovative practices supported by evidence.
Author Information
Author(s): 
Bevin Clare
Institution(s) or Organization(s) Where EP Occurred: 
Maryland University of Integrative Health
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

A peer-case-based learning course was converted to serve a blended learning clinical program. The course objectives drive a peer-to-peer learning experience, with faculty serving as “guides” in the process rather than as experts. This delivery emphasizes the impact of student-generated ideas and critical thinking and minimizes the common assumption that there is one “right” way to approach individualized client care. Peer-led experiences are the focus of this course, and the resulting learning environment is intended to serve both students and faculty.

Description of the Effective Practice
Description of the Effective Practice: 

A peer-to-peer, case-based learning course was converted to serve a blended learning clinical program. The conversion of the program containing a team-taught, case-based learning class was conducted between 2013 and 2014. Prior to conversion, this on-campus case-based learning course had been offered for close to a decade with minimal adaptation. Upon conversion of our clinical program to a blended program, this course underwent significant alteration with contributions from faculty and students previously involved in the course.

The course was held online while student clinicians were also onsite for their clinical internships. It ran as a 12-week course, with eight faculty members each leading specific weeks. In each week, a detailed biomedical case was provided by the faculty member and a “lead student” was assigned. The lead student would review the case prior to the start of the week and assign a detailed question about the case to each of their peers. By the middle of each week, each student would post to the forum a thorough response with adequate biomedical references, as well as a response to a peer’s posting. In practice, the forums became busy areas of conversation rather than the expected single response.

Concurrently, in a separate forum, the lead student would also post their perceived assessment of the case as well as their clinical care goals (both long and short term). Each non-lead student would then post their own highly specific recommendations for the client, including dietary, lifestyle, and therapeutic medicinal prescriptions. Peer critique and response postings were also required.

Faculty contributed to both forums, moderating the biomedical question responses and adding to, or correcting if needed, the student responses. They also commented on the goals and the plans and ultimately contributed their own strategies to the mix.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Please see the attached supportive documentation for evidence as well as the following summary.

Overview
The conversion of the program containing a team-taught, case-based learning class was conducted between 2013 and 2014. Prior to conversion, this on-campus case-based learning course had been offered for close to a decade with minimal adaptation. Upon conversion of our clinical program to a blended program, this course underwent significant alteration with contributions from faculty and students previously involved in the course.
A survey of the students in the newly designed course was conducted, as was a survey of the faculty team (many of them new to any type of alternative delivery), who were able to compare, from their perspective, the new adaptation to the old format.

Student Survey Results
Nine of the twelve students participating in this course responded to the survey. Overall, all students reported learning “Significantly” or “Immensely” from 1) their peers, 2) their own research, and 3) writing answers to the peer-generated questions. No students reported learning “Minimally” or “Nothing” in any of the categories listed, although two students reported learning only “Somewhat” from the faculty (as this is a peer-driven course, this is partially expected and is consistent with previous feedback).
Being assigned a question by their peers was rated an effective way to learn by 100% of respondents, and assigning questions to their peers was seen as an effective way to learn by 89% of them. Central to peer-based learning, 95% of students reported reading all or most of their peers’ submissions in all categories.
Overall, 100% of the students felt this was an “effective way to explore case studies”.

Faculty Survey
Six faculty who team-teach this course were surveyed on their perception of the course, often in relation to their prior experience with the F2F version. All faculty had previously taught in this course, generally for many years. For many of them, it was their first experience with alternative methods of delivery.

All faculty agreed that students were learning “Immensely” or “Significantly” from 1) their peers, 2) in their own research and 3) by writing their goals and plans. Answering peer-based questions was more controversial from faculty perception (although not from student perception).

Faculty were also surveyed about their own learning in the course, and 100% of them reported that they themselves learned from students’ answers to the peer-generated questions. More than half also reported learning from students’ questions, from the goals and plans, and from their own research and preparation.
In a direct comparison of the online and F2F versions of the course, faculty overwhelmingly reported that the online version was similar or better in all categories, particularly for biomedical understanding and peer-to-peer learning (the exception was student engagement, which was scored, on average, as similar to the in-class environment).

Overall, 100% of faculty felt that this was an effective way to explore case studies and 83% of them felt this was an overall “improved educational experience over the F2F version”.

How does this practice relate to pillars?: 

This practice relates to pillars in three areas.

Most importantly, the learning effectiveness of the peer-to-peer, case-based learning was critical. Compared to the F2F environment, students were more equally engaged and were given adequate time to bring in significant outside resources. In the surveys conducted, students and faculty overwhelmingly reported high scores for learning effectiveness.

Faculty satisfaction in this course was reflected by the surprisingly high number of faculty reporting significant learning beyond their more standard preparation for the course. In the survey, 100% of faculty reported that they learned from the replies to the peer-generated questions.

Lastly, student satisfaction was significant to this course, with 100% of students feeling this was an effective way to engage in case-based peer-to-peer learning.

Equipment necessary to implement Effective Practice: 

An LMS or comparable online discussion forum.

Estimate the probable costs associated with this practice: 

N/A

References, supporting documents: 

Please see the attached document for the student and faculty surveys.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Bevin Clare
Email this contact: 
bclare@muih.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Brichaya Shah
Author(s): 
David E. Stone
Author(s): 
Derrick Sterling
Author(s): 
Kathryn C. Morgan
Institution(s) or Organization(s) Where EP Occurred: 
Instructional Design Unit: Office of Faculty Support and Development
Institution(s) or Organization(s) Where EP Occurred: 
Southern Polytechnic State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Teaching Academy for Distance Learning (TADL) was created to provide faculty with a formal certification process. This process also helps faculty develop quality online courses. The program format was initially a 47-hour, 3-part training course for Southern Polytechnic State University (SPSU) faculty. This program began in fall 2008 and mostly focused on technology tools and their use in instruction. As part of an ongoing quality improvement process, TADL has continued to enhance the online learning capabilities at SPSU.

The program’s dependence on particular software has decreased as the program has evolved. The Instructional Design Unit (IDU) has worked with faculty in the development of online courses across academic disciplines. Its balance between pedagogy and technology allows the program format to change minimally as new technologies emerge. Key to the program is the team-based approach, which brings in expertise from instructional design, instructional technology, and digital media.

The TADL program is housed within the faculty-driven Center for Teaching Excellence (CTE). The CTE has explored a broad range of teaching and learning activities. It has built strong relationships across campus. Faculty value the CTE and their partnership helps validate the activities of the Teaching Academy for Distance Learning.

TADL evolved from a single face-to-face only program into three versions: face-to-face, online, and blended formats. Within the program, faculty from across campus are brought together to build online courses as well as discuss issues related to online learning. This has created a community of practice around online learning. This community supports informal learning networks within the institution and has allowed for growth in online learning.

Description of the Effective Practice
Description of the Effective Practice: 

TADL is not just a faculty development program. It is a truly hands-on program that allows faculty to learn new skills and acquire knowledge to design, develop, and deliver quality online courses at SPSU. Some of the components of this course include: weekly meetings (face-to-face or online), multimedia-rich learning modules, and interactive learning objects that address different learning styles. This course also includes assignments that allow faculty to apply their newly acquired skills. While completing the course, participants in TADL have full access to a diverse team of instructional designers, digital media specialists, and an instructional technology specialist.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Upon completion of TADL, all of the newly developed courses were sent out for external review. These reviews were completed by instructional design professionals working in the field. This process resulted in a 100% pass rate for the 5 years that TADL has been offered. As the program matured, it went from an informal process to a more formal review of the course objectives, modular objectives, and course alignment. This review includes a Subject Matter Expert reviewer from each participant’s department.

Course components are developed during the TADL program and feedback is given to participants as they progress. This continuous review, in combination with the external review of the developed courses, provides multiple opportunities for feedback.

TADL was developed for several reasons. First, single workshops and short term training sessions were not valued as significant professional development for faculty. Second, there was a demand for a more in-depth exploration of online learning and course development. Since developing TADL, this program has become recognized and supported by several deans and department chairs who insist that new hires go through this program. They also insist that the department adopt some of the practices that TADL instills in its participants.

How does this practice relate to pillars?: 

The TADL instructors’ practice aligns with the pillars of “Faculty Satisfaction”, “Learning Effectiveness”, and “Scale”. Faculty are empowered by the TADL experience and develop a support network with their peers. This allows for continuing discussion and learning outside of TADL. TADL is now offered in multiple formats (Hybrid, Fully-Online Instructor-led, and Self-Paced) to accommodate faculty schedules and learning preferences. Many of the resources for TADL are re-used between formats. As a result of the TADL experience, some departments have developed standard templates for their courses that have unified the student experience throughout their academic program. SPSU instructors exhibit learning effectiveness because after the successful completion of TADL, participants can continue to develop quality online courses that constantly improve based on the available technologies.

Equipment necessary to implement Effective Practice: 

Many of the resources necessary to build this program would already exist at most institutions. We have made use of a classroom equipped with computer stations and common university software. Infrastructure required includes the learning management system, as well as a desktop/web conferencing solution.

Estimate the probable costs associated with this practice: 

We provide a small stipend ($1,000) for TADL participation and departments often pay the faculty for the development of the course built as part of TADL, with the amount at the department’s discretion. This amount is often roughly the amount adjunct faculty are paid to teach courses.

References, supporting documents: 

An extensive description of the program, along with videos and TADL materials, is available online at:
http://spsu.edu/instructionaldesignsupport/TADL/index.htm

Contact(s) for this Effective Practice
Effective Practice Contact: 
Brichaya Shah
Email this contact: 
bshah@spsu.edu
Effective Practice Contact 2: 
Kathryn C. Morgan
Email contact 2: 
kmorgan@spsu.edu
Effective Practice Contact 3: 
David E. Stone
Email contact 3: 
dstone@psu.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Wendy Cowan
Author(s): 
Mark Gale
Institution(s) or Organization(s) Where EP Occurred: 
Athens State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Research in online student retention suggests that both time and relationships play a critical role in student persistence. Providing courses online does address convenience as it relates to student time constraints, but once inside the online classroom, it’s imperative that instructors find creative ways to deliver instruction that leads to student engagement. Students become more engaged when relationships are formed – with both the instructor and peers. Virtual classroom sessions, while seeming to be one solution for forming relationships, conflict with the convenience of taking an online class. To counteract this inconvenience, instructors teaching online and blended sections of the same course decided to create a learning community that offered multiple times and dates for virtual class sessions. The results have led to increased satisfaction and engagement for both students and faculty.

Description of the Effective Practice
Description of the Effective Practice: 

Across universities, some courses are offered each semester by multiple instructors. For example, English 101 might be offered across 10, 20, or even more sections, some of them in an online format.
In the Athens State College of Education, we offer three courses that are taken by all College of Education majors: Foundations of Education I, Foundations of Education II, and Technology and Media for Educators. Each semester we offer at least 10 sections of each of these courses, with over half of them in a blended or online format.
In an effort to help establish positive relationships in these online courses, we implemented weekly virtual classroom sessions, which isn’t a new idea. But because we have multiple instructors teaching sections of the same course, we went a step further and created a community calendar where each instructor posts the date, time, topic and entry URL to his/her virtual classroom sessions (See supporting documents). Instructors are encouraged to schedule their sessions at different times/days throughout each week so that students have many options for attending live sessions.
Instructors conduct virtual sessions at the scheduled time/date weekly and record the session. Archived sessions are made available inside courses for students who are unable to attend the live sessions. Upon completion of each session, instructors ask for the names of any “visiting” students. Visiting students’ names are then sent to the other instructors so that those students are given credit for attendance.
Inside each course, students are provided with a link to the community calendar and are informed that they may attend any session(s) offered. Students who were unable to attend are required to watch and summarize archived sessions.
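The shared calendar and cross-section attendance credit described above can be modeled with a simple data structure. The following Python sketch is hypothetical (the session fields and helper names are our own invention, not part of the practice as implemented in Google Docs); it shows how sessions posted by multiple instructors might be pooled and filtered so that a student in any section can find an upcoming session to attend.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Session:
    instructor: str
    topic: str
    starts_at: datetime
    url: str

# Hypothetical community calendar: every instructor posts sessions to one shared list.
calendar = [
    Session("Instructor A", "Classroom management", datetime(2014, 9, 8, 18, 0), "https://example.edu/a"),
    Session("Instructor B", "Lesson planning", datetime(2014, 9, 10, 12, 0), "https://example.edu/b"),
    Session("Instructor C", "Assessment basics", datetime(2014, 9, 12, 17, 0), "https://example.edu/c"),
]

def upcoming_sessions(calendar, now):
    """Return sessions a student could still attend, soonest first."""
    return sorted((s for s in calendar if s.starts_at > now), key=lambda s: s.starts_at)

def record_visitors(attendance, session, visiting_students):
    """Credit 'visiting' students so their own instructor can count their attendance."""
    for name in visiting_students:
        attendance.setdefault(name, []).append(session.topic)
    return attendance
```

In this sketch, each instructor appends sessions to one list rather than maintaining a per-section calendar, which is the essence of the practice: students see every section's sessions and pick whichever date/time fits.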

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

While the initial goal was to improve relationships within the course, the results have far exceeded our expectations. Students reported that they not only appreciate the availability of the live sessions, but have also stated that the sessions help them feel like they are in a “real classroom.” Students have also reported that they appreciate the ability to choose session times/dates that best meet their needs. This evidence of effectiveness was expected (See supporting documents).
Evidence of effectiveness that was not anticipated is the instructor’s perceptions of teaching and learning effectiveness and overall satisfaction. Prior to initiating the across-section virtual classroom sessions, instructors completed a pretest measuring faculty satisfaction. Upon completion of the first semester of implementation instructors completed the posttest. A review of the data indicates that faculty are more satisfied with their role in the course following the semester of weekly virtual classroom sessions (See supporting documents).

How does this practice relate to pillars?: 

Learning effectiveness: Instructors felt more empowered as a result of this effective practice. Data suggest that instructors perceived an increased contribution to student learning. One hundred percent of instructors surveyed felt that students had a valuable learning experience due to the instructor’s role in the class. When comparing how much content could be taught in an online/blended course versus a traditional course, most instructors (73%) reported that they could now teach the same amount of content or more. Seventy-six percent of the instructors reported that they were more satisfied with their online/blended course the semester following the virtual classroom implementation. Most instructors (85.7%) believed that the virtual classroom sessions improved student success (See supporting documents).
Faculty satisfaction: Posttest data from the virtual classroom implementation suggest that faculty are pleased with teaching online/blended courses. On the pretest survey, 86.7% of instructors reported that they were very satisfied teaching an online/blended course. Following the virtual classroom sessions implementation, 76.9% of instructors reported being more satisfied than they were the previous semester. Considering that most instructors had already reported being very satisfied, this is a notable finding regarding faculty satisfaction (See supporting documents).
Student satisfaction: Student surveys indicate that students are satisfied with the availability of the virtual classroom sessions. Approximately 55% of students reported that the virtual classroom sessions were beneficial (See supporting document).

Equipment necessary to implement Effective Practice: 

We implemented this effective practice with Blackboard Collaborate and Blackboard Wimba, which are virtual classroom platforms. Google Docs was used for the community calendars.

Estimate the probable costs associated with this practice: 

While neither Wimba nor Collaborate is free, there are other virtual classroom options that are free or low cost. For example, Google Hangouts could be used for virtual classroom sessions.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Wendy Cowan
Email this contact: 
wendy.cowan@athens.edu
Effective Practice Contact 2: 
Mark Gale
Email contact 2: 
Mark.Gale@athens.edu
Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Rick Lumadue, PhD
Author(s): 
Rusty Waller, PhD
Institution(s) or Organization(s) Where EP Occurred: 
Texas A&M University-Commerce
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Programmatic student-learning outcomes of an online master’s degree program at a regional university in Texas were assessed in this study. An innovative use of emerging technology provided a platform for the study, and the Astin Model provided the framework for the evaluation. This study has provided a model for conducting well-informed instructional and programmatic assessments of student-learning outcomes. The results demonstrated that emerging technology can provide a platform for students to both showcase and preserve their ability to meet programmatic student-learning outcomes.

Description of the Effective Practice
Description of the Effective Practice: 

This online master’s degree program is taught in a fully interactive online format with a primarily asynchronous delivery model. Asynchronous activities used in the program included threaded discussions, video and audio presentations, written lectures linked to video and audio presentations embedded in the course management system, VoiceThreads, faculty-developed MERLOT web pages created using the MERLOT Content Builder, e-textbooks, and more.
The Astin Model (1993) provided a framework for this assessment. In the Astin Model, quality education not only reaches established benchmarks but also is founded upon the ability to transition students from where they are to reach intended competencies. An innovative use of MERLOT Content Builder combined with emerging technology provided a means for assessing the seven student-learning outcomes in an online master’s program at a regional university in Texas.
Two full-time faculty and one adjunct faculty used rubrics to evaluate each of the programmatic student-learning outcomes by assessing a random sample of student assignments from courses.
The goal of this study was to help students reach the intended learning outcomes for metacognition, digital fluency, communication, cultural fluency, global fluency, servant leadership, and commitment to lifelong learning. Definitions of these learning outcomes are provided here. Students will evidence metacognition by demonstrating the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading. Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations. Students will be able to communicate ideas and content to actively engage participants. Students will evidence understanding of generational and cultural learning styles. Students will develop instructional materials appropriate for a global perspective. Students will practice the principles of servant leadership as espoused by Robert Greenleaf in his work titled The Leader as Servant (1984). According to Greenleaf, “The servant-leader is servant first. It begins with the natural feeling that one wants to serve first. Then conscious choice brings one to aspire to lead.” Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.
Digital education presents many challenges. Barnett-Queen, Blair, and Merrick (2005) identified perceived strengths and weaknesses of online discussion groups and subsequent instructional activities. Programmatic assessment is required for all institutions accredited by the Council for Higher Education Accreditation or the US Department of Education. Walvoord (2003) indicated that good assessment should focus on maximizing student performance. The following questions rise to the forefront: (1) Have graduates mastered programmatic expectations? (2) What relationships exist between student performance and other factors? (3) How can faculty improve the program based upon analysis of student performance? Walvoord further stresses the importance of direct assessment in determining student performance. Indirect measures may provide evidence of student learning, but direct assessment is widely viewed as more valid and reliable.
Brandon, Young, Shavelson, Jones, Ayala, Ruiz-Primo, and Yin (2008) developed a model for embedded formative assessment. The model was collaborative and stressed embedded assessment. Their study stressed the difficulties associated with broad-based collaboration given the difficulties of formally identifying partners and spanning large geographic distances. Price and Randall (2008) demonstrated the importance of embedded direct assessment in lieu of indirect assessment. Their research revealed a lack of correlational fit between indirect and direct assessment of the same aspect of student-learning with the same course in a pre- and post-test design. They documented a difference between student perceived knowledge and actual knowledge. These findings further underscore the importance of direct assessment of student-learning. Walvoord’s (2003) findings further indicated the need for embedded direct assessment of student-learning owned and supported by those who will implement the change. Those implementing change would include program faculty and students.
Gardner (2007) found that education has long wrestled with defining and assessing lifelong learning. Though loosely defined as the continued educational growth of the individual, lifelong learning is rapidly rising to the forefront of 21st-century education, assuming a more prominent place than it held in the 20th century. Brooner (2002) described the difficulty of assessing the intention to pursue learning beyond the completion of a program. Intention and subsequent performance are affected by many different factors including, but not limited to, normative beliefs and motivation. Educational programs have often been encouraged to avoid assessment of behavior beyond the point of graduation, as such behavior has been viewed as beyond the control of program educators (Walvoord, 2003). The question arises as to the importance of future behavior as an indicator of current learning.
Astin (1993) pointed out that educators are inclined to avoid assessment of the affective domain, viewing it as too value-laden. Accordingly, the cognitive domain became the de facto assessment area, though affective assessment more closely paralleled the stated aims and goals of most institutions of higher education. The avoidance of assessment in the affective domain is well documented by Astin. The advent of social media tools coupled with e-portfolios offers some intriguing possibilities for assessment in the affective behavioral domain. Astin pointed out that a change in the affective domain should translate into changed behavior.
Secolsky and Wentland (2010) found many advantages to portfolio assessment that transcend regular assessment practices by providing a glimpse into non-structured behavioral activities. Behavior beyond the classroom can be captured and documented within a properly designed portfolio. Behavior that has not been directly observed by the teacher can be measured in light of portfolio submissions via a broad collection of relevant and targeted information. Established performance criteria can be assessed to measure student-learning and determine specific areas for programmatic improvement. Though Secolsky and Wentland point out that reliability and validity concerns still exist with portfolio measurement, they concur that portfolio assessment potentially gauges authentic student performance outside the educational environment. With the development of a portfolio transportable beyond program enrollment and across the life experience, the opportunity exists to assess the impact of the instructional experience upon real-time student performance. Evaluation of lifelong portfolios promises to provide meaningful insight into the real-life impact of the educational experience. Astin (1993) viewed changed behavior over time as the real evidence of affective enlightenment.
An interesting finding from this study was the creative manner in which some students layered, or nested, other Web 2.0 technologies into their MERLOT web pages. Examples included embedded student-developed VoiceThread presentations, embedded open-ended discussion VoiceThreads used to promote participation and feedback, embedded YouTube videos, embedded Prezis, and the like.
The integration of MERLOT GRAPE Camp peer review training into this master’s degree program has provided an additional platform for further research relative to the assessment of all seven programmatic learning outcomes. For example, metacognition may be assessed as it relates to MERLOT’s peer reviewers serving as content experts in assessing materials that pertain to one’s field. Communication may be assessed through interaction with peers and peer reviews. Digital fluency is obviously what is required to contribute to MERLOT. Cultural fluency may be demonstrated through peer reviewing submissions from MERLOT’s international community of partners. Global fluency may be measured through the development and contribution of appropriate content for use in a global community of learners. Servant leadership is reflected in MERLOT’s motto, “Give a Gift, not a Burden!” (Gerry Hanley, 2010). Finally, the development of students into lifelong learners will help to establish the identity of the program. Student performance outside of the program is one of the best measures of student-learning, and the MERLOT Content Builder, along with MERLOT peer reviews, is a tremendous platform for measuring student-learning outcomes.
Lifelong learning may be assessed by current and former students' contributions of materials to MERLOT and by those providing peer reviews of materials contributed to MERLOT. As a benefit of being a MERLOT partner, the dashboard report provides information on contributions made by members of the partner organization. Contributions and/or peer reviews completed by students who have graduated from the program will be recorded in the dashboard report, making it a tremendous tool for measuring commitment to lifelong learning. Ultimately, this study has demonstrated that the MERLOT platform combined with emerging technologies is integral to assessing student-learning outcomes in an online master's program at a regional university in Texas. Other online degree programs should seriously consider the MERLOT Content Builder's potential to help them assess student-learning outcomes.
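As a rough illustration of how dashboard-style contribution data could be tallied as a lifelong-learning signal, the sketch below counts contributions and peer reviews made by program graduates. The record format, member names, and contribution types are all invented for illustration; the actual MERLOT partner dashboard may structure its report quite differently.

```python
from collections import Counter

# Hypothetical dashboard-style export of contributions by member (invented
# data; the real MERLOT partner dashboard may differ in structure).
contributions = [
    {"member": "alum_a", "type": "material"},
    {"member": "alum_b", "type": "peer_review"},
    {"member": "alum_a", "type": "peer_review"},
    {"member": "student_c", "type": "material"},
]
graduates = {"alum_a", "alum_b"}  # members who have completed the program

# Count contributions made by graduates only, as one lifelong-learning signal.
by_graduate = Counter(r["member"] for r in contributions if r["member"] in graduates)
print(dict(by_graduate))  # {'alum_a': 2, 'alum_b': 1}
```

A tally like this, run against each year's export, would let a program track whether alumni keep contributing after graduation.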

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The Online Master of Science in Global eLearning equips specialists in education for practice in public education, private education, business, industry, and non-profit organizations. Learning and technology are intertwined as we develop the next generation of enhanced training, development, and teaching to engage learners with key components of instructional technology. Technology provides access to all forms of education and this program will teach educators how to implement technology across curricula and classrooms of all kinds. With a blend of theory and technical skills, this program will prepare teachers and corporate trainers alike.

Metacognition – Students will demonstrate the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
Five journal entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor's and master's faculty using the Global eLearning Metacognition rubric. Scores will be deemed acceptable at an average of 4.0 or higher on a 5-point scale in each of the areas of context & meaning, personal response, personal reflection, and interpretive skills.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Context & Meaning 4.27
Personal Response 4.13
Personal Reflection 4.40
Interpretive Skills 4.47

All standards were met.
Though all standards were met, the faculty noted that the personal response section scored lowest at 4.13. Accordingly, the course EDUC 595 Research Methodology was expanded to include more opportunities for students to provide self- and peer-evaluation feedback on projects and assignments. Two assessments were recommended for AY 2013-2014: one course will be assessed in the fall and one in the spring.
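The sampling-and-threshold procedure used for each outcome above can be sketched in a few lines: select five artifacts at random, average each rubric area across all ratings, and flag any area that falls below the acceptability cutoff. All names, the seed, and the scores below are illustrative, not the program's actual data.

```python
import random
from statistics import mean

def sample_artifacts(artifacts, k=5, seed=0):
    """Randomly select k artifacts for rubric review (seeded for repeatability)."""
    return random.Random(seed).sample(artifacts, k)

def evaluate(scores_by_area, threshold):
    """Return per-area averages (rounded to 2 places) and areas below threshold."""
    averages = {area: round(mean(s), 2) for area, s in scores_by_area.items()}
    unmet = sorted(a for a, avg in averages.items() if avg < threshold)
    return averages, unmet

if __name__ == "__main__":
    entries = [f"journal_{i}" for i in range(1, 21)]
    sample = sample_artifacts(entries)  # 5 of 20 journal entries
    # Fifteen ratings per area: 3 reviewers x 5 sampled entries (invented values).
    ratings = {
        "Context & Meaning": [4, 4, 5, 4, 5, 4, 4, 4, 5, 4, 4, 5, 4, 4, 4],
        "Personal Response": [4, 4, 4, 4, 4, 4, 5, 4, 4, 4, 4, 4, 5, 4, 4],
    }
    averages, unmet = evaluate(ratings, threshold=4.0)
    print(averages, unmet)  # both areas average at or above 4.0, so unmet is empty
```

The same `evaluate` helper works for the 50-point and 4-point rubrics later in this report; only the ratings and threshold change.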

Communication – Students will communicate ideas and content to actively engage participants.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the full-time bachelor's and master's faculty. Scores will be deemed acceptable with an average of 42 on a 50-point scale in each of the five areas of purpose, organization, content, language, and voice & tone. The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Purpose 45.33
Organization 46.67
Content 46.00
Language 44.00
Voice & Tone 44.67
Technology 45.33

All standards were met. Nevertheless, the faculty noted that language scored lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 515 Intercultural Education to give students an opportunity to develop their language skills on a project designed to heighten sensitivity to language that might be offensive in other cultures.

Two assessments were recommended for AY 2013-2014.

Digital Fluency - Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Assessment of Digital Student Presentation Rubric by the full-time bachelor's and master's faculty. Scores will be deemed acceptable with an average of 45 on a 50-point scale in the area of technology.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Technology 45.33

The standard was met.
The faculty noted that the students tended to use more familiar software and avoid emerging software. Accordingly, EDUC 510 Utilizing Effective Instructional Technology was modified to require the use of at least one Web 2.0 software program to complete an assignment.

The faculty will conduct two evaluations in AY 2013-2014.

Cultural Fluency – Students will evidence understanding of generational and cultural learning styles.

Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Cultural Fluency Rubric by the full-time bachelor's and master's faculty. Scores will be deemed acceptable with an average of 3.0 on a 4-point scale in the areas of knowledge & comprehension, analysis & synthesis, and evaluation.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 3.53
Analysis & Synthesis 3.07
Evaluation 3.67

The standard was met. The faculty noted that analysis & synthesis scored lowest. Accordingly, the curriculum for EDUC 552 Global Fluency was expanded to include group projects on the education systems of other cultures.

The faculty will also conduct two evaluations in AY 2013-2014.

Global Fluency – Students will develop instructional materials appropriate for a global perspective.

Five group project entries will be selected at random from a course offered in Summer 2012. These will be evaluated by the full-time bachelor's and master's faculty using the Global eLearning Global Fluency Rubric. Scores will be deemed acceptable at an average of 2.8 or higher on a 4-point scale in each of the areas of knowledge & comprehension, application, and evaluation.

The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 2.87
Application 3.00
Evaluation 2.87

The standards were met.

Faculty found student performance in this area to be adequate. Some challenges were noted in the use of stereotypes when identifying people from other cultures. EDUC 515 Intercultural Education will be expanded to include a project in which students interview someone from a different culture to discover differing worldviews and share these findings in a forum with classmates.

Servant Leadership – Students will practice the principles of servant leadership as espoused by Robert Greenleaf.

Five student group-project self-assessment packets will be selected at random from a course offered in Fall 2012. These will be evaluated using the Global eLearning Servant Leadership Rubric by the full-time bachelor's and master's faculty. Scores will be deemed acceptable with an average of 40 on a 50-point scale in each of the four areas of servant leadership, strategic insight & agility, building effective teams & communities, and ethical formation & decision making.

The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013. The external reviewer was added to strengthen the review. Results were as follows:

Servant Leadership 41.33
Strategic Insight & Agility 39.33
Building Effective Teams & Communities 44.00
Ethical Formation & Decision Making 43.33

The standard was NOT met for Strategic Insight & Agility.

Faculty noted problems with the effectiveness of feedback in the peer-evaluation assignment. Accordingly, the group peer-assessment process has been expanded to include MERLOT GRAPE Camp to provide training on conducting peer evaluations. All students will be required to complete MERLOT GRAPE Camp training. These changes will be enacted in all new course sections.

Commitment to Life-Long Learning – Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.

Five portfolio entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor's and master's faculty using the Global eLearning Commitment to Life-long Learning rubric. Scores will be deemed acceptable at an average of 3.0 or higher on a 4-point scale in each of the areas of production of educational materials, publications, and presentations, including personal response, personal evaluation, and interpretive skills.
The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013. The external reviewer was added to strengthen the review. Results were as follows:

MERLOT Web Pages 3.4
Presentations 3.8
Peer Evaluations 3.60

All standards were met. Nevertheless, the faculty noted that MERLOT web pages scored lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 528 Intro. to Presentation Design to make the MERLOT web page a requirement rather than an option.

Two assessments were recommended for AY 2013-2014.

How does this practice relate to pillars?: 

1) Leveraging the MERLOT Content Builder with emerging technology to assess programmatic student-learning outcomes is scalable because it encourages more online instructors and instructional designers to integrate this model to measure the effectiveness of assignments in meeting goals for institutional effectiveness planning.

2) Increases access by providing open access via MERLOT's Content Builder combined with emerging technology to showcase learning outcomes that students and faculty can assess regardless of location, as long as they have an internet connection.

3) Improves faculty satisfaction by providing faculty with open access to evaluate student assignments and assess programmatic student-learning outcomes for institutional effectiveness planning. Since this model was used to complete a recent Institutional Effectiveness Plan for an online master's degree program in preparation for a regional accreditation visit, other instructors can easily replicate it to evaluate their programs.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by empirical data from assessments of random samples of student assignments.

5) Promotes student satisfaction by providing valuable opportunities for interaction with the instructor and other students. Students work together on group projects for both synchronous and asynchronous presentations. Students are also assigned group and individual projects to evaluate the work of their peers and provide feedback. Rubrics are embedded in the grade book of the LMS to evaluate student assignments, and an evaluation tool for the programmatic student-learning outcome tied to each assignment is included in the grade book to assess the level of student understanding. Students regularly comment on how valuable these practices are to their learning experience.

Equipment necessary to implement Effective Practice: 

The only equipment strictly necessary is an internet connection and an LMS. In our program, students also used Camtasia, QuickTime, and Captivate to create videos for some of their individual projects. Group projects were completed using Google+ Hangouts, Skype, VoiceThread, and Adobe Connect. Students also created MERLOT web pages, MDL 2 courses, and digital portfolios.

Some of the tools we used have costs associated with them. Here is a list of some of them:

• Synchronous tools: Adobe Connect, Google Hangouts, Google chats, Skype
• Asynchronous tools: VoiceThread, MERLOT Content Builder, Prezi, MERLOT GRAPE Camp, Peer Review Workshop, and discussion forums in the LMS
• Reflective tools: Journals, self-assessments, and digital portfolios

Estimate the probable costs associated with this practice: 

The only additional costs would be optional and would involve the use of some emerging technologies that are not open source. All other resources used in this project were open source, and we did not incur additional costs using them. There was essentially no budget for this project.

References, supporting documents: 

Astin, A. (1993). Assessment for Excellence. Westport, CT: Oryx Press.

Barnett-Queen, T., Blair, R., & Merrick, M. (2005). Student perspectives of online discussions: Strengths and weaknesses. Journal of Technology in Human Services, 23(3/4), 229-244.

Brandon, P., Young, D., Shavelson, R., Jones, R., Ayala, C., Ruiz-Primo, M., & Yin, Y. (2008). Lessons learned from the process of curriculum developers’ and assessment developers’ collaboration on the development of embedded formative assessments. Applied Measurement in Education, 21, 390-402.

Gardner, P. (2007). The ‘life-long draught’: From learning to teaching and back. History of Education, 36(4-5), 465-482.

Greenleaf, R. A. (2008). The Servant as Leader. Westfield, IN: The Greenleaf Center for Servant Leadership.

Price, B., & Randall, C. (2008). Assessing learning outcomes in quantitative courses: Using embedded questions for direct assessment. Journal of Education for Business, 83(5), 288-294.

Secolsky, C., & Wentland, E. (2010). Differential effect of topic: Implications for portfolio assessment. Assessment Update, 22(1). Wilmington, DE: Wiley Periodicals.

Walvoord, B. (2003). Assessment in accelerated programs: A practical guide. New Directions for Adult & Continuing Education, 97, 39-50.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Rick Lumadue
Email this contact: 
proflumadue@gmail.com
Effective Practice Contact 2: 
Rusty Waller
Email contact 2: 
rusty.waller@tamuc.edu
Award Winner: 
2013 Sloan-C Effective Practice Award
Author Information
Author(s): 
Kelvin Thompson, Ed.D.
Author(s): 
Baiyun Chen, Ph.D.
Institution(s) or Organization(s) Where EP Occurred: 
University of Central Florida
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The faculty development programs, instructional designers, and individual teaching faculty of the University of Central Florida have benefited from integrating into their work the online teaching practices codified in the Teaching Online Pedagogical Repository (TOPR). Faculty development programs, instructional designers, and teaching faculty at other institutions can just as readily benefit from integrating TOPR entries into their work as enhancements to existing faculty development strategies. TOPR is freely available online under the terms of a Creative Commons BY-NC-SA 3.0 Unported license at http://topr.online.ucf.edu.

Description of the Effective Practice
Description of the Effective Practice: 

The University of Central Florida (UCF) is one of the fastest-growing universities in the country, currently ranked as the second-largest public institution in the US with approximately 60,000 students. To meet students’ needs, over 30% of the university's student credit hours are generated by online and blended courses, and nearly three-fourths of all UCF students take one or more online courses every year. As a result, the need for faculty development for online teaching has been increasing in recent years. The Center for Distributed Learning (CDL) at UCF provides a variety of faculty development offerings to meet these needs, including semester-long training programs, webinars, individual instructional design consultations, self-directed learning objects, and more. It is a challenge to keep the professional development materials updated and streamlined. Further, as the number of individual faculty teaching online and blended courses at UCF and the associated number of instructional designers serving them have grown, it has been challenging to identify and disseminate emergent effective teaching practices. One initiative, the Teaching Online Pedagogical Repository (TOPR), is an effort to address these challenges: http://topr.online.ucf.edu.

The Teaching Online Pedagogical Repository (TOPR) is a public resource for online faculty and instructional designers seeking inspiration from online teaching strategies that have proven successful for others. At UCF, we took the teaching practices endorsed in our professional development programs and featured them on TOPR. These strategies are updated regularly by collaborating contributors, and we link to them from our faculty development programs. TOPR has also become a handy resource for UCF's instructional designers to use in individual consultations with faculty or in email responses. After instructors hear about TOPR, they bookmark the resource and return to these strategies when they need new ideas for online teaching.

In the Teaching Online Pedagogical Repository (TOPR), each entry describes a strategy drawn from the pedagogical practice of online teaching faculty, depicts this strategy with artifacts from actual courses, and is aligned with findings from research or professional practice literature. Emphasis is placed upon practices that are impactful and replicable. TOPR entries are tagged with relevant keywords to aid discovery of relevant content. Additionally, site visitors may find entries by searching or by browsing a topical index. The index of published teaching practices from TOPR is available at: http://topr.online.ucf.edu/index.php/Pedagogical_Practice.

The Teaching Online Pedagogical Repository (TOPR) is offered within a wiki, which makes contribution and collaboration easy, and all entries are provided as open resources under the terms of a Creative Commons license. Thus, faculty development programs, instructional designers, and faculty from other institutions can readily adopt and adapt TOPR entries for their needs.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Specific entries from the Teaching Online Pedagogical Repository (TOPR) are linked to from within UCF’s internal faculty development materials (e.g., LMS-based materials for UCF’s award-winning IDL6543 faculty development course) and external resources (e.g., the publicly accessible http://BlendedLearningToolkit.org site presented by UCF and the American Association of State Colleges and Universities). UCF instructional designers report sharing resources from TOPR routinely in consultations with teaching faculty. Anecdotally, some individual instructors have noted consulting TOPR for ideas. However, it is perhaps more telling to look at some of the evidence that has emerged outside of the UCF context.

The Teaching Online Pedagogical Repository (TOPR) was presented to a national audience for the first time at the 2011 Sloan-C ALN Conference. The theoretical underpinning and development background were presented in that presentation and may be reviewed at: http://ofcoursesonline.com/?p=132. Since then, promotion of TOPR has continued, and an editorial board comprised of leaders in online and blended learning from the US and Canada has been formed. (See http://topr.online.ucf.edu/index.php/Board.) The following evidence of TOPR use has emerged since that time.

While the exact number of institutions and individual faculty connecting to TOPR is infeasible to determine, it is clear that TOPR is proving useful beyond UCF. For instance, some other institutions include links to TOPR in their online resources. (See http://www.wabashcenter.wabash.edu/resources/teach-web-result.aspx?pid=3..., http://teach.granite.edu/?p=8983, and http://edtech.uvic.ca/edci335/wiki.) The statistics page for one custom URL for one TOPR entry reveals that this entry has been accessed hundreds of times from multiple countries. (See https://bitly.com/discussion_rubrics+.) The easily citable TOPR entries have even appeared in research articles (e.g., http://www.westga.edu/~distance/ojdla/winter154/eskey_schulte154.html) and at least one dissertation (i.e., http://ufdcimages.uflib.ufl.edu/UF/E0/04/43/67/00001/JOHNSON_M.pdf).

As of August 2013, the most popular of the 33 public TOPR entries (e.g., related to discussion rubrics, social networking, and discussion prompts) have each received tens of thousands of page views. (See http://topr.online.ucf.edu/index.php/Special:Statistics.)

The evidence above would seem to suggest that the practice of leveraging the online teaching practices codified in TOPR for use in faculty development materials, instructional designer consultations, or individual instructor inquiry is a practice that is both replicable and potentially effective in supporting the Sloan-C pillars.

How does this practice relate to pillars?: 

Integrating practices from the Teaching Online Pedagogical Repository (TOPR):
1) Enables scale by allowing more online instructors and instructional designers to learn about effective online strategies.

2) Increases access by providing an open access online compendium for faculty development. Other institutions can link to TOPR in their professional development programs; instructional designers can recommend strategies to instructors with concrete examples; instructors can also use TOPR as a just-in-time resource whenever they need new strategies for their classroom.

3) Improves faculty satisfaction by providing faculty with open access to professional development resources that they can use in their daily classroom teaching. Since each strategy includes a detailed description and artifacts to support how the strategy is used in real classes, instructors can easily replicate these strategies in their own teaching.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by literature.

Equipment necessary to implement Effective Practice: 

There are no extraordinary equipment costs associated with this practice. UCF maintains the Teaching Online Pedagogical Repository (TOPR). Contributors offer entries under the terms of a Creative Commons BY-NC-SA 3.0 Unported license. Other institutions, instructional designers, and instructors can use TOPR with computer and internet access.

Estimate the probable costs associated with this practice: 

Costs associated with replicating this practice are negligible and equate to the opportunity costs of one's time in searching for practices of relevance within the Teaching Online Pedagogical Repository web site and applying them to one's work.

References, supporting documents: 

See attached and the links included within the evidence section.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Kelvin Thompson, Ed.D.
Email this contact: 
kelvin@ucf.edu
Effective Practice Contact 2: 
Baiyun Chen, Ph.D.
Email contact 2: 
baiyun.chen@ucf.edu
Author Information
Author(s): 
Dr. Suzanne Minarcine
Author(s): 
Dr. Jill Fuson
Institution(s) or Organization(s) Where EP Occurred: 
American Public University System
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Faculty members in traditional institutions have long provided opportunities for student collaboration, particularly with graduate students. This has been part of the traditional graduate culture. In the School of Business at APUS, the need was identified for faculty/student collaboration in a supportive and nurturing environment. A multi-faceted approach is currently being used to successfully encourage faculty research with student participation. The APUS Research Special Interest Group is rapidly growing, and its positive results are evident.

Description of the Effective Practice
Description of the Effective Practice: 

A four-pronged approach is used to facilitate faculty involvement in research, as well as student partnerships with faculty. The cornerstone of the APUS Research Special Interest Group is the monthly Adobe Connect meeting, facilitated by Dr. Suzanne Minarcine and open to faculty and staff throughout the university. Supporting this group are a Facebook page and LinkedIn group, as well as the APUS Faculty Research Grants. Faculty are encouraged to invite interested students to participate as well. The monthly meetings provide an opportunity for faculty members to present their research ideas, obtain advice on methodology or research questions, gain helpful critiques of their research, and showcase their papers and presentations. They also provide an opportunity for novice faculty researchers and select students to gain valuable research experience by partnering with experienced faculty researchers. The APUS Research SIG Facebook page provides links to current calls for papers in a variety of disciplines.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

There have been no formal studies to date on the effectiveness of the APUS Research Special Interest Group; however, membership has grown to over 250 participants from across the university. Since the group began, 13 proposals have been or are being written that will be submitted to the IRB within the next 60 days.

How does this practice relate to pillars?: 

The APUS Research Special Interest Group relates to Faculty Satisfaction, Learning Effectiveness, and Student Satisfaction.

Online faculty often report feelings of isolation and a lack of support for research activities. The APUS Research Special Interest Group offers the opportunity for collaboration and support in research activities which benefit both the university and the individual faculty member. Participating faculty report a greater feeling of connectedness with other faculty and a great sense of satisfaction from the recognition.

The APUS Research SIG offers students valuable research experience and the opportunity to develop 1:1 relationships with faculty mentors. Students report higher satisfaction because of the closer relationship and interdependence that develops, and have reported overall improvements in their writing and study skills. Participation in the group has resulted in job opportunities for students who have participated.

Equipment necessary to implement Effective Practice: 

Adobe Connect, telephone access.

Estimate the probable costs associated with this practice: 

Minimal.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. Suzanne Minarcine
Email this contact: 
sminarcine@apus.edu
Effective Practice Contact 2: 
Dr. Jill Fuson
Email contact 2: 
JFuson@apus.edu
Author Information
Author(s): 
Dr. Susann Rudasill, Director, FSU Office of Distance Learning
Author(s): 
Dr. E Shen, Asst. Dir., Instructional Development
Author(s): 
Dr. Annette Jones, Instructional Development Faculty
Author(s): 
John Braswell, Instructional Development Faculty
Author(s): 
FSU Office of Distance Learning Instructional Development Unit
Institution(s) or Organization(s) Where EP Occurred: 
Florida State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The submission by Florida State University (FSU) discussed the use of workshops to assist faculty in design and development of their online courses. It is a quality-centered approach. Specifically, we adopted Quality Matters (QM) as the evaluation criteria for online courses and applied it to workshop topics. Examples of QM applications such as workshop content, syllabus checklist, course introduction video, and course template are provided in the submission. As a result of these practices, we received high faculty satisfaction scores and increased the quality of the online courses.

Description of the Effective Practice
Description of the Effective Practice: 

Background
Online learning has a significant presence at Florida State University (FSU). As one of the leading public universities in Florida, FSU offered 495 online courses in the 2012-2013 academic year. Over 19,000 students took at least one online course during this time frame. The Office of Distance Learning (ODL) at FSU is responsible for design, development, quality assurance, and administration of online courses offered at the university. The process for developing an online course begins when faculty submit a proposal for online course development. Once the proposal is accepted, an instructional development faculty member from ODL is assigned to the project and serves as both consultant and project manager to assist in the course development, which can occur over multiple semesters. The demand for online course development support has been increasing. For example, in Summer 2013, there were 23 online course development projects supported by five instructional development faculty members from ODL. In the course development process, department faculty usually go through a series of workshops to be trained on the necessary knowledge and skills to design and develop online courses. In that respect, the workshops play a key role in ensuring the success of online course development.

Four workshops were created to train faculty to develop online courses. The workshop series covers the basics of online course design, focusing on quality standards. Topics cover strategies and best practices in online course design, online course organization and content delivery, online communication and engagement, and online course assessment and evaluation. Lab-time and Blackboard (the LMS FSU uses to deliver online courses) workshops are offered that focus on the best uses of technology and course tools. ODL delivers these workshops in a blended face-to-face and online format.

Quality is the center of the workshop series. Specifically, we adopted Quality Matters (QM) [See attached Quality Matters Rubrics] as the evaluation criteria for online courses and connected it to workshop topics. The following is a list of examples that show how the workshops and associated resources have incorporated quality standards.

1. The content of the workshop

Faculty course development workshops are at the core of our efforts to insure quality in FSU’s online courses. Once a proposal for online course development has been approved, the faculty are invited and encouraged to participate in a series of workshops based on the Quality Matters rubric. Each theme addresses the different standards of the QM rubric. The main themes of the workshops are:

  • Online Course Design (QM Standards 1. Course Overview and Introduction and 7. Learning Support)
  • Online Course Organization and Content Delivery (QM Standards 2. Learning Objectives, 4. Instructional Materials, 6. Course Technology and 8. Accessibility)
  • Communication and Engagement (QM Standard 5. Learner Interaction and Engagement)
  • Assessment and Evaluation (QM Standard 3. Assessment and Measurement)

Specific topics within these themes vary based on the experiences and needs of participating instructors. Faculty are surveyed prior to the workshop to gauge their knowledge of and experience with online pedagogy and technology. The information from the survey, along with information gleaned from course quality reviews of other online courses, forms the basis for the topics selected. For example, ODL instructional development faculty have noted that instructors have the most difficulty with module-level objectives and organization, strategies for online interaction, media selection and development, and creating rubrics. As a result, these topics receive more in-depth discussion.

The sessions involve a mix of online and face-to-face instruction and participation. All participants are enrolled in a workshop course site that contains a variety of resources to assist them in course development as well as instructional modules based on the workshop themes. Each module contains activities such as readings and discussion board posts that participants complete before and after each session. The “homework” is intended to help faculty build their course sites while also experiencing the role of student in the online environment.

The face-to-face workshops contain a mix of presentations, discussion, and active learning so that participants have a chance to “DO” something in their course. The workshops are then followed by an optional lab time in which participants can work on their courses with one-on-one assistance from the ODL development team, which consists of instructional development faculty, the media group, and Blackboard support.

Online instructors can also participate in separate Blackboard workshops that reinforce the concepts and strategies discussed in the development workshops and offer further opportunities to practice using the technology.

2. The syllabus checklist

The Office of Distance Learning has created a number of additional resources based on feedback from our online instructors. The new resources are intended to help instructors incorporate quality standards while easing the sometimes significant workload of course design. These resources are utilized in the workshops but are also designed to serve as stand-alone materials. The FSU standard syllabus template was modified to incorporate elements of the QM standards as well as other items unique to online learning (See attached FSU QM adapted syllabus checklist). In feedback, faculty mention that they appreciate having a checklist that helps them easily identify necessary changes to their syllabus while also checking off elements of the QM rubric.

3. The Introduction video
A video was produced to show department faculty how to build an effective course introduction video. It covers various aspects of the Quality Matters standards [QM 1.1, 1.2, 1.7, 2.4, 4.3, 4.6, 5.2, 5.4, 7.1]. In the video, faculty are guided to include key information related to the course overview and introduction, learning objectives, instructional materials, learner interaction and engagement, and learner support in their own course introduction videos. A questionnaire was also designed to help instructors put all of this information together. Please click on the link below to view the video.
http://distance.fsu.edu/videos/instruction/creating-course-introduction-videos.php

4. The Blackboard course template
FSU uses Blackboard as the primary delivery tool for online courses. Faculty often struggle to organize their online materials in a way that is meaningful to students. ODL also noticed that some QM standards were lacking in courses and considered how the course design process could be made easier for faculty by providing templates. A “Course Master” shell was created specifically for online courses. Course development sites are created for faculty from this online shell, which faculty can then edit. Here are some example features of the course template that connect to the QM standards.

The course introduction page

Each site has a start-up page with an introduction video, course description, course objectives, course prerequisites, course materials, and instructor information [QM 1.1]. A “get started page wizard” was also developed to facilitate building a course welcome page (http://distance.fsu.edu/docs/getstarted/). Please see the attached screenshot of the “start-up” page.

The lesson module page

A template is set up for each lesson module inside Blackboard. Lessons start with a standard “introduce yourself” discussion board activity. Each lesson unit is in a folder with learning objectives, readings, lectures, assignments, and additional resources accessible within that folder [QM 2.2, 6.3, 6.4]. Please see the attached screenshots of the “lessons template page” and the “individual unit template page”.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Evidence 1: Faculty workshop satisfaction survey
We received high marks on the workshop satisfaction survey, which we present as evidence of the workshops’ success. Below is the percentage of participants who answered “agree” or “strongly agree” for each survey question. There were 47 participants.

1. The session enhanced my professional skills and knowledge on instructional development. -95.74%
2. The goal and objective of the session were met.- 97.87%
3. The content was accurate and current. - 100.00%
4. Resource information provided was adequate and useful. - 93.62%
5. The time allocated was adequate for the material presented. - 82.98%
6. The technology or audio-visuals used was appropriate for conveying the information. - 89.36%
7. The speaker(s) was effective in conveying the information. - 95.74%
8. Participants were given adequate opportunity to ask questions. - 93.62%
9. Overall, the session met my needs. - 91.49%
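Each percentage above corresponds to a whole number of “agree”/“strongly agree” responses out of the 47 participants. A minimal sketch of the arithmetic (the per-question response counts below are inferred from the reported percentages, not stated in the original survey):

```python
# Inferred counts of "agree"/"strongly agree" responses per question;
# the survey itself reports only percentages out of 47 participants.
counts = {1: 45, 2: 46, 3: 47, 4: 44, 5: 39, 6: 42, 7: 45, 8: 44, 9: 43}
total = 47

for q, n in sorted(counts.items()):
    pct = round(100 * n / total, 2)
    print(f"Q{q}: {n}/{total} = {pct}%")
```

For example, 45 of 47 respondents agreeing on question 1 yields 100 × 45 / 47 ≈ 95.74%, matching the figure above.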

Evidence 2: QM review Score
We conducted preliminary QM reviews (covering only the standards listed as 3-point items in the QM rubric) for the courses whose design and development ODL sponsored. The average QM score of these courses climbed over the past three semesters. Results for the courses developed in Summer 2013 are not yet available; we expect the average QM score to maintain its upward trend. Below are the average QM scores of the courses that finished development in each semester.

Summer 2012 - 51 out of 63 (80.95%)
Fall 2012 - 54 out of 63 (85.71%)
Spring 2013 - 55 out of 63 (87.30%)

How does this practice relate to pillars?: 

Faculty Satisfaction – As is evident in the faculty satisfaction survey, faculty who design and develop online courses prefer to have clear guidance on how to design an effective online course. We received very high faculty satisfaction scores when we incorporated QM standards into the workshops using various templates, checklists, and best-practice demonstrations.

Scale – As a nationally recognized quality rubric for online courses, Quality Matters (QM) is used by many universities and community colleges to control the quality of their online courses. The templates, checklists, and best-practice video presented in this submission can be easily implemented by other schools in their faculty workshops at little or no cost.

Learning Effectiveness & Student Satisfaction – Good online course design is the foundation of effective online course delivery and better student satisfaction. In that sense, ensuring the quality of online courses at the design and development stage is a necessary step toward learning effectiveness and better student satisfaction.

Equipment necessary to implement Effective Practice: 

Not applicable.

Estimate the probable costs associated with this practice: 

No or very low cost.

Other Comments: 

Please check the copyright of the attached Quality Matters rubrics before publishing them to the general public.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. E Shen, Asst. Dir., Instructional Development
Email this contact: 
eshen@campus.fsu.edu
Award Winner: 
2013 Sloan-C Effective Practice Award
Collection: 
Student-Generated Content
Author Information
Author(s): 
Allison P. Selby
Author(s): 
Julie Frieswyk
Institution(s) or Organization(s) Where EP Occurred: 
Kaplan University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

A virtual internship program forged international connections between a Peace Corps volunteer, a faculty member, and students at Kaplan University, School of Information Technology. Virtual internships and international partnerships provide high-impact experiential learning opportunities for students while giving non-governmental organizations (NGOs) a means to build capacity and cultural bridges. This type of program allows non-traditional adult students in particular to maintain their family responsibilities and continue their full-time jobs while working on overseas projects in an online capacity. The program has increased students’ confidence in their skill sets as they developed their assigned projects for the NGO. They also gained exposure to cultural diversity and international collaboration atypical of the average IT class.

Description of the Effective Practice
Description of the Effective Practice: 

Project iNext exemplified an institutional partnership between Kaplan University, Information Technology School and a Peace Corps Volunteer (PCV), Julie Frieswyk. Julie reached out to Kaplan University on behalf of her partner non-governmental organization (NGO), Pro-Business Nord (PBN), located in the Republic of Moldova. PBN is directly funded by United States Agency for International Development (USAID). One of the key goals of PBN was to develop a new social enterprise model for a sustainable Women Career Development Program in the northern part of Moldova.

Kaplan University’s virtual internship program was implemented to connect Information Technology students with the NGO. The partnership goals were to provide expert advice on updating older versions of the NGO’s website, to test server security, and to help develop a new website for PBN’s new social enterprise, ProBizNord, a regional Business Resource Center.

The partnership with Pro-Business Nord (PBN) in Moldova was led by Allison Selby, Kaplan Information Technology Faculty, and Julie Frieswyk, Peace Corps Volunteer. Frieswyk ensured that the internship project goals were aligned with the priorities of PBN and with Peace Corps goals. Selby ensured that the weekly outcomes were being met by the students and that the students were receiving the assets needed to complete their assigned tasks. This partnership was also important for the very practical concern of language translation. While the PBN team did speak English well, Frieswyk was also able to translate Russian to English as necessary.
The project provided excellent opportunities for students to apply their knowledge, skills and abilities in an authentic context. They were exposed to negotiating schedules, timeframes, project outcomes and clearly communicating the assets needed to progress to subsequent stages of the project. Students were able to participate in conversations that quickly became a mix of Russian and English, spanning multiple time zones, and developing materials for people they discovered they had much in common with. The exposure to cultural diversity, businesses and lifestyles was greatly appreciated by the students.

At the end of the ten-week experience, two students exceeded expectations and one student did not perform to expectations. Two fully functional websites were developed and met PBN’s requirements. The students applied new skills in site development, learned the process of client interaction and revision requests, and practiced final presentation skills. The third project, conducting security forensics, was never fully completed. Many factors could be attributed to this outcome: conducting security forensics as a team may be more effective, a mentor with strong expertise and practice in forensics would be an asset, and projections of some of the possible testing outcomes would have provided a stronger set of parameters for experimentation. The fact that one of these projects was not wholly successful was actually just as valuable to us as we continued to evaluate the program.

The overall outcome included engaged students with opportunities to gain authentic work experience and international exposure. PBN received considerable student-conducted training with the platform Wordpress, marketing skills including Search Engine Optimization (SEO), and practice in project planning and implementation. The Moldovan NGO gained exposure to more skills and up-to-date technology, building their own capacity, while continuing building cultural bridges through their experience with the Peace Corps. They also became co-educators of the students (Holland, 1997), while the students learned how to professionally interact, accept constructive criticism, and design for the clients’ aesthetic taste rather than their own.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

This international partnership involved a small sample of student participants, so our evidence is largely anecdotal, based on student and NGO team feedback. The students remarked that this was a unique international opportunity to learn the process of web development for clients, to work with international clients in reviewing risks and suggestions, and to experience real-world project management. The NGO loved being part of something innovative and learning more about our school system. Skills transfer and global understanding were repeated themes in the feedback and discussions.

As a high-impact experiential activity, the student and NGO partnership provides a type of global community-based opportunity for the students’ worldview and perception to transform (Cress, 2004). The NGO benefits from the partnership by gaining access to resources and networks (Ferman & Hill, 2004) while collaborating with the students to build social change. The “mutually beneficial agenda” (Holland, 1997), collaborative effort, and shared gains of knowledge and practice become a transformative relationship (Bushouse, 2005), which in turn provides a sense of purpose to motivate student engagement and learning (Colby & Sullivan, 2009).

How does this practice relate to pillars?: 

It relates to pillars due to the learning gains made by the students as evidenced by implementing and customizing the Wordpress platform, participating in professional dialogues with the partners, demonstrating project management skills to stay on task, and interacting with a culturally diverse team. The student survey feedback stated this experience was not something they could experience in a typical classroom and they gained confidence and increased abilities throughout the process. They completed the program knowing they did possess professional skills in an authentic context.

Virtual internship partnerships could involve studies in social entrepreneurship, micro-finance, marketing, business administration, and design. Virtual internships create problem-solving activities with the potential to build real-world skills such as collaborative problem-solving, technology proficiency, presentation skills, and a greater appreciation for intercultural diversity (Humphreys, 2009). The opportunity provides an international experience to students who may otherwise be limited by finances, work responsibilities, family obligations, or physical limitations. In addition, there is considerable cost savings compared to studying abroad for the same amount of time: a virtual internship program incurs only regular tuition fees, with no additional costs required of the student.
Students enjoyed the experience overall and loved the new addition to their resumes and credentials. We also learned a lot about how to support students more efficiently. This type of project benefits tremendously from thorough advance preparation. Using project charters to outline weekly outcomes and deliverables is very important, and defining the exact scope of the deliverables, the assets that may be needed, and the key stakeholders were all essential topics to clarify. Synchronous weekly team meetings with the clients over Skype gave the students a vested interest and motivation to succeed, and having the students train the clients in site maintenance gave them ownership of the process and pride in their proof of success. It was exciting, engaging, and could definitely be accomplished by other institutions with great success.

Equipment necessary to implement Effective Practice: 

The only aspect completely necessary is an internet connection and email. In our program, the students also used Captivate for creating videos to present the finished products and instructional materials for the clients. Jing would be a reasonable free alternative for short presentations under five minutes. The students also used Wordpress and installed the framework on the client server. The students used free themes for both Wordpress sites.

All other tools enhance the experience and few have costs associated with them. We recommend the following:

• Synchronous tools: Adobe Connect, Google Hangouts, Google chats, Skype
• Asynchronous tools: email, discussion board in LMS
• Reflective tools: Blog, journals, status reports

Estimate the probable costs associated with this practice: 

The only additional cost would be optional and would involve the use of Adobe Connect. All other resources were open source and we did not incur additional costs using them. The client already had server space and the students used free Wordpress themes. There was essentially no budget for the site so our costs were very minimal for this project.

References, supporting documents: 

Bushouse, B. K. (2005). Community Nonprofit Organizations and Service-Learning: Resource Constraints to Building Partnerships with Universities. Michigan Journal of Community Service Learning, 32-40.
Colby, A., & Sullivan, W. M. (2009). Strengthening the foundations of students’ excellence, integrity, and social contribution. Liberal Education, 22-29.
Cress, C. M. (2004). Critical thinking development in service-learning activities: Pedagogical implications for critical being and action. Inquiry: Critical Thinking Across the Disciplines, 87-93.
Cuban, S., & Anderson, J. B. (2007). Where’s the justice in service-learning? Institutionalizing service-learning from a social justice perspective at a Jesuit university. Equity & Excellence in Education, 144-155.
Ferman, B., & Hill, T. L. (2004). The challenges of agenda conflict in higher-education-community research partnerships: Views from the community side. Journal of Urban Affairs, 241-257.
Holland, B. (1997). Analyzing institutional commitment to service: A model of key organizational factors. Michigan Journal of Community Service Learning, 30-41.
Humphreys, D. (2009). College outcomes for work, life, and citizenship: Can we really do it all? Liberal Education, 14-21.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Allison Selby
Email this contact: 
aselby@kaplan.edu
Effective Practice Contact 2: 
Julie Frieswyk
Email contact 2: 
juliefrieswyk@gmail.com
Author Information
Author(s): 
Carol A. McQuiggan, D.Ed., Manager & Senior Instructional Designer, Penn State Harrisburg
Author(s): 
Laurence B. Boggess, Ph.D., Director of Faculty Development and Support, World Campus/Academic Outreach
Author(s): 
Brett Bixler, Ph.D., Lead Instructional Designer, IT Training Services
Author(s): 
Wendy Mahan, Ph.D., Senior Instructional Designer, The College of Health and Human Development
Institution(s) or Organization(s) Where EP Occurred: 
The Pennsylvania State University
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Faculty/Staff Engagement and Professional Development subcommittee is composed of faculty, faculty developers, and learning designers from throughout the University, representing multiple colleges, campuses, and support units. They collaboratively identify, complete, and disseminate projects that have the potential to promote excellence in online teaching and learning, to increase faculty interest in online teaching activities, and to pursue collaborative endeavors within and outside the university to continue to build a strong foundation for faculty engagement in online teaching. This unique, cross-campus, interdisciplinary, and multi-unit approach provides multiple perspectives and addresses common needs in providing quality online teaching and learning experiences.

Description of the Effective Practice
Description of the Effective Practice: 

The Faculty/Staff Engagement and Professional Development subcommittee sits within a Penn State Online structure composed of the Penn State Online Steering Committee and the Penn State Online Coordinating Council. Understanding the structure is important to putting this effective practice into the context of a large multi-campus university, while also understanding that this practice could be implemented in an institution of any size.

The Penn State Online Steering Committee serves as the governing body for Penn State Online, reporting to the Provost. The Steering Committee has strategic leadership responsibility for Penn State Online, serves as the policy board for the e-Learning Cooperative and the World Campus, and as the governing board for the Penn State Online Coordinating Council, through which it encourages effective cross-unit coordination of several key functions. These key functions include the effective use of course development resources, professional development, establishment of standards, and innovation and research.

The Penn State Online Coordinating Council reports to the Steering Committee. It includes representatives of the key central University units involved in e-learning—Teaching and Learning with Technology, Undergraduate Programs, University Libraries, and the World Campus—and college-based e-learning development and support units. It meets bimonthly to develop University-wide best practices, standards, and procedures that will facilitate the growth of high-quality e-learning at Penn State.

The Coordinating Council is responsible for identifying opportunities for collaboration, promoting the effective coordination of resources across organizational units to achieve synergy and create capacity to address strategic priorities, and developing common standards to guide work across units. In some cases, the Council responds to requests from the Steering Committee; in other instances, the Council identifies issues and proposes remedies to the Steering Committee; and in other situations, the Council addresses operational issues and simply reports the results to the Steering Committee.

The Faculty/Staff Engagement and Professional Development subcommittee reports to the Coordinating Council at its bimonthly meetings. Its all-volunteer membership of faculty, faculty developers, and learning designers includes representation from six campuses, nine colleges, and three support units, all with some responsibility for and/or interest in online teaching and learning. It is co-chaired by the director of World Campus Faculty Development, maintaining a direct relationship with Penn State’s online campus.

Project ideas trickle down from the Steering Committee and the Coordinating Council, and also trickle up from the needs and practices of the subcommittee members and the faculty and administrators with whom they work. Using a team approach, the projects are designed for wide use and adaptability across the University and beyond.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Resources developed by the Faculty/Staff Engagement and Professional Development subcommittee include the Penn State Quality Assurance e-Learning Design Standards; Hiring Guidelines for Online Program Lead Faculty, Online Course Authors, and Online Course Instructors; the Faculty Self-Assessment for Online Teaching; Peer Review for Hybrid Courses; and Faculty Competencies for Online Teaching. All are accessed via the Weblearning site (http://weblearning.psu.edu) by selecting the “Resources” tab for “Penn State Online Resources” (http://weblearning.psu.edu/resources/penn-state-online-resources/).

Penn State Online has adopted the Quality Assurance e-Learning Design Standards (http://weblearning.psu.edu/resources/penn-state-online-resources/quality...), providing a measure of quality assurance for online courses in order to best serve the e-learning needs of our students. For each of the twelve standards there is a link to a short description of the standard, a list of the required evidence that the standard has been met, suggested best practices, and resources to learn more.

The Hiring Guidelines (http://weblearning.psu.edu/resources/penn-state-online-resources/hiring-...) are used to help guide the hiring process for online program lead faculty, course authors, and course instructors. The subcommittee is currently developing suggested interview questions to accompany each guideline document.

The Faculty Self-Assessment for Online Teaching (https://weblearning.psu.edu/FacultySelfAssessment/) tool was packaged as open source and licensed under Creative Commons in order to share it as broadly as possible. The tool was the result of a literature review, focus group input, and usability testing, and was vetted at a well-attended Sloan-C Conference presentation. To date, the tool has been shared with over thirty colleagues representing academies, community colleges, state colleges, and universities throughout the United States. It has also been shared with three doctoral students for potential use in their dissertations. We hope that all of our tools can be shared as broadly.

The Faculty Competencies for Online Teaching (http://weblearning.psu.edu/resources/penn-state-online-resources/faculty...) were derived from research conducted by a team at Penn State’s World Campus, which the subcommittee used to develop a document detailing those competencies alongside additional guidelines, examples, and best practices. They provide faculty and administrators with a better understanding of the instructional requirements of online teaching.

The Peer Review for Hybrid Courses (https://www.e-education.psu.edu/facdev/hybridpeerreview) is based on the “Seven Principles for Good Practice in Undergraduate Education,” a summary of fifty years of higher education research that addressed good teaching and learning practices. This process was designed, implemented, and assessed by the subcommittee based on a need shared by the campus learning designers and faculty.

The Web Learning @ Penn State (http://weblearning.psu.edu) site continues to evolve and grow, with a newly revised site just launched on August 29th. A member of the Faculty/Staff Engagement and Professional Development subcommittee is a contact for each of the site’s webpages.

Together, these resources, collaboratively designed and shared broadly, have provided access to tools that increase the quality of our online courses. They identify and share best practices for online teaching and learning, identify and share competencies for online teaching success, and establish and share guidelines for creating quality online courses and hiring qualified instructors.

How does this practice relate to pillars?: 

Learning Effectiveness: The effective practices supported by the subcommittee in the area of learning effectiveness are most evident in the Faculty Self-Assessment for Online Teaching, which stresses the skills needed by online instructors to be effective online teachers. This will be enhanced when the redesign of the tool is completed to align with the Faculty Competencies for Online Teaching. The Competencies also provide an opportunity for faculty professional development, as do the Peer Review tools. Two current projects, the New Instructor Orientation to Online Teaching Checklist, and the New Faculty Manual, will contribute to faculty development and the core elements of quality online learning.

Scale: Some of our newest tools contribute to the scale pillar. The Checklist for Administrator Review and Approval of Online Courses and the Course Revision Worksheet both contribute to continuous improvement. The Checklist for Administrator Review builds administrative awareness of the scope of faculty authoring of online courses, and how new courses fit within a program of study. It also addresses the need for a course to not be dependent upon one faculty member to teach, addressing the need for faculty capacity. The Course Revision Worksheet creates more awareness as to the personnel who need to be involved in a revision, and the overall scope of work required.

Access: The “Resources” (http://weblearning.psu.edu/resources/penn-state-online-resources/) that the Faculty/Staff Engagement and Professional Development subcommittee has built and provided on the Web Learning @ Penn State (http://weblearning.psu.edu) site give the Penn State University community access to resources that promote best practices in online learning and set standards for excellence in hiring online faculty. Because it is a public site, access also extends beyond the University. By providing these tools broadly, all e-learning design units at Penn State have access to the same tools, which overlap in their communication of quality standards, so that students receive online courses designed within the same quality framework. We hope to learn more about how these units are using the tools and, even more important, how the tools are shaping design decisions. Then we would like to learn how those design decisions affect student learning and students’ access to quality online courses.

Faculty Satisfaction: Our subcommittee and the broader committee structure we are nested in serves the “support” and “institutional study/research” aspects of the institutional factors related to faculty satisfaction. We provide all of the institutional supports in a unique, cross-campus, interdisciplinary, and multi-unit approach. We provide opportunities for innovation by asking online instructors to engage in self-improvement. As more and more instructors teach online and more administrators are responsible for hiring and developing them, we need a way to ensure self-learners have optimal materials at their disposal for just-in-time learning. The tools we create and disseminate do this.

Equipment necessary to implement Effective Practice: 

There is no special equipment necessary to implement this Effective Practice. The Faculty Engagement subcommittee uses the resources already available within their various units. The Web Learning @ Penn State (http://weblearning.psu.edu) site is used for broad dissemination.

Estimate the probable costs associated with this practice: 

There are no direct costs associated with this practice. The indirect costs are the time the subcommittee members spend on specific projects, but they are offset by the opportunities they provide for their own professional development and by the tools they create that afford new efficiencies and quality processes. No one person or unit could have created these tools alone, but by collaborating across units, common needs are being met with resounding success.

References, supporting documents: 

As the transition is made to our newly designed website (http://weblearning.psu.edu), we plan to gather web analytics on its traffic. We are also planning to survey the various Penn State eLearning design units as to their use of the various tools, both to increase awareness and to learn of implementations. Within implementations, we hope to dig more deeply to learn about the effectiveness and transformative possibilities of the various tools. If possible, we would even like to link design and teaching/learning decisions made based on tool use, and improvements in student learning. We will identify research partners and submit proposals to Penn State’s Center for Online Innovation in Learning (COIL).

A number of our tools support and/or are aligned with the research-based "Competencies for Online Teaching Success" (http://sites.psu.edu/wcfacdev/2013/05/15/competencies-for-online-teachin...).

Other Comments: 

New projects take shape within the subcommittee and extend into the University and beyond, just as the Faculty Self-Assessment for Online Teaching tool did. This is further evidence of the subcommittee's proven track record of innovating, honing best practices, and then disseminating them generously, traits that Sloan-C supports.

Projects that have been completed and will be added to the Web Learning site very soon:
1. The Checklist for Administrator Review and Approval of Online Courses was created to guide an administrator through an initial review of an online course developed by a faculty member from their unit in collaboration with a learning designer. It is yet another tool to ensure quality review, in this case by program administrators. It gives administrators a checklist of items to review and creates a feedback loop with the learning designer and faculty author.
2. The Course Revision Worksheet is intended for use by course development teams to communicate the reasons for a course revision, the specific course items in need of revision, the percentage of revision needed for each course item, the personnel who need to be involved in those revisions, and the total percentage of effort that will be required. This information can then be used to assign the appropriate resources to the course revision project. This is also a learning document in that it creates an awareness of the needs for revision and the effort required by a potential team of people.

Our work continues on these current projects:
1. Asteria - This will be a decision support tool for faculty to use to match pedagogical purpose with an educational technology. Two approaches for use are planned. Users may approach this tool with an unfocused search in which they simply want to explore different tools. There will also be a focused approach available in which faculty who know what they want to do pedagogically can search for the appropriate tool. The intent is to have the focus on the user’s pedagogical purpose, and not be tool-driven.
2. An update of the Faculty Self-Assessment for Online Teaching tool - The current tool will be revised to align with the research-based Faculty Competencies for Online Teaching.
3. New Instructor Orientation to Online Teaching Checklist - A number of Penn State’s eLearning units offer a new instructor orientation. They are comparing their orientations in order to develop a checklist of core elements that different units can adapt for their own use; new units developing their own orientations will also be able to use it. The checklist's end users will most likely be learning designers and faculty developers.
4. New Faculty Manual - The Faculty Manual will provide faculty new to online teaching with a comprehensive manual to which they can refer as they are teaching online. The team will use the World Campus faculty manual to create a common manual for faculty that can be adapted by individual units.
5. Online Mentoring Program Pilot - Through a partnership with the Schreyer Institute for Teaching Excellence, a mentoring program is being developed to provide instructional support for those teaching online and to create opportunities for networking with others teaching online. The team is reading about Community of Mentoring practices and is applying a research lens to the project as it moves forward.
6. New Online Faculty Interview Questions based on the Hiring Guidelines - This will be a logical companion piece to the Hiring Guidelines already available on the Web Learning site.
7. Best Practices Examples - Listening to faculty needs, the subcommittee is collecting examples of best practices (instructor introduction, discussion rubric, team project structure, various learning activities, providing effective feedback, flipped classroom design, etc.) and plans to host them on the Web Learning site for all to access and use.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Carol A. McQuiggan
Email this contact: 
cam240@psu.edu
Effective Practice Contact 2: 
Larry Boggess
Email contact 2: 
lbb150@psu.edu