
Unlocking Possibilities: The Latest Research Findings in eLearning

This webinar presents the latest research findings on the career advancement of graduates from online leadership preparation programs. The study focuses on the impact of these programs in terms of obtaining principal certification, entering campus administration, and becoming head principals.


Presentation Transcript


  1. Unlocking Possibilities: The Latest Research Findings in eLearning December 15, 2015 Sponsored by: #APCommons

  2. Before We Begin • We are recording the webinar; the webinar archive and slides will be forwarded tomorrow. • We are using ON24. • Q&A: please enter questions in the text field at the bottom of the Q&A window. • Join the conversation on Twitter: #APCommons

  3. Welcome from Academic Partnerships (AP) Jennifer Scott, SVP, Academic Services & Products, Academic Partnerships, Jennifer.Scott@academicpartnerships.com • AP’s mission is to help universities increase access to high-quality post-secondary education. • Consult with faculty for online course conversions • Consult with administrators on best-practice processes • Conduct marketing and recruitment on the university’s behalf • Support students from initial inquiry to graduation AP succeeds only when universities and students succeed.

  4. Faculty Grant Program • Faculty eCommons is the site for AP’s Faculty Grant program, available to faculty teaching within AP-supported programs. • The presenters today received research grants from AP. • Since 2012, AP has awarded grants to over 90 faculty at 20+ partner universities. www.FacultyeCommons.com • Resources to assist online faculty • Ongoing professional development • Discipline-specific resources • How-to guides Twitter: @APCommons LinkedIn: AP Faculty eCommons Pinterest: FacultyCommons

  5. Today’s Speakers • Bradley Davis, PhD, University of Texas at Arlington, bwdavis@uta.edu • Karen Manning, University of Cincinnati, manninkn@ucmail.uc.edu • Brinda McKinney, PhD, Arkansas State University, bmckinney@astate.edu • Angela Robbins, University of Cincinnati, robbinae@ucmail.uc.edu December 15, 2015

  6. Exploring the Career Paths of Leadership Preparation Program Graduates Bradley W. Davis, Ph.D. Assistant Professor Department of Educational Leadership and Policy Studies

  7. Motivation • Participation in online learning grows by nearly 600,000 students annually (Allen and Seaman, 2013). • Prior to 2007, enrollment in educational administration master's programs at Texas public universities increased roughly 2% annually (Texas Higher Education Coordinating Board, 2015). • In 2008, growth over the previous year hit 32.1%. • Thin knowledge base with regard to programs and delivery models

  8. Purpose • With this change in the Texas landscape, there was a sea change in institutional representation with regard to productivity. • With that change came a responsibility to evaluate. • As such, the purpose of our study was to quantify the impact of online leadership preparation as measured through the career advancement of its graduates.

  9. Research Questions Regarding the accelerated, online principalship program at the University of Texas at Arlington: • What proportion of program graduates obtain principal certification? • What proportion of program graduates enter campus administration? • What proportion of program graduates become head principals?

  10. Data & Method • Looked at the accelerated, online principalship program at the University of Texas at Arlington (UTA). • Utilized institutional data from UTA and administrative data from the Texas Education Agency (TEA). • As of July 2015, 2,555 students had entered the program; 1,310 had graduated, and an additional 810 were still active in the program. • Of the 1,310 graduates, 843 lived in Texas and graduated before the most recent update to the certification data. • Methods were a mixture of descriptive statistics and survival analysis.

  11. Findings 527 of the 843 graduates, or 62.51%, had obtained principal certification (RQ 1). Certifications were obtained between January 2011 and December 2014. • This seemingly low percentage lacks a basis for comparison. Some potential explanations: • Private schools • Other administrative positions • Sense of urgency • Esteem, salary, and options • Normal or even exemplary?

  12. Findings (cont.) 18.98% of graduates with principal certification have since been employed as either an assistant or head principal (RQ 2). Further, 2.47% have already ascended to the head principalship (RQ 3). • With regard to RQ 3, the probability of becoming a head principal in Texas does not peak until six years after certification (Davis, Gooden, & Bowers, 2015). The earliest graduates of the program under study completed it in 2011. • To add nuance, we wanted to account for time since graduation. To do so, we pivoted to a person-period data structure and employed simple survival analysis (a sketch of this setup follows below): • 17.70% of graduates enter administrative positions within one year of obtaining certification. • 13.25% do so in the second year after certification.
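For readers unfamiliar with the person-period setup mentioned above, the sketch below (Python/pandas) shows one way to build such a table and compute a discrete-time hazard, i.e., the share of still-at-risk graduates entering administration in each year after certification. The column names, toy records, and observation window are invented for illustration; they are not drawn from the UTA or TEA data.

```python
# Minimal sketch of a person-period data structure and discrete-time hazard.
# Columns (grad_id, cert_year, admin_entry_year) and values are hypothetical.
import pandas as pd

graduates = pd.DataFrame({
    "grad_id": [1, 2, 3],
    "cert_year": [2011, 2012, 2013],
    "admin_entry_year": [2012, None, 2014],  # None = not yet in administration
})

rows = []
last_observed = 2015  # assumed end of the observation window
for _, g in graduates.iterrows():
    entry = g["admin_entry_year"]
    for year in range(int(g["cert_year"]) + 1, last_observed + 1):
        event = (entry == year)
        rows.append({"grad_id": g["grad_id"],
                     "period": year - int(g["cert_year"]),  # years since certification
                     "entered_admin": int(event)})
        if event:
            break  # stop contributing person-period rows once the event occurs

person_period = pd.DataFrame(rows)

# Discrete-time hazard: among graduates still at risk in each period,
# the proportion who enter administration during that period.
hazard = person_period.groupby("period")["entered_admin"].mean()
print(hazard)
```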

  13. Implications & Conclusion • Troubling lack of context for our findings, juxtaposed against increasing calls from the policy world to capture the impact that leadership preparation is having • Alliance to Reform Education Leadership (AREL) report (Briggs, Cheney, Davis, & Moll, 2013) • University Council for Educational Administration (UCEA) framework (Orr, Young, & Rorrer, 2010) • Much more is needed to address gaps in knowledge

  14. References • Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Newbury, MA: Sloan Consortium. • Briggs, K., Cheney, G. R., Davis, J., & Moll, K. (2013). Operating in the dark: What outdated state policies and data gaps mean for effective school leadership. George W. Bush Institute. Retrieved from http://www.bushcenter.org/sites/default/files/GWBI-operating%20in%20theDark_v17_web.pdf.pdf • Davis, B. W., Gooden, M. A., & Bowers, A. J. (2015, November). Traversing transcultural spaces: An event history analysis of teachers’ pathways to the principalship. Paper presented at the annual conference of the University Council for Educational Administration, San Diego, CA. • Orr, M. T., Young, M. D., & Rorrer, A. (2010). Developing evaluation evidence: A formative and summative evaluation planner for educational leadership preparation programs. Salt Lake City, UT: UCEA Center for the Evaluation of Educational Leadership Preparation and Practice. • Texas Higher Education Coordinating Board. (2015). Degrees awarded - by institution, level, curriculum area. Available from the Texas Higher Education Coordinating Board web site: http://reports.thecb.state.tx.us/approot/dwprodrpt/majmenu.htm

  15. The Yeas and Nays of Online Threaded Discussions Brinda McKinney, Ph.D., MSN, RN

  16. Presentation Objectives After attending this webinar, the attendee will … 1. Identify the potential impact of threaded discussion activities on the mastery of unit content in science- and theory-based courses. 2. Consider the possibility of utilizing threaded discussion activities in their current online courses.

  17. Purpose The purpose of this study was to explore the relationship between threaded discussion board activity and examination scores in science-based and theory-based online nursing courses.

  18. Methodology • This study was a retrospective quantitative inquiry. • Population: All students enrolled in two specific online RN-BSN courses from January 2014 through May 2015. • Courses involved: One science-based course (High Acuity) and one theory-based course (Nursing Management) • Sample size: 125 for science-based course; 286 for theory-based course • Scores considered: Threaded discussion activities, weekly quiz/exam, and overall scores

  19. Hypotheses The study tested the following hypotheses: • Science-based course, null hypothesis (H0: μ1 = μ2): there is no difference in examination scores between students who do and do not actively engage in threaded discussion activities. • Science-based course, alternative hypothesis (H1: μ1 ≠ μ2): there is a difference in examination scores between students who do and do not actively engage in threaded discussion activities. • Theory-based course, null hypothesis (H0: μ1 = μ2): there is no difference in examination scores between students who do and do not actively engage in threaded discussion activities. • Theory-based course, alternative hypothesis (H1: μ1 ≠ μ2): there is a difference in examination scores between students who do and do not actively engage in threaded discussion activities (an illustrative test of this form is sketched below).
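The slides do not state which statistical test the study used to compare the two groups; a Welch two-sample t-test is one common way to test hypotheses of the H0: μ1 = μ2 form. The sketch below uses made-up exam scores purely to illustrate the mechanics.

```python
# Illustrative two-sample t-test of H0: mu1 = mu2 versus H1: mu1 != mu2.
# Scores are fabricated for the example; the study's actual test is not specified.
from scipy import stats

# Exam scores for students who did / did not actively engage in threaded discussions
engaged_scores = [88, 92, 79, 85, 90, 84, 91]
not_engaged_scores = [81, 77, 85, 80, 78, 83]

t_stat, p_value = stats.ttest_ind(engaged_scores, not_engaged_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Reject H0 at alpha = 0.05 only if p_value < 0.05
```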

  20. Overall Descriptive Statistics

  21. High Acuity Nursing

  22. Nursing Management

  23. Limitations • Small sample size • Study was conducted at one university in Arkansas. • Does not control for the quality of the discussion board assignment. • Limited to the two specific courses

  24. Conclusions • Weekly threaded discussion scores were not significantly related to mastery of that week's content. • Threaded discussion board activity as a whole did have an impact on overall course grades.

  25. Implications • The design of the threaded discussion matters. • Threaded discussion activities should be tested to determine whether they measure what you want to measure. • Threaded discussion activities may be used successfully in science-based and theory-based nursing courses.

  26. Bibliography • 1. Blackmon, S. J. (2012). Outcomes of chat and discussion board use in online learning: A research synthesis. Journal of Educators Online, 9(2), 1-19. • 2. Clarke, L. W., & Kinne, L. (2012). More than words: Investigating the format of asynchronous discussions as threaded discussions or blogs. Journal of Digital Learning in Teacher Education (International Society for Technology in Education), 29(1), 4-13. • 3. Shuyan, W. (2011). Promoting student's online engagement with communication tools. Journal of Educational Technology Development & Exchange, 4(1), 81-90. • 4. Duncan, K., Kenworthy, A., & McNamara, R. (2012). The effect of synchronous and asynchronous participation on students' performance in online accounting courses. Accounting Education: An International Journal, 21(4), 431-449. • 5. Wei-Ying, H., Manfen, C., & Hsing-Wen, H. (2013). Assessing online discussions: Adoption of critical thinking as a grading criterion. International Journal of Technology, Knowledge & Society, 9(3), 15-25. • 6. Spector, M. J. (2005). Time demands in online instruction. Distance Education, 26(1), 5-27.

  27. Bibliography continued • 7. Daspit, J. J., & D'Souza, E. E. (2012). Using the community of inquiry framework to introduce wiki environments in blended-learning pedagogies: Evidence from a business capstone course. Academy of Management Learning & Education, 11(4), 666-683. doi:10.5465/amle.2010.0154 • 8. McCracken, J., Sunah, C., Sharif, A., Wilson, B., Miller, J., Scalzo, D., & Crowley, C. (2011). Articulating assessment design practice for online courses and programs - Cases in assessment strategy design and development. Proceedings of the International Conference on E-Learning, 226-235. • 9. Kang, M. M., & Im, T. T. (2013). Factors of learner-instructor interaction which predict perceived learning outcomes in online learning environment. Journal of Computer Assisted Learning, 29(3), 292-301. doi:10.1111/jcal.12005 • 10. Makri, K., Papanikolaou, K., Tsakiri, A., & Karkanis, S. (2014). Blending the community of inquiry framework with learning by design: Towards a synthesis for blended learning in teacher training. Electronic Journal of E-Learning, 12(2), 183-194. • 11. Wei-Ying, H., Manfen, W. C., & Hsing-Wen, H. (2013). Assessing online discussion: Adoption of critical thinking as a grading criterion. International Journal of Technology, Knowledge and Society, 9(3), 15-25. • 12. Lai, K. (2012). Assessing participation skills: Online discussions with peers. Assessment & Evaluation in Higher Education, 37(8), 933-947. doi:10.1080/02602938.2011.590878

  28. Unlocking Possibilities: "Increasing Student Engagement and Positive Learning Outcomes in Online Problem Based Learning Through the Use of On Demand Video Feedback" Karen Manning, MBA, Assistant Marketing Professor (Annual Adjunct), Carl H. Lindner College of Business, University of Cincinnati; Angela Robbins, MSE, MS, Senior Instructional Designer, Carl H. Lindner College of Business, University of Cincinnati

  29. Student Engagement Challenges in the Online Environment: Prior Research • Simulating the interaction and daily dynamics of the F2F classroom environment continues to be one of the challenges for online education, from classroom discussions and problem-based learning group interactions to face-to-face instructor feedback (Heerma & Rogers, 2001; Revere, 2003). • Another challenge is developing social connections between faculty and students. Strong relationships between faculty and students have consistently been viewed as a primary factor in student success and satisfaction (Fabry, 2009).

  30. Solutions: Prior Research • One method of developing stronger connections between faculty and students is to have individual faculty develop personal video content that can be integrated into asynchronous online courses (Knee, Musgrove, & Musgrove, 2000; National Teacher Training Institute, 2006). • Kaltura is one example of an open source video tool that can be used to accomplish this goal. • Evidence suggests that the combination of written and audio feedback can enhance student satisfaction measures in the online classroom environment versus written feedback alone (Dias & Trumpy, 2014), due in part to positive perceptions of instructors' engagement and "social presence."

  31. Purpose of Study • Examine personalized instructor feedback on a problem-based learning assignment • Compare motivational variables • The study will evaluate personalized instructor feedback delivered as Kaltura media plus written feedback versus personalized instructor feedback delivered as written feedback only.

  32. Significance This research will continue to contribute to the development of pedagogy by investigating the impact of on-demand personalized video feedback as a way of enhancing the student learning experience in a problem-based online learning environment.

  33. Methodology • This research study was conducted with students in the Foundations of Marketing class (MKTG 7000) which is an introductory 7-week marketing class for the University of Cincinnati online MBA program. • Before the start of the semester, students were randomly assigned to Team Project Groups using the Blackboard random team assignment tool. • After groups were established, simple random sampling was used to alternately select test versus control groups. • Students were provided feedback on each of 3 submissions for a team-based marketing plan project. • Assignment feedback began on November 9th and was completed on December 5th.
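As a rough illustration of the assignment logic described on the slide above (random team formation followed by alternating selection of test and control groups), the sketch below mimics that process outside of Blackboard. Student names, team size, and the random seed are invented; the actual assignment was done with Blackboard's random team assignment tool.

```python
# Hedged sketch of random team formation and alternating test/control selection.
# This does not use any Blackboard API; it only illustrates the sampling idea.
import random

random.seed(42)  # for a reproducible illustration

students = [f"student_{i:02d}" for i in range(1, 21)]
random.shuffle(students)

team_size = 4  # assumed team size
teams = [students[i:i + team_size] for i in range(0, len(students), team_size)]

# Randomly order the teams, then alternate test / control assignment
random.shuffle(teams)
conditions = {}
for index, team in enumerate(teams):
    condition = "test (video + written feedback)" if index % 2 == 0 else "control (written only)"
    conditions.setdefault(condition, []).append(team)

for condition, assigned_teams in conditions.items():
    print(condition, assigned_teams)
```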

  34. Methodology • Once final feedback was provided, a Blackboard Announcement was posted giving all students two choices for 2.5 extra credit points*: • Complete a 7-minute survey about the class; or • Submit a one-page review of a current events marketing article. • There was a 73% response rate on the survey. *These options were also presented in the syllabus for the course, available at the beginning of the semester.

  35. Sample Experiment: Control (Non-Intervention Condition) • Students were provided feedback on each of 3 submissions for a team-based marketing plan project. • Feedback was provided through the Blackboard on-line grading system utilizing a grading rubric developed for each section of the assignment. • Written comments were provided as needed using a standardized format to ensure consistency and objectivity in the feedback process. • The control condition was consistent with the process used in providing feedback to students in the same course over the last 3 semesters.

  36. Sample Experiment: Test (Intervention Condition) • Students received the same type of written feedback as the control group. • Additionally, students received feedback through a video recording tool (Kaltura) available through Blackboard. These videos were provided to the student teams through Blackboard and were available on demand at the same time the written feedback was provided. • The video feedback added no new feedback content; it was simply the instructor reviewing the written feedback using screen sharing and webcam capture. [Slide images: Video Feedback, Written Feedback]

  37. Assessment • Evaluated existing measures: RAIQDC, CLASSE, SLEQ, OSE • Determined that the research need was content-dependent; as a result, we developed a proprietary scale to assess results based on the objectives of our study. • Incorporated content from existing measures and scales.

  38. Next Steps • Analyze results (1st quarter 2016) • Prepare and deliver final report (March 2016) • Results will be available at www.FacultyeCommons.com

  39. Today’s Speakers • Bradley Davis, PhD, University of Texas at Arlington, bwdavis@uta.edu • Karen Manning, University of Cincinnati, manninkn@ucmail.uc.edu • Brinda McKinney, PhD, Arkansas State University, bmckinney@astate.edu • Angela Robbins, University of Cincinnati, robbinae@ucmail.uc.edu December 15, 2015

  40. Thank you for your time. We hope you learned something that helps you. Please contact norma.hansen@academicpartnerships.com or the presenters directly with any questions.
