Create a 5- or 6-Slide Narrated PowerPoint Presentation


Create a 5- or 6-slide narrated PowerPoint that presents a comprehensive Change Implementation and Management Plan for Sturgeon Point Productions, an organization that produces and markets educational video content and provides interactive web features (historic interactive timelines, set photos, interactive educational games, fast facts and flashcards, quizzes and exams). Develop a Change Implementation and Management Plan that includes:

  • an executive summary of issues affecting the organization;
  • a description of proposed changes and justifications for them;
  • details about the type and scope of proposed changes;
  • identification of stakeholders impacted;
  • identification of a change management team by title/role;
  • a communication plan for the changes; and
  • risk mitigation plans.

Produce a 5–6 minute narrated presentation (5–6 slides) and a written plan.

Paper for the Above Instructions

Executive Summary

Sturgeon Point Productions is an educational video producer that supplements each title with interactive web features (timelines, set photos, games, flashcards, quizzes). Current challenges include an inconsistent user experience across titles, low adoption of interactive content among instructors, limited analytics for measuring learner outcomes, and a fragmented internship onboarding process that undermines workforce readiness. This plan proposes a phased change program to standardize the web UX, expand pedagogically aligned interactive features, integrate learning analytics, and formalize internship workflows. These interventions will increase engagement, improve measurable learning outcomes, and raise operational efficiency (Kotter, 1996; Hiatt, 2006).

Proposed Changes

  1. Establish a standardized UX/UI framework and content template for all academic video pages to ensure consistent navigation and responsive design.
  2. Expand interactive learning modules (microgames, adaptive quizzes, flashcard decks) tied to the learning objectives in each video.
  3. Deploy a learning analytics dashboard to capture engagement, assessment scores, and completion rates.
  4. Formalize the internship program with clear role descriptions, prerequisite checklists, and mentor assignments integrated into the website.
  5. Introduce A/B testing and iterative UX research to continuously refine features (Nielsen, 2012; Garrett, 2010).
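
For concreteness, the sketch below shows one way the standardized content template in proposed change 1 could be modeled, written as a TypeScript content model. Every interface and field name here is an assumption for illustration, not an existing Sturgeon Point Productions schema.

```typescript
// Minimal sketch of a standardized academic video page template (assumed shape).
interface InteractiveModule {
  kind: "timeline" | "microgame" | "flashcards" | "quiz";
  title: string;
  learningObjectiveIds: string[]; // ties each interactive to stated learning objectives
}

interface AcademicVideoPage {
  titleId: string;
  videoUrl: string;
  summary: string;
  fastFacts: string[];
  modules: InteractiveModule[];
  accessibility: {
    captionsAvailable: boolean;
    transcriptUrl?: string;
  };
}

// Example instance a templated page would render from (placeholder data).
const samplePage: AcademicVideoPage = {
  titleId: "hist-101-industrial-revolution",
  videoUrl: "https://example.com/video/hist-101",
  summary: "Overview of the Industrial Revolution with primary-source imagery.",
  fastFacts: ["Began in Britain in the late 18th century."],
  modules: [
    { kind: "quiz", title: "Checkpoint quiz", learningObjectiveIds: ["LO1", "LO2"] },
    { kind: "timeline", title: "Key inventions timeline", learningObjectiveIds: ["LO1"] },
  ],
  accessibility: { captionsAvailable: true },
};

console.log(samplePage.modules.length); // 2
```

The design point worth noting is that each interactive module carries the learning-objective IDs it supports, which is what would later let the analytics dashboard (proposed change 3) report engagement against stated objectives.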

Justification and Expected Impact

Improved UX consistency reduces cognitive load and abandonment (Norman, 2013; ISO 9241-11, 1998). Pedagogically aligned interactive modules improve retention and transfer of learning (Mayer, 2009; Clark & Mayer, 2016). Analytics will enable data-driven content improvements and demonstrate value to institutions, increasing renewals and licensing revenue (Prosci, 2020). A structured internship program strengthens talent development and produces higher-quality production support, lowering turnover and training costs (Fernandez & Rainey, 2006). Overall, these changes align with best practices for learner-centered digital design and organizational change management (Kotter, 1996; Hiatt, 2006).

Type and Scope of Changes

Type: Organizational (processes and roles), Technical (UX templates, analytics), and Educational (instructional design of interactives). Scope: Company-wide, covering all academic video pages over a 9- to 12-month phased rollout.

  • Phase 1 (Months 0–3): UX/UI standards, pilot analytics on three titles, and internship workflow redesign.
  • Phase 2 (Months 4–8): Scale templated pages and interactive modules to 50% of the catalog, full analytics deployment, and mentor program launch.
  • Phase 3 (Months 9–12): Complete rollout, A/B testing, training for staff and partner instructors, and program evaluation.

Stakeholders Impacted

  • Internal: Web production team, instructional designers, editors, producers, HR/internship coordinators, IT and analytics staff, and executive leadership.
  • External: University instructors, students, licensing partners, and accreditation stakeholders.

Each group experiences distinct benefits and transition needs; students and instructors gain better learning tools, while internal staff require training and role clarity (Klein & Knight, 2005).

Change Management Team (by title/role)

  • Change Sponsor: Chief Content Officer (executive backing and resource allocation)
  • Program Lead: Director of Web Production (day-to-day program management)
  • Instructional Design Lead: Senior Instructional Designer (learning-alignment of interactives)
  • UX Lead: Lead Front-End Developer / UX Designer (templates, accessibility)
  • Analytics Lead: Data Analyst (dashboard design, metrics)
  • Internship Coordinator: HR Specialist (onboarding, mentoring)
  • Communications Lead: Marketing Manager (internal/external messaging)
  • Quality Assurance: QA Lead (testing, accessibility compliance)

Communication Plan

Clear, frequent communication is critical. The Communications Lead will implement a communication cadence: weekly sprint updates to internal teams, monthly executive status reports, and a stakeholder newsletter for instructors and partners every two months. Launch-specific activities include webinars for instructors demonstrating the new tools, a public release blog post, and short tutorial videos embedded on each title page. Internal training sessions and an FAQ hub will address staff questions and reduce resistance (Kotter, 1996; Prosci, 2020).

Risk Mitigation Plans

Key anticipated risks and mitigations:

  • Technical integration failures — mitigate with phased pilots, API contract tests, and rollback plans, in line with DevOps best practice; reserve a contingency budget (see the contract-test sketch after this list).
  • User resistance or low adoption — mitigate with instructor co-design sessions, early adopters program, and in-platform onboarding nudges (Mayer, 2009).
  • Analytics privacy and compliance risks — mitigate with data minimization, consent mechanisms, and legal review (GDPR/FERPA considerations where applicable).
  • Resource constraints — mitigate by prioritizing high-impact titles first and leveraging interns under mentor supervision for content tagging and QA.
  • Project scope creep — mitigate via a strict product roadmap, change-request governance, and monthly steering committee reviews (Hiatt, 2006).
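
As a concrete illustration of the first mitigation above, a minimal API contract check is sketched below in TypeScript (it assumes Node 18+ for the built-in fetch). The endpoint path and response fields are hypothetical placeholders for whatever interface the analytics dashboard would actually consume.

```typescript
// Minimal API contract check for a hypothetical analytics endpoint the
// dashboard depends on; run with ts-node or compile with tsc.
import assert from "node:assert";

// Fields the dashboard contract expects; names are illustrative only.
interface EngagementSummary {
  titleId: string;
  completionRate: number;   // 0–1
  avgSessionSeconds: number;
  quizMeanScore: number;    // 0–100
}

async function checkEngagementContract(baseUrl: string, titleId: string): Promise<void> {
  const res = await fetch(`${baseUrl}/api/v1/engagement/${titleId}`);
  assert.strictEqual(res.status, 200, "endpoint should return 200");

  const body = (await res.json()) as Partial<EngagementSummary>;
  // Verify the response still carries every field the dashboard reads.
  assert.strictEqual(typeof body.titleId, "string");
  assert.strictEqual(typeof body.completionRate, "number");
  assert.strictEqual(typeof body.avgSessionSeconds, "number");
  assert.strictEqual(typeof body.quizMeanScore, "number");
  assert.ok(body.completionRate! >= 0 && body.completionRate! <= 1, "completionRate in [0, 1]");
}

// Example usage against a staging environment (URL is a placeholder).
checkEngagementContract("https://staging.example.com", "title-001")
  .then(() => console.log("contract check passed"))
  .catch((err) => {
    console.error("contract check failed:", err.message);
    process.exitCode = 1;
  });
```

Running a check like this in the pilot phase, before each scale-up, is one pragmatic way to catch integration breakage early and trigger the rollback plan instead of a failed launch.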

Evaluation and Success Metrics

Success will be measured through: increases in average session duration on academic pages, completion rates of interactive modules, improvements in quiz scores, instructor adoption rate (percentage of partners using interactive features), internship placement satisfaction, and reductions in content production turnaround time. Quarterly reviews will compare baselines to targets and guide iterative improvements (Sitzmann, 2011; Prosci, 2020).
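
To make the metric definitions concrete, the sketch below shows one way the completion-rate, session-duration, and quiz-score roll-ups could be computed from a simple engagement event log. The record shape and field names are assumptions for illustration, in the same TypeScript style as the earlier sketches.

```typescript
// Sketch of the metric roll-up the quarterly reviews would rely on,
// assuming a simple per-event log; field names are illustrative.
interface ModuleEvent {
  userId: string;
  titleId: string;
  sessionSeconds: number;
  moduleCompleted: boolean;
  quizScore?: number; // 0–100, present only when a quiz was attempted
}

interface TitleMetrics {
  avgSessionSeconds: number;
  completionRate: number;        // share of events with moduleCompleted = true
  meanQuizScore: number | null;  // null when no quizzes were attempted
}

function summarizeTitle(events: ModuleEvent[]): TitleMetrics {
  if (events.length === 0) {
    return { avgSessionSeconds: 0, completionRate: 0, meanQuizScore: null };
  }
  const totalSeconds = events.reduce((sum, e) => sum + e.sessionSeconds, 0);
  const completed = events.filter((e) => e.moduleCompleted).length;
  const quizScores = events.flatMap((e) => (e.quizScore !== undefined ? [e.quizScore] : []));
  return {
    avgSessionSeconds: totalSeconds / events.length,
    completionRate: completed / events.length,
    meanQuizScore: quizScores.length
      ? quizScores.reduce((s, v) => s + v, 0) / quizScores.length
      : null,
  };
}

// Example: one quarter's events for a single title, compared at review time.
const quarter: ModuleEvent[] = [
  { userId: "u1", titleId: "t1", sessionSeconds: 540, moduleCompleted: true, quizScore: 82 },
  { userId: "u2", titleId: "t1", sessionSeconds: 300, moduleCompleted: false },
];
console.log(summarizeTitle(quarter)); // { avgSessionSeconds: 420, completionRate: 0.5, meanQuizScore: 82 }
```

Quarterly reviews would then compare these computed values against the recorded baselines and targets for each title.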

Conclusion

This change plan balances technical upgrades, pedagogy, and human factors to boost the instructional value and usability of Sturgeon Point Productions’ digital offerings while strengthening its internship pipeline. A phased approach with executive sponsorship, defined roles, continuous communication, measurable KPIs, and pragmatic risk controls will facilitate sustainable adoption and demonstrable learning outcomes (Kotter, 1996; Hiatt, 2006).

References

  • Clark, R. C., & Mayer, R. E. (2016). E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. John Wiley & Sons.
  • Fernandez, S., & Rainey, H. G. (2006). Managing successful organizational change in the public sector. Public Administration Review, 66(2), 168–176.
  • Garrett, J. J. (2010). The Elements of User Experience: User-Centered Design for the Web and Beyond. New Riders.
  • Hiatt, J. (2006). ADKAR: A Model for Change in Business, Government and Our Community. Prosci Learning Center Publications.
  • ISO 9241-11. (1998). Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Part 11: Guidance on Usability. International Organization for Standardization.
  • Klein, K. J., & Knight, A. P. (2005). Innovation implementation: Overcoming the challenge. Current Directions in Psychological Science, 14(5), 243–246.
  • Kotter, J. P. (1996). Leading Change. Harvard Business School Press.
  • Mayer, R. E. (2009). Multimedia Learning (2nd ed.). Cambridge University Press.
  • Nielsen, J. (2012). Usability 101: Introduction to Usability. Nielsen Norman Group. Retrieved from https://www.nngroup.com
  • Norman, D. A. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books.
  • Prosci. (2020). Best Practices in Change Management. Prosci, Inc. Retrieved from https://www.prosci.com
  • Sitzmann, T. (2011). A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Personnel Psychology, 64(2), 489–528.