Decision Analysis Case Study: Valley Of The Sun Reviews
Valley of the Sun Academy (VSA), an online institution specializing in GED programs in the Phoenix area, faces a strategic decision regarding its faculty review process. Currently, VSA outsources its annual online faculty reviews to TeachBest Consulting, an external firm that conducts assessments across multiple institutions. This process involves external faculty reviewers, internal HR coordination, and contractual expenses, including a standard annual fee of $2,500, which escalates if enrollment or faculty numbers increase significantly. The reviews cover facilitation techniques, content expertise, engagement, and classroom management, and are crucial for performance evaluation and potential remediation.
Recently, the Faculty Advisory Board (FAB) proposed transitioning from this external review system to an internal Peer Faculty Performance Review (PFPR) process. The new approach aims to bring faculty reviews in-house, involving assessments by top-performing instructors within key disciplines, under the supervision of HR and instructional design staff. This move is motivated by potential cost savings, increased control, and the possibility of more accurate and consistent evaluations, with the proposal including initial costs for training, review form development, and ongoing technology expenses.
The current system involves significant contractual payments to TeachBest Consulting, with reviews conducted in March, July, and November so that all faculty are reviewed by December 1. Teachers are paid a nominal review reimbursement, and the institution bears additional costs for follow-up reviews, which are necessary in approximately 5% of cases due to review inaccuracies. The proposed internal PFPR would eliminate these secondary reviews and shift to a rolling nine-month review cycle. In choosing between maintaining the current external review system and adopting the proposed internal process, stakeholders must weigh costs, timeliness, accuracy, faculty development, and strategic control.
The decision of whether Valley of the Sun Academy should continue outsourcing faculty reviews to TeachBest Consulting or transition to an internal Peer Faculty Performance Review (PFPR) process presents a complex strategic dilemma rooted in cost-effectiveness, review accuracy, control, and long-term faculty development. This paper examines both options in light of their financial implications, operational efficiencies, review accuracy, and broader organizational impacts, and recommends the more suitable approach to quality faculty evaluation.
Introduction
Effective faculty performance evaluation is critical for ensuring teaching quality, maintaining accreditation standards, and fostering professional development within educational institutions. Valley of the Sun Academy, operating as an online school with a modest faculty pool and enrollment of approximately 813 students, has traditionally relied on an external firm, TeachBest Consulting, to conduct annual faculty reviews. Recently, however, the FAB has proposed an internal review process, the PFPR, aiming for greater cost control, review accuracy, and internal capacity building. The core of this decision centers on balancing financial costs against quality and operational efficiency, considering the institution’s growth outlook and strategic priorities.
Current External Review System
The existing system involves outsourcing the evaluation of online instructors to TeachBest Consulting, which assembles a review team of faculty members from other institutions. The reviews are conducted in three windows each year (March, July, and November), and each completed review incurs a payment of $75 to the reviewer, in addition to an overarching contractual payment of $2,500 annually, rising to $5,000 if enrollment or faculty size increases. The external reviews have the advantage of independence and external validation, which can lend objectivity to evaluations. However, drawbacks include dependency on an external provider, review inaccuracies that trigger secondary reviews, and limited internal faculty involvement. The contractual costs are predictable but can escalate if institutional growth triggers the higher fee. Moreover, external reviews may not be tailored to, or consistent with, VSA's specific culture and goals.
Proposed Internal Peer Review System (PFPR)
The FAB’s proposed in-house review process involves selecting the top three performers from key disciplines, who would collaboratively develop review tools, participate in norming sessions, and conduct ongoing evaluations. This approach promises several benefits: enhanced control over review criteria and process, potential cost savings from the use of internal labor and resources, and the opportunity to embed faculty development into the evaluation cycle. The proposal estimates initial costs for developing the review forms, conducting norming sessions, and building technological infrastructure, with ongoing expenses limited mainly to reviewer stipends ($50 per review) and technology fees ($20 per reviewer per month).
This internal review model aligns with modern performance management paradigms emphasizing continuous feedback, professional development, and peer-based evaluation. By shifting to a rolling nine-month review cycle, the process can foster more timely feedback and professional growth, rather than an annual snapshot. However, challenges include ensuring review quality, managing faculty workload, and developing and maintaining robust evaluation tools that remain fair and consistent across disciplines.
Financial and Operational Considerations
Financially, the current external system incurs at least $2,500 annually in contract fees, with the possibility of doubled costs if institutional growth occurs. Additional costs include reviewer payments ($75 per review) and follow-up reviews (~$50 each) necessitated by review inaccuracies. In contrast, the internal PFPR requires a startup investment in training, form development, and technology, with ongoing costs limited primarily to reviewer stipends and system maintenance. Assuming an average of 65 faculty members and three review cycles per year, internal costs could be lower, especially if improved review accuracy reduces secondary reviews. Moreover, internal reviews can be scheduled more flexibly within a nine-month rolling cycle, which may improve faculty development opportunities and reduce delays in addressing performance issues.
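The cost comparison above can be sketched numerically. The dollar figures below come from the case facts; the number of internal reviewers and the months of technology billing are illustrative assumptions (the case does not specify them), and are labeled as such in the comments.

```python
# Illustrative annual cost sketch for the two review systems.
# Values marked ASSUMPTION are illustrative fill-ins, not case facts.

FACULTY = 65                 # case fact: ~65 faculty members
CYCLES_PER_YEAR = 3          # case fact: March/July/November review windows
reviews = FACULTY * CYCLES_PER_YEAR

# --- External system (TeachBest Consulting) ---
contract_fee = 2500          # case fact: base annual fee ($5,000 if growth triggers escalation)
reviewer_pay = 75            # case fact: paid per completed external review
secondary_rate = 0.05        # case fact: ~5% of reviews require follow-up
secondary_cost = 50          # case fact: ~$50 per follow-up review

external_total = (contract_fee
                  + reviews * reviewer_pay
                  + reviews * secondary_rate * secondary_cost)

# --- Internal system (PFPR) ---
stipend = 50                 # case fact: $50 stipend per review
tech_fee = 20                # case fact: $20 per reviewer per month
n_reviewers = 9              # ASSUMPTION: top 3 performers in each of 3 disciplines
months = 12                  # ASSUMPTION: technology billed year-round

internal_total = reviews * stipend + n_reviewers * tech_fee * months

print(f"External system: ${external_total:,.2f} per year")
print(f"Internal PFPR:   ${internal_total:,.2f} per year")
print(f"Difference:      ${external_total - internal_total:,.2f} per year")
```

Under these assumptions the internal process comes out several thousand dollars cheaper per year before counting the one-time startup investment, and the gap widens further if growth pushes the external contract to its $5,000 tier; the startup costs would need to be amortized against this recurring difference.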
Operationally, shifting to an internal process requires sufficient faculty participation, effective training, and clear evaluation standards. The initial setup involves norming sessions to ensure consistency, which can be facilitated at low or no cost through internal HR and instructional design resources. Externally facilitated norming sessions, costing between $500 and $750, offer convenience but lack customization. The internal process also enables VSA to foster a culture of continuous improvement and peer accountability, critical in a performance-driven educational environment.
Assessment of Review Accuracy and Reliability
One of the major advantages of internal peer reviews is the potential for higher accuracy and consistency, as faculty members are often more familiar with each other's teaching contexts and can provide constructive, contextually relevant feedback. The current external reviews are susceptible to inaccuracies, accounting for approximately 5% of reviews requiring secondary validation. Internal peer reviews, if well-implemented, could greatly reduce these errors through better understanding and ongoing calibration among peers, especially after norming sessions.
Faculty Development and Morale
Involving faculty directly in evaluations fosters a culture of accountability, collaboration, and professional growth. The internal process encourages peer feedback, which can be more nuanced and supportive than external evaluations. Furthermore, the proposed stipends of $50 per review serve as recognition and motivation for high-performing instructors. The flexibility of a rolling review cycle allows for timely feedback, which can improve faculty morale and teaching effectiveness, directly benefiting student learning outcomes.
Conclusion and Recommendations
In light of the analyses, the internal PFPR system presents a compelling strategic alternative to the current external review process. The potential cost savings, increased control over review content, improved accuracy, and capacity for fostering a professional learning community outweigh the initial setup costs and operational adjustments required. However, success depends on careful planning, faculty buy-in, and the development of robust evaluation tools with ongoing calibration.
Therefore, it is recommended that VSA adopt the internal PFPR approach, beginning with a pilot program to refine the review forms, train reviewers, and establish norms. The pilot can be evaluated after one cycle to assess review quality, faculty satisfaction, and cost implications. If successful, the internal process can be scaled and integrated more deeply into the institution’s overall faculty development framework. This approach aligns with modern educational priorities of continuous improvement, peer collaboration, and strategic cost management, positioning VSA for sustainable growth and ongoing enhancement of teaching quality.