Directions: Respond to the Case Study Below Using the SOAP Format
Respond to the case study below using the S.O.A.P. format for 10 possible points (two points for each category). Remember to include safety, the date, time, age, weather, and gender.

You have been hiking all day around Sandia Peak in New Mexico on a warm summer day. As you are about to descend the mountain in the tramway around 4 p.m., you notice an older male in his 60s who seems to be having trouble breathing.
As you approach, he complains of shortness of breath, a cough, and chest pain. He is able to speak only a few words at a time and says, "I can't ... catch ... my breath." Additional questioning reveals that the man is from out of state and has asthma. He appears to be in considerable distress. How do you respond?

Chapter 13: Evaluation: An Overview
(© 2017 Pearson Education, Inc.)
Chapter 13, Lecture 1

Background Information on Evaluation
- Adequate and appropriate evaluation is necessary for any program, regardless of its size, nature, or duration.
- Two critical purposes of program evaluation are:
  - Assessing and improving quality
  - Determining program effectiveness
- Conducting evaluation and research is a major area of responsibility for health education specialists.

Basic Terminology – 1
Evaluation
- The process of determining the value or worth of a health promotion program, or any of its components, based on predetermined criteria or standards of acceptability identified by stakeholders.

Basic Terminology – 2
Formative Evaluation
- Its purpose is to improve the overall quality of a program, or any of its components, before it is too late (i.e., before the program concludes).
- Attempts to enhance program components before and during implementation.
Process Evaluation
- Assesses the implementation process in general; tracks and measures what went well, what went poorly, and how these factors contributed to the success or failure of a particular program.
- Measures the degree to which the program was successfully implemented; lessons learned are generally applied in subsequent versions or implementations of the program.
- Formative and process evaluation are often used interchangeably and have become somewhat synonymous.

Basic Terminology – 3
Summative Evaluation
- Its purpose is to assess the effectiveness of the intervention and the extent to which awareness, attitudes, knowledge, behavior, the environment, or health status changed as a result of a particular program.
- An umbrella term.
Impact Evaluation
- Focuses on intermediary measures, such as behavior change or changes in attitudes, knowledge, and awareness.
Outcome Evaluation
- Measures the degree to which end points, such as diseases or injuries, actually decreased.
- Impact and outcome evaluations together constitute summative evaluation.

Comparison of Evaluation Terms
[Figure: timeline relating planning, the start of implementation, and the end of implementation to formative, process, impact, outcome, and summative evaluation.]

Purpose of Evaluation
- To determine achievement of objectives related to improved health status
- To improve program implementation
- To provide accountability to funders, the community, and other stakeholders
- To increase community support for initiatives
- To contribute to the scientific base for community public health interventions
- To inform policy decisions
(Capwell et al., 2000)

Framework for Program Evaluation – 1
[Figure: the six-step framework for program evaluation.]

Framework for Program Evaluation – 2
Step 1 – Engaging Stakeholders
- Who are the stakeholders?
  - Those involved in program operations
  - Those served or affected (directly or indirectly) by the program
  - Primary users of the evaluation results
- The scope and level of stakeholder involvement will vary with each program being evaluated.
Step 2 – Describing the Program
- Sets the frame of reference for all subsequent decisions in the evaluation process.
- Describes the mission, goals, objectives, capacity to effect change, stage of development, and how the program fits into the larger community.
- A logic model can be used.
Framework for Program Evaluation – 3
Step 3 – Focusing the Evaluation Design
- Makes sure the interests of stakeholders are addressed.
- Identifies the reasons for the evaluation, how its results will be used, the questions to be asked, and the evaluation design, and finalizes any agreements about the process.
Step 4 – Gathering Credible Evidence
- Decides on measurement indicators, sources of evidence, the quality and quantity of evidence, and the logistics for collecting evidence.
- Organizes the data, including the specific processes related to coding, filing, and cleaning (see the sketch after this slide).
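To make the data-handling part of Step 4 concrete, here is a minimal sketch of what coding and cleaning collected evidence might look like in practice. The file name, column names, and response codes are hypothetical illustrations; the chapter does not prescribe any particular tool or coding scheme.

```python
# A minimal sketch of Step 4 data handling: coding and cleaning
# hypothetical post-program survey responses. The file name, column
# names, and codes are illustrative assumptions, not from the text.
import csv

# Hypothetical coding scheme for a Likert-type survey item.
RESPONSE_CODES = {"never": 0, "rarely": 1, "sometimes": 2, "often": 3, "always": 4}

def clean_records(path):
    """Read raw survey rows, code text responses, and drop unusable rows."""
    cleaned = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            answer = row.get("exercise_frequency", "").strip().lower()
            if answer not in RESPONSE_CODES:  # discard blank or invalid responses
                continue
            cleaned.append({
                "participant_id": row["participant_id"],
                "exercise_frequency": RESPONSE_CODES[answer],  # text -> numeric code
            })
    return cleaned

if __name__ == "__main__":
    records = clean_records("post_program_survey.csv")
    print(f"{len(records)} usable responses after cleaning")
```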
Framework for Program Evaluation – 4
Step 5 – Justifying Conclusions
- Comparing the evidence against the standards of acceptability (see the sketch after this slide).
- Judging the worth, merit, or significance of the program.
- Creating recommendations for action based on the results.
Step 6 – Ensuring Use and Sharing Lessons Learned
- Use and dissemination of the results.
- The needs of each group of stakeholders are addressed.
- Four standards of evaluation:
  - Utility standards
  - Feasibility standards
  - Propriety standards
  - Accuracy standards
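As a small illustration of Step 5, comparing evidence against predetermined standards of acceptability can be as simple as checking each measured indicator against the criterion the stakeholders set in advance. The indicator names and threshold values below are invented for illustration only.

```python
# A minimal sketch of Step 5: judging measured indicators against
# predetermined standards of acceptability. All names and numbers
# here are hypothetical, not from the chapter.

# Standards of acceptability agreed on by stakeholders before the program.
standards = {
    "attendance_rate": 0.80,   # at least 80% of sessions attended
    "knowledge_gain": 10.0,    # at least a 10-point mean test-score gain
}

# Evidence gathered in Step 4.
evidence = {
    "attendance_rate": 0.84,
    "knowledge_gain": 7.5,
}

for indicator, standard in standards.items():
    met = evidence[indicator] >= standard
    print(f"{indicator}: measured {evidence[indicator]} vs. standard {standard} "
          f"-> {'met' if met else 'not met'}")
```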
Practical Problems or Barriers in Evaluation – 1
- Planners either fail to build evaluation into the planning process or do so too late.
- Adequate resources may not be available to conduct an appropriate evaluation.
- Organizational restrictions may prevent hiring consultants and contractors.
- Effects are often hard to detect because changes are sometimes small, come slowly, or do not last.
- The length of time allotted for the program and its evaluation is not realistic.
- Restrictions may limit the collection of data among the priority population.

Practical Problems or Barriers in Evaluation – 2
- It is difficult to make an association between cause and effect.
- It is difficult to evaluate multi-strategy interventions.
- Discrepancies exist between professional standards and actual practice with regard to appropriate evaluation design.
- Evaluators' motives to demonstrate success introduce bias.
- Stakeholders' perceptions of the evaluation's value may vary too drastically.
- Intervention strategies are sometimes not delivered as intended or are not culturally specific.

Evaluation in the Program Planning Stages
- The evaluation design must reflect the goals and objectives of the program.
- The evaluation must be planned in the early stages of development and be in place before the program begins.
- Baseline data: data reflecting the initial status or interests of the participants, typically drawn from a needs assessment.
- Initial data regarding the program should be analyzed promptly so that any necessary adjustments can be made to the program (a sketch of a simple baseline comparison follows below).
- By creating the summative evaluation early in the planning process, planners can ensure that the results are less biased.
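To illustrate what a baseline comparison might look like once follow-up data come in, here is a minimal sketch of a paired pre/post comparison on one indicator. The scores are invented, and a real impact evaluation would use an appropriate statistical test and study design rather than a raw mean difference.

```python
# A minimal sketch of comparing baseline (pre-program) data with
# follow-up (post-program) data on one indicator. The scores are
# hypothetical; a real evaluation would use a proper statistical test.
from statistics import mean

# Paired knowledge-test scores for the same participants.
baseline = [62, 55, 70, 48, 66, 59, 73, 51]   # from the needs assessment
followup = [74, 61, 78, 60, 71, 70, 80, 58]   # after the program

gains = [post - pre for pre, post in zip(baseline, followup)]
print(f"mean baseline score: {mean(baseline):.1f}")
print(f"mean follow-up score: {mean(followup):.1f}")
print(f"mean gain: {mean(gains):.1f} points")
```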
Ethical Considerations
- Evaluation or research should never cause mental, emotional, or physical harm to those in the priority population.
- Participants should always be informed of the purpose and potential risks of the evaluation and should give their consent.
- No individual should ever have his or her personal information revealed in any setting or circumstance.
- When appropriate, evaluation plans should be approved by an institutional review board (IRB).

Who Will Conduct the Evaluation? – 1
Internal Evaluation
- An individual trained in evaluation and personally involved with the program conducts the evaluation.
- Advantages:
  - More familiar with the organization and the program's history
  - Knows the decision-making style of those in the organization
  - Present to remind people of the results, now and in the future
  - Able to communicate results more frequently and clearly
  - Less expensive
- Disadvantages:
  - Possibility of evaluator bias or conflict of interest

Who Will Conduct the Evaluation? – 2
External Evaluation
- Conducted by someone who is not connected with the program (an evaluation consultant).
- Advantages:
  - More objective review and a fresh perspective
  - Can help ensure an unbiased evaluation outcome
  - Brings global knowledge from working in a variety of settings
  - Typically brings more breadth and depth of technical expertise
- Disadvantages:
  - More expensive
  - Can be somewhat isolated, often lacking knowledge of and experience with the program
- The evaluator should be credible and objective, have a clear role in the evaluation design, and accurately report the findings.

Evaluation Results
- Who will receive the results of the evaluation?
- Different aspects of the evaluation can be stressed, depending on each group's particular needs and interests.
- Different stakeholders may want different questions answered.
- Planning for the evaluation should include a determination of how the results will be used.

As we begin to understand evaluation and the importance of assessment in program improvement, it is crucial to have a context that makes this understanding thorough. Please find a program that has already been evaluated. You will certainly be able to find a peer-reviewed article titled something like "Evaluation of a Program That ...". You can choose a program you already know about, one you may have participated in, or one you simply find on the internet that intrigues you. We will look at several, so don't think of it as a one-size-fits-all assignment. Review the article, summarize the following points, and then submit your work:

1. Bibliographic citation of the program that you reviewed.
2. Brief overview of the program itself.
3. What type of evaluation was done (process, impact, or outcome)?
4. How was the evaluation done?
5. When was it evaluated (before, during, immediately after, or after)?
6. Who was evaluated?
7. Were the objectives reached?
8. Was the goal reached?
9. How was the program changed as a result of the evaluation?
10. What questions remain about program evaluation after reading the article?