As a human services administrator, you need to determine which program evaluations are most useful. Much depends on the type of organization and the nature of the services being evaluated. Perhaps you need to know if a program is working efficiently or reaching its intended target community. You might consider using a program evaluation to plan for the future of an organization.
Not all program evaluations are equal. Some identify problems in the organization that require attention, while others fail to provide useful information. It is important to note the strengths and limitations of program evaluations so that you can select those that are most useful.
For this Assignment, select one of the program evaluation samples from the list provided in this week’s Resources and consider its strengths and limitations. You will create a short presentation (7–10 slides) on those strengths and limitations. As a Walden student, you have a Google email account (Gmail) and access to Google tools, which you can find when you log into your account. For this presentation, you may use Google Slides or PowerPoint. If you are new to Google Slides, you can find resources in this week’s Learning Resources to get you started.
In your 7- to 10-slide presentation, address the strengths and limitations of the program evaluation you selected.
Support your Assignment with specific references to all resources used in its preparation. You are asked to provide a reference list for all resources, including those in the Learning Resources for this course. You should include in your references at least two resources included in this week’s resources and at least one outside scholarly resource.
Program Evaluation Resources
© 2016 Laureate Education, Inc.

Week 3: Program Evaluation Samples

Note: You are not expected to read the entire program evaluation you select. Instead, review the summary or conclusions section to gather the information you need for the Assignment.

• Magill, K., Hallberg, K., Hinojosa, T., & Reeves, C. (2010). Evaluation of the implementation of the rural and low-income school program: Final report. Office of Planning, Evaluation and Policy Development, U.S. Department of Education. Retrieved from the Walden Library using the ERIC database.

• Kingsbury, N. (2011). Program evaluation: Experienced agencies follow a similar model for prioritizing research. Report to the Subcommittee on Oversight of Government Management, the Federal Workforce, and the District of Columbia, Committee on Homeland Security and Governmental Affairs, U.S. Senate (GAO-11-176). U.S. Government Accountability Office. Retrieved from the Walden Library using the ERIC database.

• Sanders, J. R., & Nafziger, D. N. (2011). A basis for determining the adequacy of evaluation designs. Journal of Multidisciplinary Evaluation, 7(15), 44–78. Retrieved from the Walden Library using the Directory of Open Access Journals database.

• Pereira, N., Peters, S. J., & Gentry, M. (2010). My Class Activities instrument as used in Saturday enrichment program evaluation. Journal of Advanced Academics, 21(4), 568–593. Retrieved from the Walden Library using the Academic Search Complete database.

• Piper, B., & Korda, M. (2011). EGRA Plus: Liberia. Program evaluation report. RTI International. Retrieved from the Walden Library using the ERIC database.

• Gaubert, J. M., Knox, V., Alderson, D. P., Dalton, C., Fletcher, K., & McCormick, M. (2010). The Supporting Healthy Marriage evaluation: Early lessons from the implementation of a relationship and marriage skills program for low-income married couples. MDRC. Retrieved from the Walden Library using the ERIC database.

• Curry, S. J., Mermelstein, R. J., Sporer, A. K., Emery, S. L., Berbaum, M. L., Campbell, R. T., … Warnecke, R. B. (2010). A national evaluation of community-based youth cessation programs: Design and implementation. Evaluation Review, 34(6), 487–512. Retrieved from the Walden Library using the Sage Premier 2010 database.
Develop an evaluation plan to ensure your program evaluations are carried out efficiently in the future. Note that bankers or funders may want or benefit from a copy of this plan.
Ensure your evaluation plan is documented so you can regularly and efficiently carry out your evaluation activities. Record enough information in the plan so that someone outside of the organization can understand what you’re evaluating and how. Consider the following format for your report:
1. Title Page (name of the organization being evaluated, or of the organization whose product/service/program is being evaluated; date of the report)
2. Table of Contents
3. Executive Summary (one-page, concise overview of findings and recommendations)
4. Purpose of the Report (what type of evaluation(s) was conducted, what decisions are being aided by the findings of the evaluation, who is making the decision, etc.)
5. Background About Organization and Product/Service/Program that is being evaluated
a) Organization Description/History
b) Product/Service/Program Description (that is being evaluated)
i) Problem Statement (in the case of nonprofits, description of the community need that is being met by the product/service/program)
ii) Overall Goal(s) of Product/Service/Program
iii) Outcomes (or client/customer impacts) and Performance Measures (that can be measured as indicators toward the outcomes)
iv) Activities/Technologies of the Product/Service/Program (general description of how the product/service/program is developed and delivered)
v) Staffing (description of the number of personnel and roles in the organization that are relevant to developing and delivering the product/service/program)
6. Overall Evaluation Goals (e.g., what questions are being answered by the evaluation)
7. Methodology
a) Types of data/information that were collected
b) How data/information were collected (what instruments were used, etc.)
c) How data/information were analyzed
d) Limitations of the evaluation (e.g., cautions about the findings/conclusions and how to use them)
8. Interpretations and Conclusions (from analysis of the data/information)
9. Recommendations (regarding the decisions that must be made about the product/service/program)
Appendices: the content of the appendices depends on the goals of the evaluation report, e.g.:
a) Instruments used to collect data/information
b) Data, e.g., in tabular format
c) Testimonials, comments made by users of the product/service/program
d) Case studies of users of the product/service/program
e) Any related literature
1. Don’t balk at evaluation because it seems far too “scientific.” It’s not. Usually the first 20% of effort will generate the first 80% of the plan, and this is far better than nothing.
2. There is no “perfect” evaluation design, so don’t worry about making the plan perfect. It’s far more important to do something than to wait until every last detail has been tested.
3. Work hard to include some interviews in your evaluation methods. Questionnaires don’t capture “the story,” and the story is usually the most powerful depiction of the benefits of your services.
4. Don’t interview just the successes. You’ll learn a great deal about the program by understanding its failures, dropouts, etc.
5. Don’t throw away evaluation results once a report has been generated. Results don’t take up much room, and they can provide precious information later when trying to understand changes in the program.