Using Evaluation to Learn What Works and Improve Your Program

by Vanessa McKendall of Face Valu (NACAC's evaluation consultant)

From our conversations with group leaders, we understand that many parent groups miss the opportunity to use evaluation as a tool for learning. While the most frequently evaluated activity is a specific event, many groups have no way to collect information about the overall effectiveness of their work. This short discussion is designed to encourage you to think about ways to use evaluation to help you make decisions about your program's effectiveness.

Evaluation is a way to learn what works and what needs to be improved in our programs. This valuable information can help us decide what actions to take and how to use our resources to meet overall goals. Needed improvements or additions to your program will be easier to target when you have a systematic way to collect information. For example, being able to use data to describe who is involved with your program not only helps you make key decisions about program improvement, but can also assist you when applying for grants or requesting donations: the information will help you support the points you make in your proposals. Most funding sources will want to know how you plan to evaluate the success of the project for which you are seeking funding.

Evaluation Plan

There are many ways to construct an evaluation plan, and you may already have something that works for you. Here we share a format you might find useful. This section provides a definition of evaluation and an overview of the evaluation process; the last section suggests ways for you to get started.

Michael Q. Patton, a nationally and internationally known evaluator, defines evaluation as: "The systematic collection of information about activities, characteristics, and outcomes of programs for use by specific people to reduce uncertainties, improve effectiveness, and make decisions with regard to what those programs are doing and affecting." (Michael Quinn Patton, Utilization-Focused Evaluation, 1986.)

Patton makes two points that are important to emphasize:

Evaluation should be systematic

There is a logic we use when we plan for and organize evaluation. For example, an evaluation process usually includes the following steps:

  • Planning the evaluation: We identify what we want to learn from an evaluation and construct a plan to get the work done.

  • Collecting the information: We use various methods that include both quantitative (primarily numbers) and qualitative (primarily words) methods. They include interviews, focus groups, checklists, tracking forms, surveys, observations, document review, tests, and others.

  • Analyzing the information: After we have done our data collection, we organize and study the data to learn what they say.

  • Interpreting the information: This step requires us to reflect on what the data mean for our context and situations. We should do this with others who work with our program or who are interested in our results.

  • Sharing the information: In addition to those most directly connected with our program, many people, including funders, participants, and others, do similar work and will be interested in what we learned.

Evaluation should be useful

There is value in evaluation! We have to work to make sure we are asking important questions that will help us learn more about our work, reduce uncertainties and help us figure out how to be more effective.

We already mentioned that evaluation is a systematic process in which we think carefully about our program and what we would like to learn. Next we need to set up our evaluation plan. Here you begin thinking about what you want to know after you have finished the evaluation process. This helps determine a set of questions and the best ways to answer those questions. Thinking about evaluation and planning for it with others will help you with this process. Go through the following questions and decide how you would respond.

  1. What do you want to learn from this evaluation process? What will be helpful for you to know as you continue to think about how to improve your program?

    • What are you required to discuss in progress and final reports to funders?
    • Based on what you want to learn, what are three or four key questions that you want to be able to answer once the data collection is complete? These may be the questions that you will focus on for the next year. You can always modify them in the future.

  2. Look at each question. What type of information do you need to collect to be able to answer the questions? This could include both numbers and words (quantitative and qualitative).

  3. Once you have decided on the information you need, make a list of the persons or documents that can provide the information.

  4. Look at your sources of information and decide on the best method to collect or learn the information. You can't do it all, so think about the most important information you need and determine the best sources.

  5. Decide who will collect the information and when the data will be collected. Writing this down helps you make a commitment to get it done.

  6. Think about how you will organize yourself to review the information you collect and interpret its meaning for your program.

  7. Decide how to share what you learned with others. Many funders require a specific report format. Think about how to make it useful for you and your immediate audiences.

Points to Remember

  • Identify who in your group will make evaluation a priority. Work together with your board, staff, or core group. Collaboratively create, write, and communicate the plan. Decide together what information you want to collect, who will do it, and what you will do with it once it is collected.

  • Get in the habit of documenting the work you do. Write things down! Take pictures! Short notes are better than blank paper, especially after time passes and you're trying to reconstruct what happened. Have a special place to collect your notes until you are ready to use them.

  • Create useful tools to help you systematically document your observations.

  • Find consistent ways to talk about what you are learning. For some this could be regular, organized meetings with an evaluation group. For others it might mean making a commitment to look at your information once a quarter with at least one other person. You will also want to write down your reflections and decisions.

  • Revise your evaluation plan based on your experiences. Changes should be clearly recorded and explained.

  • Keep a physical file or three-ring binder as well as a computer file for all your evaluation documents.

Does it seem like too much for your small group? Then start small. Make it reasonable, doable, and useful. Choose one or two evaluation activities to try. For example, try tracking participant involvement: using some basic tools to track and describe who participates in your program and activities is important. Sample tracking documents are enclosed for your convenience. These forms may be copied and used as they are, or you may customize them to meet your needs. The choice is yours. If you are already tracking your phone contacts and meeting participants, perhaps you might be interested in conducting a parent survey. We have developed two different survey forms you could use or adapt, as well as instructions for them. These may be obtained for free from NACAC by contacting Diane Martin-Hushman at hushman@nacac.org or 651-644-3036.


North American Council on Adoptable Children (NACAC)
970 Raymond Avenue, Suite 106
St. Paul, MN 55114
phone: 651-644-3036
fax: 651-644-9848
e-mail: info@nacac.org