Getting to Outcomes: Step 07. Process Evaluation

This step provides guidance on what to include in, and how to gather data for, a process evaluation, which tells you how well you delivered the program.

What Is This Step?

This step involves evaluating how well the program ran: Did you implement the program according to your plan, and how well did it go? This step is called process evaluation because the collected data track the process of program implementation, as opposed to the outcomes experienced by the participants (which are covered in GTO Step 8). Process evaluations typically track participant attendance, program adherence, and how well you followed your work plan. They may also involve asking program participants about their satisfaction or asking staff how well they think the program was delivered. A process evaluation should be planned before the program begins and should continue while the program is running. Therefore, this step includes the Process Evaluation Planner Tool to plan for evaluation activities before the program starts and a Process Evaluation Summary Tool to interpret the results of your evaluation.

Why Is This Step Important?

The process evaluation tells you how well plans are being put into action and helps routinely and systematically monitor areas important to making the program a success. Examples include:

  • recruitment of participants
  • individuals’ attendance or exposure
  • program adherence
  • participant satisfaction
  • staff perceptions

The process evaluation will also tell you whether you need to make midcourse corrections (e.g., boosting outreach when attendance is weak) or changes to your work plan for your next program implementation. Such data will provide you with information that may be useful to funders and help you better understand your program’s outcomes.

How Do I Carry Out This Step?

To do Step 7, complete the GTO Process Evaluation Planner Tool, carry out the data collection and analysis called for in your evaluation plan, complete the Process Evaluation Summary Tool, and consider changes needed to improve the program in the future based on your process evaluation results. If your program is run continuously, you will need to identify a time when you can make a change to how you run your program going forward. Changes for program improvement will be addressed in Step 9.

Tip 7-1. Types of process evaluation information

You are likely to use a variety of methods for collecting your process evaluation data. Here’s some additional information about a few key methods mentioned in this chapter.

Participant data

  • What it is: Specific information about participants, such as age, gender, race/ethnicity, education level, household income, family size, and referral source

  • How to gather it: You have probably already gathered much of this kind of information in the course of planning for, establishing, or running your program. Often, these types of questions are asked as part of an intake to a service or an outcome assessment survey. Information can be gathered during an interview with each participant as well.

  • Why it is important: It tells you whether your program is serving the targeted population and whether program outreach efforts are working to engage the participants you planned to reach.

Focus groups

  • What they are: Focus groups are facilitator-led discussions on a specific topic, typically with 6–12 participants brought together to share their opinions on that topic.

  • How to manage them: Generally, focus groups are led by 1–2 facilitators who ask the group a limited number of questions. Think of the structure of a focus group like a funnel: each major topic should start with broad questions and then get more specific. Be sure to record the focus group or have a designated note-taker. The data can be analyzed by looking for the themes that appear in the transcripts or notes.

  • Why they’re important: Focus groups are an excellent way to learn what people thought about your program and get suggestions about your program. Focus groups often yield qualitative (i.e., text) data, as opposed to surveys, which usually yield quantitative (i.e., numerical) data. Listening as people share and compare their different points of view provides a wealth of information—not just about what they think but also why they think the way they do.
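Theme analysis of focus-group data can be as simple as tallying how often each theme appears once comments have been coded. The sketch below assumes you have already assigned a theme label to each comment while reviewing transcripts or notes; the labels and data are made up for illustration.

```python
from collections import Counter

# Hypothetical theme labels assigned to individual comments while
# reviewing focus-group transcripts or notes.
coded_comments = [
    "scheduling", "materials", "scheduling",
    "facilitator", "materials", "scheduling",
]

# Tally how often each theme appears to see which topics dominated.
theme_counts = Counter(coded_comments)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

A frequency count like this is only a starting point; reading the coded excerpts behind each theme is what reveals why participants think the way they do.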

Satisfaction surveys

  • What they are: Information about whether the participants enjoyed the program, whether they got something out of it, and whether the program met their needs or expectations

  • How to do them: The easiest way is to administer brief paper or web-based surveys to participants as part of the program at the end of each session or activity. This is better than waiting until the end of the entire program because sometimes participants forget details from earlier sessions. However, the surveys should be administered so that respondents feel comfortable that their responses will be kept confidential (e.g., by having someone other than the service providers administer the survey and collect the responses). Surveys can also be handed out at the end of a program with self-addressed, stamped envelopes so the participant can complete the survey and return it later. This method, however, adds expense (cost of postage), and often fewer surveys are returned. If you are using a packaged program, it may require you to use a certain questionnaire with the program participants. You could also implement a web-based survey and invite participants to complete it via email.

  • Why they’re important: They tell you whether the participants feel good about the program and can help you identify ways to improve participant satisfaction, which would be likely to improve retention in the program.

Staff perception data

  • What they are: Staff perceptions about what worked and didn’t work during the implementation of a program. You may also want to collect staff perceptions about training and supervision quality.

  • How to gather them: There are several methods for gathering data on staff perspectives, including

    • Focus groups
    • Surveys
    • Interviews

    In addition to what we’ve already mentioned about focus groups, an interview can be a good way to get detailed information about program implementation. While interviews with staff involve a similar type of questioning as a focus group, in an interview you are talking with one person at a time.

    A program debriefing is a straightforward way for staff to meet briefly, immediately after a program session has been conducted, and answer two questions:

    1. What went well in the session?
    2. What didn’t go so well, and how can we improve it next time?
  • Why they’re important: Program staff are often in an excellent position to comment on how well a program is being implemented and may have ideas for improvement.

Program adherence monitoring

  • What it is: Systematically tracking how closely each intervention activity was implemented as laid out in your final work plan. This includes how much of a program was administered (“dose”) and whether it was administered according to the program developer’s intentions.

  • How to do it: If you are using a packaged program, check with those responsible for disseminating the program to see whether they have an adherence guide, and make sure to obtain the scoring criteria. If an adherence instrument does not come with the program materials or you have developed your own program, look at adherence guides from other programs and create your own.

  • Why it is important: The closer you can come to implementing a program as it was intended, the better chance you have of achieving your goals and outcomes.

Source: Adapted from Hannah, McCarthy, and Chinman, 2011.

Tools Used in This Step

Process Evaluation Planner

  1. Make as many copies of the tool as necessary for you and your co-workers to complete this task.
  2. Assign a person responsible for collecting the instruments, forms, and questionnaires containing all the process information you will gather in the course of the program. The person who takes on this role needs to be especially organized and reliable.
  3. Your Logic Model Tool (GTO Step 2), Work Plan Tool (GTO Step 6), and manual or curriculum for your selected program will help you complete the tools in this step.
  4. Consider each process question listed (and any you wish to add), and note your measures and other considerations for data needed in the column labeled “Considerations.” For example, for Question 1, you might enter age and gender if these are the characteristics you are interested in.
  5. Enter the evaluation methods and data collection tools that you will use to address the following process evaluation questions:
    • Program participant characteristics, such as age and gender, can be gathered in the pre-survey or via attendance or sign-in sheets
    • Utilization by individual program participants can be calculated from your attendance rosters. Rosters should be designed to capture the percentage of time that participants attend each session or module (100 percent, 75 percent, 25 percent, etc.). Then you can also sum how many of the sessions each registered participant attended.
    • Level of delivery achieved by program may be determined by outside observers or program staff completing monitoring logs, checklists of required activities and core elements, or simple notes about the actual delivery, compared with the agenda or program guide.
    • Participant satisfaction may be determined through participant focus group discussions, general observations, or a post-program evaluation survey that asks open-ended questions. Some programs have their own satisfaction surveys you can adapt. See the appendix for a copy of the ROAD-MAP participant satisfaction and demographics survey.
    • Staff perception can be determined by asking staff questions about what they believed to be the successes, challenges, and opportunities related to the program’s implementation.
    • Work plan adherence can be determined by reviewing the initial Step 6 Work Plan to see how closely it was followed. This could include tracking the timeliness of carrying out various tasks or the extent to which you served the number or type of expected participants.
  6. Enter the anticipated schedule for data collection and analysis (i.e., when the data will be collected and the frequency of collection) and when the results will be available.
  7. Enter the person(s) responsible for gathering and analyzing the data. For example, the program facilitator may take attendance, the facilitator’s supervisor may monitor adherence, and the supervisor may ask staff about their perceptions of the program.

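The roster-based utilization tracking described in step 5 can be sketched in a few lines. The roster structure and values below are hypothetical: each participant has a fraction recorded for each session (100 percent, 75 percent, 25 percent, etc., expressed as decimals).

```python
# Hypothetical roster: for each registered participant, the fraction of
# each session they attended (1.0 = full session, 0.5 = half, 0 = absent).
roster = {
    "participant_A": [1.0, 1.0, 0.5],
    "participant_B": [1.0, 0.0, 1.0],
    "participant_C": [0.25, 1.0, 1.0],
}

# Number of sessions attended (any nonzero presence counts) per participant.
sessions_attended = {
    name: sum(1 for frac in fractions if frac > 0)
    for name, fractions in roster.items()
}

# Average exposure ("dose") per participant across all sessions.
avg_exposure = {
    name: sum(fractions) / len(fractions)
    for name, fractions in roster.items()
}

print(sessions_attended)
print(avg_exposure)
```

Designing the sign-in sheet so that partial attendance is captured up front makes both of these summaries trivial to compute later.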

  • Completed by: Project team/evaluator
  • Date: March
  • Program: ROAD-MAP
Process Evaluation Planner Tool (filled out for demonstration purposes)
For each process evaluation question below, the tool records the considerations, the evaluation methods and data collection tools, the anticipated schedule for data collection and analysis, and the person(s) responsible.

1. What were the characteristics of program participants compared with those of the target population?
  • Considerations: Age, gender, race/ethnicity
  • Methods and tools: The participants in the workshops will be asked to complete the ROAD-MAP Participant Survey. The survey includes demographic questions.
  • Schedule: Collection at each workshop; analysis after the last scheduled workshop
  • Person(s) responsible: Volunteer program staff

2. What was the participants’ program utilization compared with the program plan?
  • Considerations: How close attendance at ROAD-MAP workshops was to the target of 20 participants per workshop
  • Methods and tools: Sign-in sheets
  • Schedule: Collection at the start of each workshop; analysis after the last scheduled workshop
  • Person(s) responsible: Volunteer program staff collect data, and evaluator enters and analyzes data

3. What level of delivery did the program achieve, and did all planned components get delivered?
  • Considerations: Adherence to curriculum content; quality of communication. Observer should provide feedback to facilitators after their first workshop.
  • Methods and tools: The ROAD-MAP fidelity assessment tool will be used. An observer of the program will sit in on the workshop and complete this tool.
  • Schedule: Collection during each workshop; analysis after the last scheduled workshop

4. How satisfied were the participants?
  • Considerations: General satisfaction with program; satisfaction with information delivered (Is it interesting, informative, easy to understand, helpful?); satisfaction with pamphlets and other handouts (Are they helpful?)
  • Methods and tools: Participants will be asked to complete the 1-page post-workshop ROAD-MAP Participant Survey. The survey includes questions about satisfaction with the program.
  • Schedule: Collection at the conclusion of each workshop; analysis after the last scheduled workshop
  • Person(s) responsible: Volunteer program staff administer; evaluator enters and analyzes data

5. What was the staff’s (including volunteers’) perception of the program?
  • Considerations: Program implementation staff; partnering site staff
  • Methods and tools: Interviews with volunteers; debriefing meeting of all program staff
  • Schedule: Interviews within 1 week of each facilitator’s second workshop; meeting within 1 week of the last workshop; analysis after the last scheduled workshop

6. How closely did the program follow the GTO Step 6 Work Plan?
  • Considerations: Administrative tasks; program policy procedures; recruitment and retention tasks; implementation planning tasks; evaluation planning tasks
  • Methods and tools: We will examine the GTO Step 6 Work Plan Tool to see whether the person in charge of each task accomplished it as planned and by the target due date.
  • Schedule: At debrief team meeting
  • Person(s) responsible: Evaluator

7. Other: N/A

Process Evaluation Summary

  1. Ask the person(s) you identified to collect and analyze the data in the Process Evaluation Planner Tool to provide the results for which they were responsible.
  2. Enter the results that answer the evaluation questions in the Process Evaluation Summary Tool. Be sure that the questions in the Process Evaluation Summary Tool are the ones you included in your Process Evaluation Planner Tool.
    • Program participant characteristics describe the demographics of the program participants from these data (e.g., number of participants, male or female, ethnicity, and age).
    • Utilization of program can be calculated from attendance information. You could calculate the percentage of participants who have perfect attendance (number with perfect attendance divided by all who participated), the overall attendance rate for the whole group (total number of sessions attended by all divided by total number of sessions the group could have attended), or the attendance for each session of the program (number of participants who attended the session divided by total number of participants enrolled in the program). If the program consists of only one session, calculate attendance as a percentage of the total anticipated or targeted.
    • Level of delivery achieved by program will depend on the measure you use. For example, you might calculate the percentage of activities fully completed, partially completed, and not at all completed for each session or component.
    • Participant satisfaction and staff perception of the process will also depend on the measure you are using. If using a measure that asks open-ended questions, look across the answers for general themes. If using a survey with defined answer choices, calculate averages or frequencies of the questions.
    • Work plan adherence describes the percentage of activities that you skipped or failed to deliver based on your work plan.
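The attendance rates described above are simple ratios. The sketch below computes all three from hypothetical data (four enrolled participants, four sessions); the figures are made up for illustration.

```python
# Hypothetical attendance: sessions each enrolled participant attended,
# out of total_sessions offered.
total_sessions = 4
attended = {"A": 4, "B": 3, "C": 4, "D": 2}
enrolled = len(attended)

# Percentage of participants with perfect attendance.
perfect = sum(1 for n in attended.values() if n == total_sessions)
pct_perfect = 100 * perfect / enrolled

# Overall attendance rate for the whole group: total sessions attended
# divided by total sessions the group could have attended.
overall_rate = 100 * sum(attended.values()) / (enrolled * total_sessions)

# Attendance for a single session: head count divided by enrollment.
session_1_present = 3  # hypothetical head count for session 1
session_1_rate = 100 * session_1_present / enrolled

print(pct_perfect, overall_rate, session_1_rate)
```

The same division pattern works for survey frequencies (e.g., the share of respondents choosing each answer option) when summarizing satisfaction data.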


  • Completed by: Project team/evaluator
  • Date: August
  • Program: ROAD-MAP
Process Evaluation Summary Tool (filled out for demonstration purposes)
Each process evaluation question below is followed by the process evaluation data and results.

1. What were the characteristics of program participants compared with those of the target population?
  • The target age range (65+) was 100% achieved.
  • More than 70% of participants were female in all classes, which was higher than the target 50/50 split but is representative of the community population.
  • We targeted certain underrepresented minorities, such as African Americans and Koreans, but the majority of respondents were non-Hispanic white or Latino across sites.

2. What was the participants’ program utilization compared with the program plan?
  • Our target of 8 workshops was achieved. While there was variability between class sizes, the average per class (16 people) was close to our target of 20 participants per class.

3. What level of delivery did the program achieve, and did all planned components get delivered?
  • In all 8 workshops, the trainers met or exceeded expectations in delivering the majority of the program content, methods, and implementation using the teaching protocol.
  • In 2 of the classes, some of the content modules were skipped due to time constraints.

4. How satisfied were the participants?
  • More than 90% of participants rated the information “very” informative and interesting, worth their time, and easy to understand.
  • 96% of participants rated the provided materials as “very” helpful.
  • 94% of participants indicated that they “very much” thought the information would help them make the right decisions.
  • 95% of participants indicated that they definitely would recommend the program’s presentation to a loved one or friend.

5. What was the staff’s (including volunteers’) perception of the program?
  • Peer volunteer workshop facilitators thought the program went well but that they could benefit from more training.
  • Some peer volunteer workshop facilitators also wanted more time in the session to teach materials.
  • The team was generally satisfied with the program but thought it could be provided in more languages and with greater organization from the presenters.

6. How closely did the program follow the GTO Step 6 Work Plan?
  • Team members completed the tasks they were responsible for, but there were several delays in due dates, such as for completing an MOU with the site and designing evaluation materials.

7. Other

8. Other

Step Checklist

When you finish working on this step, you should have:

  • Completed the Step 7 tools
  • Developed a clear process evaluation plan prior to launching your program, including a plan for
    • Tracking the number of participants and their attendance
    • Monitoring your program adherence
  • Determined how well you followed your work plan
  • Analyzed your process evaluation data after running your program
  • Developed a plan for making mid-course corrections if necessary

Before Moving On

Once you’ve finished your process evaluation plan, you are ready to move on to Step 8, in which you’ll plan your outcome evaluation to examine whether you are achieving the changes you seek among individuals receiving your program.

Up Next:

Step 08. Outcome Evaluation

This step helps with planning an outcome evaluation and using the results from it. An outcome evaluation reveals how well you met the goals and desired outcomes you set for the program in Step 2.
