Evaluate the Implementation Process

Before describing approaches to planning your process evaluation, we first provide an overview of some of the different types of evaluation that may be most relevant to your project. The term "evaluation" refers to the systematic analysis of data and information related to your work to help you achieve better outcomes for your program and families. Evaluation is a central tool of effective program planning, implementation, and ongoing management, and GTO uses evaluation—mostly in Steps 7 and 8 but also throughout each of the 10 steps—as a way to help you achieve the outcomes you seek.

Types of evaluations

The unit or level of evaluation can vary, and in the case of your program, you are probably interested in evaluation at several levels. The first is population-level evaluation, which refers to analysis of the entire community or other geographic units that you serve. Your needs assessment in Step 1 is an example of a population-level evaluation. The second level to consider is program-level evaluation, in which you analyze the performance of your program, addressing such issues as whether you are in fact serving your target population and whether you are implementing the home visiting model you selected with fidelity. A third level is outcome- or individual-level evaluation, in which you analyze information on the families you serve. This might take the form of examining how long families stay in your program or whether they improve in ways consistent with the goals of the program.

You can find dozens of different types of evaluations described in numerous textbooks and other references. For more detailed information about practical program evaluation approaches, see Chen (2005). While it is not necessary to engage in every type of evaluation for your program, you will want to conduct some evaluation at each of the three levels just outlined. You have already conducted a population-level needs assessment in Step 1, and you will revisit population-level evaluation in Step 8.

In this step, we describe a specific kind of program-level evaluation called process evaluation. In Step 8, we also discuss how to use individual-level outcome evaluation—or the program's impact on individuals—to inform your work. The process and individual-level evaluations together tell the story of your program, and, therefore, GTO-HV recommends doing both. We describe process evaluation in this chapter, Step 7, and individual-level evaluation in Step 8.

Overview of process evaluation

At this point, you have selected your home visiting program and planned how to roll it out. Before you actually implement your program, it will be important to spend some time planning what is called a process evaluation, in which you analyze information at the program level. A process evaluation monitors and documents the program's implementation processes throughout the life of the program.

Process evaluations are designed to give you an idea of how well the program plans were put into action. This can include assessing whether the amount and frequency of services you planned have actually been delivered (i.e., dosage), whether the people who participated in the program have been satisfied with their experience, and the degree to which the program has been carried out with fidelity, or faithfulness to the program developers' design. Assessing dosage, satisfaction, and fidelity can quickly point you to places where mid-course corrections will help improve the program's operation.

In the next step (Step 8), you'll work on planning an outcome or individual-level evaluation, which will help you to document the outcomes (for example, reductions in child maltreatment). We encourage you to read and work through the planning portions of both Steps 7 and 8 before launching your home visiting program.

Before implementation: Planning to evaluate the process

A major part of planning your process evaluation is identifying which program elements (or "components" in GTO-HV language) you need to monitor. In addition, a process evaluation plan should be able to answer the following questions:

  • What process evaluation questions should we ask?
  • What tools should we use?
  • What should our data-gathering schedule be?
  • Who is responsible for gathering the information?

Once the process evaluation is under way, its results can answer questions such as the following:

  • What are the characteristics of the families who received the program?
  • How many families participated?
  • How often did families participate?
  • How did families hear about the program?
  • Were the program activities implemented with fidelity?
  • How satisfied were the participants with the activities?
  • What do home visitors and other staff think of the program delivery?

For example, if you planned a program that involves three home visiting sessions to teach parenting and other skills, your process evaluation activities would probably include all of the following:

  • Collecting participant characteristics
  • Tracking the number of home visits received by each family to see whether most families received all three sessions
  • Collecting information from the home visitors to make sure that they covered all the session material during the home visits
  • Conducting satisfaction surveys with families during and/or after the program to see what they thought of the program
  • Surveying home visitors to see whether they feel that the participants were engaged
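For instance, the dosage-tracking activity above (checking whether most families received all three sessions) can be sketched in a few lines of code. This is a minimal illustration, not part of GTO-HV; the visit log, family IDs, and three-session target are hypothetical:

```python
# Hypothetical visit log: one entry per completed home visit.
visit_log = [
    {"family_id": "F001", "visit_date": "2024-03-01"},
    {"family_id": "F001", "visit_date": "2024-03-15"},
    {"family_id": "F001", "visit_date": "2024-03-29"},
    {"family_id": "F002", "visit_date": "2024-03-02"},
]

PLANNED_SESSIONS = 3  # the program's intended dose (hypothetical)

# Count completed visits per family.
visits = {}
for entry in visit_log:
    visits[entry["family_id"]] = visits.get(entry["family_id"], 0) + 1

# Flag families who have not yet received the full dose, and compute
# the share of families who completed all planned sessions.
incomplete = {fam: n for fam, n in visits.items() if n < PLANNED_SESSIONS}
completion_rate = sum(1 for n in visits.values() if n >= PLANNED_SESSIONS) / len(visits)

print(incomplete)       # {'F002': 1}
print(completion_rate)  # 0.5
```

The same tally could be kept in a spreadsheet; the point is simply that dosage monitoring reduces to counting visits per family against the planned number.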

Elements of data collection

There are many approaches to conducting a process evaluation. The program model you select may require that you collect additional data elements, but we highly recommend that you collect the following types of process evaluation data:

  • Participation—Keep track of each participant over time by creating a roster and listing the dates of each session for each participant. You may also want to keep track of some characteristics of your participants, such as age, gender, marital status, and referral source. These data should be kept confidential.
  • Fidelity monitoring—Check to see that the program is being delivered as intended. This includes determining whether home visitors received the appropriate training and technical support, whether participants are receiving the appropriate number of sessions specified by the program, and whether home visitors are touching on all of the required topics during their visits. If you are using an evidence-based program, fidelity measures have likely already been created. Check with the program representative—there may be certain data elements that you are required to collect in order to be allowed to implement the program.
  • Work plan monitoring—You should be following your work plan as you implement your program. You can use the work plan you created in Step 6 to track the completion of your activities.

If you have the resources and time, we suggest you also collect the following information on your program's implementation:

  • Participant satisfaction—Participants' perceptions of your program can be collected using brief surveys.
  • Feedback through focus groups—Focus groups are a good way to solicit feedback on program satisfaction and gather suggestions for improvement. Focus group participants might include participating families, staff of referring organizations, the home visitors, or other individuals.
  • Staff perceptions—Solicit ideas from your home visitors, supervisors, and other staff on perceived successes and challenges in implementing your program. This will help you to identify which factors facilitated the program's implementation and which factors may have emerged as barriers. The information can be tracked over time to determine whether the barriers identified have been adequately addressed.

Note that technology such as portable tablets can be very helpful in streamlining data collection and survey responses, especially if staff are stretched thin.

We have assembled process evaluation data-gathering tips to help you think through ways to gather the types of process evaluation information that we noted above.

Tip 7-1. Types of Process Evaluation Information

You are likely to use a variety of methods for collecting your process evaluation data. Here's some additional information about a few key ones we've mentioned in this chapter.

Participant data

  • What it is: Specific information about participants, including variables such as age, sex, race/ethnicity, education level, household income, family size, referral source, etc.
  • How to gather it: You have probably already gathered much of this kind of information in the course of planning for, establishing, or running your program. Often, these types of questions are asked as part of an intake to a service or an outcome assessment survey. Information can be gathered during an interview with each participant as well.
  • Why it is important: So you'll know whether your program is serving the targeted population and whether program outreach efforts are working to engage the participants you planned to reach.

Focus groups

  • What they are: Focus groups are facilitator-led discussions on a specific topic with a group of 6–12 participants brought together to share their opinions on that topic.
  • How to manage them: Generally, focus groups are led by 1–2 facilitators who ask the group a limited number of questions. Think of the structure of a focus group like a funnel: each major topic should start with broad questions, then get more specific. Be sure to audio- or video-record the focus group or have a designated note taker. The data can be analyzed by looking for the themes that appear in the transcripts or notes and the range of views expressed within each theme. If you want more information on focus groups, a good resource to reference is:
  • Morgan, DL, & Krueger, RA. (1997). The Focus Group Kit. Thousand Oaks, CA: SAGE Publications. Description available at http://www.sagepub.com
  • Why they're important: Focus groups are an excellent way to learn what people thought of your program and to gather suggestions for improving it. Focus groups often yield "qualitative" (i.e., text) data, as opposed to surveys, which usually yield "quantitative" (i.e., numerical) data. Listening as people share and compare their different points of view provides a wealth of information: not just about what they think, but why they think the way they do. For more information about qualitative data collection, refer back to the "Collecting your own data" section in Step 1.

Satisfaction surveys

  • What they are: Information about whether the participants enjoyed the program, whether they got something out of it, and whether the program met their needs or expectations.
  • How to do them: The easiest way is to administer brief surveys to participants as part of the program, at the end of each session or activity. This is better than waiting until the end of the entire program, because participants sometimes forget details from earlier sessions. However, the surveys should be administered so that respondents feel confident that their responses will be kept confidential (for example, the service providers themselves should not administer the surveys or collect the responses). Surveys can also be handed out at the end of a program with self-addressed, stamped envelopes so that participants can complete the survey and return it later. This method, however, adds expense (the cost of postage), and often fewer surveys are returned. If you are using a packaged program, the program developers may require that you use a certain questionnaire with the program participants.
  • Why they're important: So you'll know whether the participants feel good about the program; satisfaction surveys can also help you identify areas to improve participant satisfaction, which would be likely to improve retention in the program.
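As an illustration of turning brief survey responses into something actionable, here is a minimal sketch of summarizing per-session satisfaction ratings. The 1–5 rating scale, the session names, and the 3.5 review threshold are all hypothetical choices, not requirements of any program model:

```python
from statistics import mean

# Hypothetical responses on a 1-5 satisfaction scale, keyed by session.
responses = {
    "session_1": [5, 4, 4, 3, 5],
    "session_2": [4, 4, 2, 3],
    "session_3": [5, 5, 4],
}

REVIEW_THRESHOLD = 3.5  # hypothetical cutoff for flagging a session

# Average satisfaction per session shows where a mid-course
# correction may be needed; session_2 falls below the threshold.
for session, scores in responses.items():
    avg = mean(scores)
    flag = "  <- review" if avg < REVIEW_THRESHOLD else ""
    print(f"{session}: n={len(scores)}, mean={avg:.2f}{flag}")
```

Per-session summaries like this are what make surveying at the end of each session (rather than only at the end of the program) pay off: a weak session surfaces while there is still time to adjust it.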

Staff perception data

  • What they are: Staff perceptions about what worked and didn't work during the implementation of a program. Also, you may want to collect staff perceptions about training and supervision quality.
  • How to gather them: There are three methods for gathering data on staff perspectives: focus groups, interviews, and program debriefing.

In addition to what we've already mentioned about focus groups, an interview can be a good way to get detailed information about program implementation. While interviews with staff involve a similar type of questioning as a focus group, you are talking with one person at a time.

A program debriefing is a straightforward way for staff to quickly meet immediately after a program session has been conducted and answer two questions:

  1. What went well in the session?
  2. What didn't go so well, and how can we improve it next time?
  • Why they're important: Program staff are often in an excellent position to comment on how well a program is being implemented and may have ideas for improvement.

Fidelity monitoring

  • What it is: Systematically tracking how closely each intervention activity was implemented as laid out in your final work plan. This includes how much of the program was administered ("dosage") and whether it was administered according to the program developer's intentions.
  • How to do it: If you are using a packaged program, check with those responsible for disseminating the program to see if they have a fidelity instrument, and make sure to obtain the scoring criteria. If a fidelity instrument does not come with the packaged program materials or you have developed your own program, look at fidelity tools from other programs and create your own.
  • Why it is important: The closer you can come to implementing a program as it was intended, the better chance you have of achieving your goals and outcomes.
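To illustrate what a homegrown fidelity check might look like, the sketch below compares the topics a home visitor reported covering against a required topic list. The topic names, visit labels, and scoring rule are hypothetical; an evidence-based program's own fidelity instrument and scoring criteria would take precedence:

```python
# Hypothetical required topics for each visit, per the program manual.
REQUIRED_TOPICS = {"safe_sleep", "feeding", "discipline", "home_safety"}

# Topics home visitors reported covering on two visits.
visit_reports = {
    "visit_A": {"safe_sleep", "feeding", "home_safety"},
    "visit_B": {"safe_sleep", "feeding", "discipline", "home_safety"},
}

# Simple fidelity score: share of required topics actually covered,
# plus a list of anything missed so supervisors can follow up.
for visit, covered in visit_reports.items():
    score = len(covered & REQUIRED_TOPICS) / len(REQUIRED_TOPICS)
    missing = REQUIRED_TOPICS - covered
    print(f"{visit}: fidelity={score:.0%}, missing={sorted(missing)}")
```

Even a coverage ratio this simple, tracked over time, shows whether visits are drifting away from the curriculum and which topics are most often skipped.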

Adapted from Hannah, McCarthy, & Chinman, 2011.

Develop a Process Evaluation Plan

The Process Evaluation Planning Tool in this section can help you organize your process evaluation plan. The tool has two worksheets: the Process Evaluation Planning Worksheet and the Plan for Making Mid-Course Corrections Worksheet.

Tool 7-1. Process Evaluation Planning

Instructions for using the Process Evaluation Planning Tool:

  1. Have your work plan and program materials (i.e., guide or manual, if available) in front of you, as well as Tip 7-1, to help you complete the planning tool.
  2. Starting with the first question on the Process Evaluation Planning Tool, fill in:
     • Which evaluation tools/methods you plan to use (e.g., surveys, focus groups, etc.)
     • Your anticipated schedule for completion (e.g., annually, quarterly, etc.)
     • The person or persons responsible for gathering the data for each question
     • The person or persons responsible for aggregating or analyzing the data.
  3. Repeat this process for each question.
  4. You may want to add questions of your own. This will depend on the program and the resources at your disposal.
  5. Once you have finalized your process evaluation plan, take some time to think through when and how the results will be analyzed and used for mid-course corrections. Fill in the second worksheet in the tool, the Plan for Making Mid-Course Corrections Worksheet, with your plan for making mid-course corrections. Working mid-course corrections into your plan now will help your coalition and staff to realize that implementation and evaluation are ongoing processes, and that there is a plan to remedy any issues early. This plan will be used as part of your Continuous Quality Improvement process, which is discussed in Step 9.

Worksheet 1: Process Evaluation Planning

Worksheet 2: Planning for Mid-Course Corrections

Townville Example 7-1. Townville's Process Evaluation Plan

Townville's community coalition worked closely with the national office for the XYZ model to establish its process evaluation plan. The XYZ program had already developed a fidelity instrument, and the national office had suggestions for how fidelity and other data should be collected. Most importantly, many of the data elements would be collected by home visitors, who would use tablets to enter them into a database housed at the lead agency.

The worksheet that coalition members filled out as part of their planning appears below.

Worksheet 1: Process Evaluation Planning

Process evaluation question: Did the program follow the basic plan for service delivery?
  • Tool/method: Home visiting program fidelity monitoring instrument, which uses data entered by home visitors into the database system
  • Schedule of completion: Ongoing
  • Person responsible: Home visitors; administrative staff monitor that it is being done

Process evaluation question: What are the program participants' characteristics?
  • Tool/method: Home visitors will collect and enter participant characteristic information on an enrollment form at the first visit. Some information will also come from referring hospitals and will be integrated into data collection.
  • Schedule of completion: Ongoing
  • Person responsible: Home visitors; hospital and CPS partners; administrative staff monitor that it is being done

Process evaluation question: Are the participants satisfied with the program?
  • Tool/method: Satisfaction survey midway through and after completion of the program
  • Schedule of completion: After all 3 home visits are received
  • Person responsible: Administrative staff will send the survey to participants; external consultant will analyze the data

Process evaluation question: What is the staff's perception of the program?
  • Tool/method: Internal Internet survey of administrative staff and home visitors
  • Schedule of completion: Every 6 months, starting in September
  • Person responsible: Lead agency Executive Director and an external consultant will analyze the data

Process evaluation question: How much of the program did each participant receive (e.g., number of home visits)?
  • Tool/method: Home visitors will enter data into the database every time they visit a family.
  • Schedule of completion: Ongoing
  • Person responsible: Home visitors; administrative staff monitor that it is being done; external consultant will analyze the data

Process evaluation questions: What were the program components' levels of quality (e.g., did home visitors follow the curriculum in each visit)? Was the program implemented with fidelity?
  • Tool/method: Home visiting program fidelity monitoring instrument, which uses data entered by home visitors into the database system
  • Schedule of completion: Ongoing
  • Person responsible: Home visitors; administrative staff monitor that it is being done; external consultant will analyze the data

Worksheet 2: Planning for Mid-Course Corrections

Who will analyze process evaluation data?

External consultant with expertise in this program will analyze data entered into the database and the survey/focus group data.

How frequently will they analyze the data?

Every quarter

When will data be presented to broader team?

Each quarter, the data will be presented to the Executive Director and subsequently to coalition members at a coalition meeting.

Who will decide whether mid-course corrections are needed?

Executive Director and the community coalition will decide, working with program model contacts.

What steps will be taken if corrections are needed?

Community coalition will discuss any deviations from expectations and make an action plan for next steps.

Ready to implement: Conduct a process evaluation and analyze the results

Once you have developed your plan, you'll be ready to conduct your process evaluation. This will happen during your program implementation. For example, you will be gathering information about the target population's characteristics, the amount and fidelity of the program delivery, and participants' and staff members' perceptions of the program. Once you have the process evaluation data, you should be ready to make mid-course corrections, if necessary, using your plan for mid-course corrections.

We recommend—again—that you take time to read through Step 8 (and the rest of the manual) before you implement your program. This will help you consider whether you need to collect information from or about program participants before participants receive the home visiting program.

Checklist 7-1. Completion of Step 7

When you finish working on this step, you should have done the following:

  • Developed a clear process evaluation plan prior to launching your program, including a plan for:
      • Tracking the number of participants and their attendance.
      • Monitoring your program fidelity.
      • Determining the quality of your activities.
  • Developed a plan for making mid-course corrections if necessary.

Before moving on to Step 8

Once you've finished your process evaluation plan, you are ready to move on to Step 8, in which you'll plan your outcome evaluation to examine whether you are achieving the changes you seek for individuals and your community.