Sep 28, 2022
This series of three videos guides viewers through the process of using Getting To Outcomes® (GTO) to strengthen sexual assault prevention. The first video is an overview of the GTO process. The second video describes how to use GTO to plan your sexual assault prevention activities. The third video shows how to use GTO to evaluate and improve your sexual assault prevention activities.
Joie D. Acosta
Hello, and thank you for accessing this video, the first of three videos about using Getting To Outcomes to strengthen sexual assault prevention. I'm Joie Acosta with RAND, and I'll be with you throughout this session. Note: These slides are copyrighted training materials for GTO. Any reproduction or distribution is prohibited.
So, welcome. Our goal with this video is to give you an overview of the Getting To Outcomes process that you can use to create prevention plans for implementing and evaluating sexual assault prevention. Matt Chinman, one of the original developers of GTO and a senior researcher at RAND and the VA, will talk about GTO. After Matt, I'll come back and review the GTO steps with you. So, Matt, over to you.
Thanks, Joie. So, what is Getting To Outcomes, exactly? We often say that GTO is not rocket science. It's just a systematic, common-sense method to help create and implement high-quality plans. GTO was developed to help close the gap between the way programs are developed to run in ideal settings and the way they are actually run in the real world.
First, GTO is a ten-step approach or formula for planning, implementing, and evaluating plans in order to achieve success. Second, it is also a series of how-tos with written guides, tools, examples, training, and ongoing support. GTO has been proven to increase planning staff knowledge and skills and improve how they conduct the ten steps. So, it's sort of a road map for how you stand up and run prevention activities and get outcomes in the real world. GTO can help produce a high-quality prevention plan because it helps you think through all the constraints and contingencies upfront and get feedback through evaluation to continuously improve what you're doing.
The GTO formula, if you will, can be applied to any planning process or planning any kind of sexual assault prevention policy, program, practice, or what we call more broadly activities in the GTO guide and handbook. The first six steps prepare you to implement sexual assault prevention. Then, the prevention activities get delivered with the planning to support it. The last steps use results from evaluation to improve the ongoing or next round of your prevention activities.
Let me touch quickly on the whole series of steps. Step 1 helps identify and document the problems of sexual assault at your site and existing resources to address them. Step 2 is identifying goals, target audience, and desired outcomes to address the problem you have decided to tackle after doing Step 1.
Step 3 is identifying best practices for Getting To Outcomes. GTO has a heavy emphasis on really trying to find prevention activities that have some evidence base to them. And we know that evidence is not necessarily a black-and-white thing where it's either evidence-based or not. Rather, it's more of a continuum. But using prevention activities that are as evidence-based as possible can help you get the results you were looking for.
Often, a prevention activity needs to be modified to fit within your setting, so we have a step about how to do that, too. Then, we have a whole piece on readiness or capacities needed to implement prevention activities. We know readiness has a certain connotation in the military: military readiness. This is a slightly different kind of readiness. It's a readiness to take on prevention and conduct it as intended. So, that requires a whole bunch of capacity that we can build by using the Getting To Outcomes process.
And then the final step in planning, Step 6, is making a detailed work plan or a very concrete "who, what, where, when" kind of written plan, including a budget and process and outcome evaluation plan.
Then you start delivering the sexual assault prevention activity. For example, a program like Green Dot or Bringing in the Bystander gets delivered or a policy or practice gets changed. The second batch of GTO steps, called evaluate and improve, are all about tracking how it went and trying to make it better.
So, Step 7 is monitoring—sort of checking on or evaluating the process of implementing the prevention activity. We call this process evaluation. Outcome evaluation is Step 8, where you see what's changed in the target population for the prevention activity and whether you achieved the desired outcomes you set up in Step 2. Step 9 is how to take all of what you learned about the program from the evaluations and make it better going forward. We call that continuous quality improvement, or CQI.
Step 10 is about sustaining a prevention activity that's really working well, but might be discontinued because a key person left or the funding ran out. We want to try to think as early as possible about how to sustain effective prevention activities so installations can continue benefiting over the long term.
This figure shows a circular process. One step systematically leads to another, and then completed steps provide feedback that can change earlier planning and implementation. The steps do generally follow that order. However, it's also true that your earliest thinking—for example, about goal setting—might need to be modified after you complete later steps. So, it doesn't always flow in this perfect, circular way. For example, you might set a goal or a certain desired outcome, and then when you identify a prevention activity, you might say, "we really like this activity, but it's not going to get us quite the outcomes we wanted." So you might need to go back and change the desired outcome or look for a better activity. You can see how the steps are logically connected, and you can modify one step based on information from another step. It's all about making sure that the best prevention activity possible to address the priority problem is selected and delivered well. That planning logically links and aligns the problems, goals, best practices, and evaluation work, and it also helps you apply and improve your work and your odds of success at each step.
GTO provides the military with many tools and supports to prevent sexual assault. First, there is an operation guide. You can find this guide on the RAND website displayed in the video. The guide includes tools and instructions for using them. These are worksheets that prompt users to consider the issues involved, like which priority problems to address, how to move forward, and a method for recording decisions as you go through each step. Each chapter in the guide is a different GTO step that includes tips, checklists, and other resources in addition to the instructions and worksheets. All the blank tools are also available in a Microsoft Word file, so you don't have to literally write on paper. You can save drafts, share with a team member or work group, revise, and finalize. You may want to pause here and look through the guide.
There is also a more streamlined handbook available online. This green-line handbook walks through the steps of GTO more quickly for those looking to try GTO out. You may want to pause here and look through either the long guide or the more streamlined online handbook. If you're still wondering how to get started, there is also an onboarding roadmap that offers suggested ways to get started with GTO.
In addition to the tools, we have also trained and provided ongoing coaching to military personnel from across the services and military academies in the GTO process. This series of videos helps to introduce GTO to new personnel or provide a refresher for personnel who attended one of those trainings. Back over to you, Joie, for the example.
Joie D. Acosta
Thanks, Matt. Let's walk through a fictional example so you can see how each step of GTO works.
Colonel McHenry leads an Air Force wing. Over the past year, there have been three reports of sexual assault. Morale is low, and Colonel McHenry needs a solution to unify the wing.
How can Step 1 help? Step 1 can help Colonel McHenry choose what to tackle. What are the underlying problems or challenges contributing to her concerns about sexual assault among her airmen? What is already being done to address these problems? Can more be done? Who should be the focus for sexual assault prevention? And what, above all others, are the priority risk and protective factors Colonel McHenry should tackle with her wing's sexual assault prevention activities?
To help Colonel McHenry consider these questions, there are a number of tools and tips in Step 1 of the guide. There is a Data Catalog Tool to help Colonel McHenry look at different data sources—things like DEOCS reports, the latest WGRA, and other local data. There's also a Resource Assessment Tool. This tool will help Colonel McHenry consider not just what all the problems are, but what is currently being done to tackle them—if anything—so she builds on rather than duplicates those efforts, especially if they are having positive outcomes. And then there is a third tool for triaging the list of problems to set priorities. This last tool brings it all together and will help Colonel McHenry boil down the key problems she should focus her prevention efforts on. In the guide, there's a list of risk and protective factors for sexual assault, links to existing data resources, and tips for how to use this data for planning. This is a good place to pause the video and check out the Step 1 tools. By the way, examples of completed tools for each step are contained in the guide.
The second step can help Colonel McHenry identify her goals, target populations, and desired outcomes. Goals are broad statements of the impact or change that you want to achieve, such as a work climate with respect for all. Then ask, "what change is required to achieve the goal that we want? Are the outcomes that we want specific, measurable, achievable, realistic, and time-based?"
These terms come from an acronym—SMART—and there's a tool to help Colonel McHenry determine whether the outcomes reflect those criteria. SMART is just a way Colonel McHenry can make sure that the benchmarks she sets are established in the best way possible. Then there's the Logic Model Tool. In one eight-by-eleven sheet, Colonel McHenry can lay out her wing's entire prevention approach. This tool is really helpful, for example, as a summary for leadership, because once it's filled out, she can just hand them one page and they get a really good understanding of how she thought about setting up her priorities, what she intends to accomplish, what she's going to do to try to get the desired outcome, and how she will know what progress has been made. So, Step 2 is really important because the desired outcomes that Colonel McHenry sets up here become the benchmarks she'll go after in Step 8, the outcome evaluation step.
Colonel McHenry's goal is to reduce incidents of sexual assault so that the wing is mission ready and any escalation of these behaviors is avoided. Based on Step 1, Colonel McHenry identifies that most of the incidents of sexual assault occurred after service members were at social events where airmen were drinking heavily together. She decides that one way to lower the risk of sexual assault is to help service members identify when they are in higher-risk situations, like a social setting with heavy drinking, and then intervene with one another to lower the risk. So she decides that over the next year she will work toward a 20% increase in airmen's confidence to intervene in a risky situation and ensure that 100% of airmen report they would intervene if they saw another airman in a risky situation.
She knows the military has tried many ways to encourage service members to intervene in risky situations. But which ones work? Step 3 involves finding existing programs or best practices worth copying. This step can help Colonel McHenry to determine which evidence-based activities can get her to her goals and desired outcomes. There may only be one, or there may be multiple candidate activities to consider.
To answer that question, Colonel McHenry can use the Evidence Synthesis Tool. It prompts users to look at different sources to find evidence for a prevention activity and then make sure that the activity overlaps with their needs. So, that's a tool that can help Colonel McHenry begin to narrow down her choices. The GTO guide contains a list of evidence-based activities to get started. Over our many years working with all different kinds of organizations, we find that this is one of the most challenging steps and that folks often make up activities on their own rather than choose an evidence-based activity. When you do that, you don't know what you're going to get. But when you choose evidence-based activities, you know that they have been tested, so you can be more confident that you'll get the outcomes you want.
In Step 4, Colonel McHenry will consider how she would need to modify the activities or best practices she's considering to fit her airmen's needs. So, in Step 3, she might identify multiple activities as options. Steps 4 and 5 help her narrow the field. And by the time she gets to Step 6, the official planning step, she will have focused in on what she wants to do. But here in Step 4, Colonel McHenry would ask, "how well does this candidate activity fit with my airmen, my military community, and my installation's mission?" Often, an effective activity isn't specific to military installations. They're developed somewhere else, such as a university setting. So, she will probably have to make adaptations so they fit her airmen and her wing's OPSTEMPO schedule and other aspects of the military community.
The Fit Assessment Tool will guide Colonel McHenry through that process. One issue is, if she changes an evidence-based activity like Bringing in the Bystander significantly to make it fit—for example, reducing the content from four and a half hours to two—she runs the risk of losing the benefits that made her choose it in the first place. So, it's really important to make modifications in a very thoughtful way without undercutting what makes something an effective prevention activity. The Fit Assessment Tool and the Culturally Appropriate Checklist Tool are ways to help Colonel McHenry do that.
Colonel McHenry needs the right prevention activity for the problem. After reviewing her options in Step 3 and considering fit in Step 4, she decides that incorporating a promising program called Bringing in the Bystander is the best fit for the wing. Bringing in the Bystander is designed to teach safe prosocial methods for bystander mediation before, during, and after situations in which risk for sexual assault may be prevalent. She decides to adapt Bringing in the Bystander to her wing by changing some of the language used from "students" to "airmen" and the visuals from civilians to airmen, but does not modify the session number or length.
In Step 5, Colonel McHenry will assess whether her wing has the ability to implement the activities she's considering. Sometimes people might have the knowledge and skills but just aren't excited about the activity or don't believe in it. That could kill the activity, because people are not committed to do it. Sometimes people are really motivated to do it, but they don't have everything they need to get it done, like certain knowledge, skills, or specific technologies. So, Colonel McHenry will have to ask whether her wing has the motivation and incentives to implement Bringing in the Bystander and whether they have the capacity to implement it.
The Capacity Assessment Tool lists many things that Colonel McHenry would need for Bringing in the Bystander. That tool will prompt her to say, "okay, if we don't have what we need, how are we going to get it?" The idea is to be completely ready to take on an activity before it starts to give the activity the greatest chance of success. Colonel McHenry may wish to add other ingredients or capacities that she needs in order to be ready, and some shown on this tool may not apply in her situation. But it's a start.
Step 6 is where Colonel McHenry will do the nuts and bolts planning at the detailed level for everything needed to implement her activity—things such as staffing, leadership support, space, recruiting and marketing, scheduling, budget, and evaluation.
The Work Plan Tool prompts her to figure out things like who's going to do which tasks and by what deadline. And then she can start to plan how she will evaluate the effectiveness of her activities, processes, and outcomes. She can't evaluate her activity until it's been put into effect, but she will want to make sure she knows how she is going to do so even before the activity starts. Evaluations can be challenging, so planning ahead is important.
Colonel McHenry's team develops a detailed implementation and evaluation plan so no detail is overlooked and then begins implementing and evaluating Bringing in the Bystander.
Now that Bringing in the Bystander is underway, Step 7 is a process evaluation step. Or, in other words, how well did implementation go? The questions Colonel McHenry and her airmen answer as part of doing the process evaluation are: Who participated? Was it the group you targeted? How often did they participate? Did they find it helpful? What activities were actually implemented versus what was planned? Did you do everything that you had planned to do? Were there mid-course corrections? What was done well? How could it be improved?
A number of tools can help Colonel McHenry answer those questions, including the Process Evaluation Planner Tool and others you've completed in Steps 1–6. Then, in Step 7, we have a Process Evaluation Results Tool where you put all the results from your process evaluation. It makes a nice summary of what happened with implementation of prevention activities. You can pause the video here if you like and take a look at the process evaluation tool for planning in Step 6 and for recording results in Step 7.
Why is process evaluation important? Because good implementation leads to good outcomes. If Colonel McHenry and her airmen don't have good implementation, then it doesn't matter how effective her activity is. She's not going to get the outcomes that she wants. And that's why it's really important to evaluate the implementation or the process of delivering activities in addition to outcomes.
Following their process evaluation plan, Colonel McHenry's airmen monitor how well the plan was followed and how well the Bringing in the Bystander sessions were carried out. Step 8 is outcome evaluation—whether the activity was successful in achieving the change intended. So the question that Colonel McHenry will ask is whether Bringing in the Bystander achieved its desired outcomes, the specific objectives that she set for this activity back in Step 2.
To answer this question, Colonel McHenry will use the completed tools from Steps 1–7. She has the Outcome Evaluation Planner Tool from Step 6 and the Process Evaluation Results Tool from Step 7. The Step 8 Outcome Evaluation Results Tool will be the repository for all the information or results obtained from outcome evaluation. You can see this tool in Step 8 of the guide. Depending on the kind of prevention activity implemented, you'll typically assess service members' knowledge, skills, and capacities before they participate in an activity. Then, assess those things at the end and compare the results to see if they changed in any way. You probably will have set up certain benchmarks for each outcome with a measure that was taken at pre- and at post-implementation. And so those data would be your outcome evaluation results.
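The pre/post comparison at the heart of Step 8 can be sketched in a few lines of code. The sketch below is purely illustrative: the rating scale, scores, and benchmark are hypothetical stand-ins for whatever measures and desired outcomes a team sets up in Step 2, not data from the GTO guide.

```python
# Hypothetical sketch of the Step 8 pre/post comparison.
# The survey items, 1-5 rating scale, scores, and 20% benchmark below
# are invented for illustration only.

def percent_change(pre_scores, post_scores):
    """Percent change in the mean score from pre- to post-implementation."""
    pre_mean = sum(pre_scores) / len(pre_scores)
    post_mean = sum(post_scores) / len(post_scores)
    return (post_mean - pre_mean) / pre_mean * 100

# Example: confidence-to-intervene ratings from the same airmen
# before and after the prevention activity.
pre = [2.0, 3.0, 2.5, 3.5, 2.0]
post = [2.5, 3.5, 3.0, 4.0, 2.5]

change = percent_change(pre, post)
benchmark = 20.0  # desired outcome set in Step 2, e.g., a 20% increase

print(f"Change in mean confidence: {change:+.1f}%")
print("Benchmark met" if change >= benchmark else "Benchmark not met")
```

With these invented numbers, the mean rises from 2.6 to 3.1, a change of about 19%, which falls just short of the 20% benchmark. That is exactly the kind of result that would feed into the continuous quality improvement work in Step 9.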
Colonel McHenry monitors the data carefully. But across her airmen, there was no significant increase in confidence to intervene in a risky situation. Why didn't the Bringing in the Bystander sessions work?
Step 9 involves continuous quality improvement strategies. You've probably heard of quality improvement before. It started in manufacturing and has been adopted by health and social services. The idea is that we're always trying to get better by constantly looking for gaps and trying to plug them. The question to ask is how these strategies will be used by Colonel McHenry to learn about and improve Bringing in the Bystander.
So, we have a CQI—or continuous quality improvement—Review Tool that Colonel McHenry can use to go back to all the previous steps and revisit what happened. Did her airmen have the best possible plans? Were her goals realistic? It then prompts her to look at evaluation data to see what the data is telling her. Did her airmen have good implementation? How much did her outcomes change? And did they move in the direction she wanted them to? She will need the Step 7 and 8 tools for this process. Then, the tool prompts her to make an improvement plan for the next time she runs Bringing in the Bystander. This is a really, really critical step. If the prevention activity did not get good outcomes, ask whether it had no evidence base to begin with or was highly modified from the original, because even if it was well implemented, it may mean that a different activity is needed to achieve the change she was looking for. On the other hand, if it is a solid prevention activity, the improvement work needed may be to improve things like the process of recruiting and retention of the participants, or training for staff, or reach of the activity across the installation. Evaluation will be needed to see what difference the improvement activities make for your outcomes.
Colonel McHenry reviews the completed CQI Tool to make a plan about whether to continue implementing Bringing in the Bystander. In reviewing the CQI Tool, Colonel McHenry learns that one of the three Bringing in the Bystander sessions was not completed because they could not find an open training space to deliver the session at the scheduled time. Colonel McHenry decides to give Bringing in the Bystander another chance, and her team tinkers with the schedule of their training space to ensure all three of the sessions can be delivered the next time around.
Continuous quality improvement does the trick. Airmen in the wing showed a 23% increase in confidence to intervene in a risky situation after participating in Bringing in the Bystander. The team will evaluate the training sessions again in 12 months, when another round of Bringing in the Bystander is planned.
Finally, Step 10 is about sustainability. So, if the program is successful, how will Colonel McHenry keep it going? How will she keep leadership supporting it? Who's going to run it? And who is its champion for the future if Colonel McHenry leaves?
She can answer these questions with the Sustainability Review Tool, which lays out a series of tasks that have been proven to further sustainability—things like writing a prevention activity into people's job descriptions, designating someone to run the prevention activity, and having a defined budget line item that funds the activity. So, it prompts Colonel McHenry to think about those tasks and make a plan for sustaining Bringing in the Bystander.
That's the final step in the GTO process. Though, as you now know, it's a virtuous cycle in that it's continuous as long as the activity is in place. So, Colonel McHenry and her airmen will continue evaluation and CQI the next time the prevention activity is run.
Colonel McHenry's evaluation demonstrates Bringing in the Bystander's value, so her superiors give her the support and resources she needs to sustain and improve it for the duration of her command. Colonel McHenry also engages a civilian employee who works on prevention with her wing to champion the prevention activity so that it can continue after her command is over.
Preventing sexual assault is complex and requires many solutions. For each prevention activity you select, you can use GTO to help you plan, implement, and evaluate it. The GTO guide has a list of effective prevention activities that aim to build healthy relationships, prevent perpetration, improve awareness through social marketing, increase bystander intervention, empower women, and prevent alcohol misuse. Together, these evidence-based prevention activities can help your installation get to outcomes that reduce and prevent sexual assault.
As next steps, we'd advise taking the next two training videos that go into more detail on the GTO steps and filling out the tools. If you want to learn more about GTO, jot down this link to GTO's home page on the RAND Corporation's website. You'll find lots of information about GTO as well as free downloadable manuals on using GTO for drug abuse, teen pregnancy, homelessness—things that might not relate to what you're doing at your installations but that might provide some ideas for how GTO works. You can download all the GTO materials we created specifically for sexual assault prevention in the military from the RAND website.
Thank you for accessing this video, the first of three videos about using Getting To Outcomes to strengthen sexual assault prevention.
Joie D. Acosta
Hello and thank you for accessing this video, the second of three in the series on using Getting To Outcomes to strengthen sexual assault prevention. I'm Joie Acosta, a developer of GTO and a senior researcher at RAND, and I will be with you throughout this session. Note: These slides are copyrighted training materials for GTO. Any reproduction or distribution is prohibited.
So, welcome. Our overall goal with this three-part video series is to teach how to use GTO to plan, implement, and evaluate high-quality sexual assault prevention in your setting. Our goal with this video is to teach you to use the Getting To Outcomes process to plan your sexual assault prevention activities.
So, in Video 1, we presented an overview of GTO. In this video—number two—we will go over Steps 1–6 of GTO, which address the various tasks involved in preparing your prevention activity, such as looking at needs and resources data, setting goals, picking best practices, and planning evaluation. The focus in Video 3 is on Steps 7–10 of GTO, which include understanding your evaluation data, using your data to improve your prevention activity, and then sustaining your prevention activity.
This training video is intended for people who need to work on existing sexual assault prevention plans or develop new ones. So, some of you may be reviewing your installation's existing prevention plan using GTO to make changes or improve it and carry on the work of implementing it. Others of you may be preparing a new prevention plan using GTO, and this video is going to show you how to do that. For those of you who are ready to evaluate the process and outcomes included in your prevention plan, our third video will cover GTO Steps 7–10.
While viewing this training video, you should have these things on hand: the GTO Guide for Strengthening Sexual Assault Prevention in the Military; an electronic copy of the GTO tools, which you can fill in and edit. These resources are all available free and online at www.rand.org. See the website address in this video for a link or Google "GTO for sexual assault prevention." If you have a prevention plan already, have it at hand too, along with any completed GTO tools, so you can review the tools as we go through this video.
As we go through the steps, we'll point you to places in the GTO guide where you can find instructions for completing the GTO tools. You can practice using blank versions of the tools, and we'll point out tips and lots of other resources. The GTO guide has several editions, so instead of using page numbers, just follow the color coding for each step in the guide, which matches this video. The guide is divided into the ten steps of GTO and includes, for each step, an explanation of what it is and why it is important; instructions and blank tools or worksheets; tips, resources, and definitions of key terms used in GTO to help you complete the tools; and checklists to review what needs to be completed. The guide contains examples of completed tools for a hypothetical scenario at a fictitious base that was working with GTO. You can follow that hypothetical scenario through each of the GTO steps.
Looking at the big picture, what is Getting To Outcomes? We often say that GTO is not rocket science. It's just a systematic, common-sense method to help create and implement high-quality plans. GTO was developed to help close the gap between the way prevention activities are developed to run in ideal settings and the way they actually run in the real world. To obtain an overview, watch Training Video 1 that I mentioned earlier.
I don't have to tell anyone in the military that no takeoff happens without a plan. A flight plan, a weather briefing, a crew briefing, and lots of checklists go into every takeoff. GTO helps put this kind of plan together for the prevention initiatives you decide to include in your prevention plan.
Throughout this video, we will also make links to DoD's Prevention Plan of Action, or PPOA. This is the model that DoD is using to organize how prevention ought to be accomplished. This model references a prevention process that GTO could help installations implement and a prevention infrastructure that can be used to support the prevention process as part of an overall prevention system.
To orient you, here are GTO's ten steps, which collectively address all the tasks for running a prevention activity well. GTO is a natural fit for sexual assault prevention because it helps you think through the constraints and contingencies upfront and then get feedback through evaluation to continuously improve what you are doing. It is all about making sure that the best prevention activity possible is delivered. This can be a program, a policy, a process, or a practice.
You can group GTO's ten steps into two categories. Steps 1–6 focus on planning everything you need for a prevention activity to be successful. With these steps completed, you are ready to implement or deliver your prevention activity. Steps 7–10 guide you through using evaluation and improving the delivery for next time. You need to do Steps 1–6 before implementing your prevention activity.
Now we're going to briefly touch on each step. Let's start with GTO Step 1. In this step, we'll document the problems of sexual assault at your site and existing resources to address them. GTO Step 1 directly addresses the key components of DoD's Prevention Plan of Action—in particular, understanding the problem. So ask your team: What are the underlying problems or challenges contributing to sexual assault among service members and their families on your installation? What is already being done to address those problems? Can more be done? Who are you trying to reach? And what, above all, are the priority problems you want to include in your prevention plan?
To help you consider these questions, there are a number of tools and tips in Step 1 of the guide. There's a Data Catalog Tool, which prompts you to look at different data sources—things like your DEOCS reports and the latest WGRA data. There's also a Resource Assessment Tool. Consider not just what all the problems are, but what is currently being done to tackle them—if anything—so you don't duplicate those efforts if they're having positive outcomes. The third tool is for triaging the list of problems to set your priorities. This last tool kind of brings it all together and helps you boil down what your key problems are that you want to focus on. And you have in the guide tips on data sources to look at as well as a list of the major risk and protective factors for sexual assault prevention. This is a good place for you to pause the video and check out Tip 1-1 and the three Step 1 tools in the GTO guide.
Step 1 is important because, unless we know the problem, we will not know whether we have solved it or succeeded in tackling it. Understanding conditions; risk-, protective-, and resilience-promoting factors; and current levels of needs tell us what we can and cannot address about a problem. Knowing about existing activities helps save money and avoid duplication. The best information available about the problem helps set reasonable and achievable goals and desired outcomes.
In Step 1, the needs and resources assessment step, there are three major tasks. Task 1 is documenting the problem. A problem assessment gathers information about the current conditions in a targeted area or group to help identify priorities for action. How we define and assess a problem has everything to do with how we determine a solution. Whatever the need, problem, or risk is, you have to understand it in order to know whether your approach to tackling it makes a difference. Task 2 is identifying existing resources. A resource assessment gathers information about what resources are available to address a particular problem. Task 3 is taking stock of the problems and existing resources and using them to prioritize which problems to focus on.
To document the problem, you start with a question. What are the risk factors, problems, or gaps evident in the target population? Risk factors are certain knowledge, perceptions, or behaviors that are not themselves the target behavior—in this case, actual incidents of sexual assault and harassment—but are known to be related to those behaviors. Thus, if you're able to impact these factors, then you're likely to change the ultimate behavior. An example is the endorsement of rape myths. Research has shown that those who strongly endorse rape myths are more likely to commit sexual assault and harassment.
Protective factors work the same way, but in the opposite direction. For example, a workplace climate that actively discourages sexual harassment makes that behavior less likely. Therefore, to answer this question, you conduct a needs assessment or a review of data to gather information about the current conditions of a targeted area that underlie the need for an intervention. You might look at existing data collected by someone else, like the DEOCS or WGRA, or you might collect new data yourself, like focus groups or interviews.
At the same time, you will want to answer the question, "What resources already exist that the target population has access to and is using?" To answer this question, you conduct a resource assessment, which is a process of gathering information about the resources available to address a particular need. This will involve collecting some data to understand how these resources are addressing risk factors within the target population and how effective they are.
The final task in Step 1 is to prioritize which problems to tackle. To do this, you determine which problems you are best suited to tackle now—for example, which are not being addressed anywhere else—and then home in on those that you have sufficient resources to address.
To help you with these tasks, GTO Step 1 has three tools and three tips. Let's quickly go through those now. There is a Data Catalog Tool, a Community Resource Assessment Tool, and a Triaging Among Problems Tool. Appendix B contains places where you can find data on military sexual assault. There are also three tips. Tip 1-1 contains risk and protective factors for sexual assault victimization and perpetration. Tip 1-2 contains links to existing data sources that you can use to inform your problems assessment. Tip 1-3 provides tips for using data to inform your problem assessment.
Throughout these videos, we will present examples of various GTO tasks completed by a small team of service members at a fictitious military base called Joint Base Hensonburg. The completed GTO tools shown here relate to a hypothetical scenario at that base. We have found that seeing how the tools are filled out can be helpful for those new to GTO.
Here is a snippet from the Data Catalog Tool. In this example, leaders at Joint Base Hensonburg have been tracking an increase in incidents of sexual assault, and DEOCS data show that few service members recognize high-risk situations.
The Community Resource Assessment Tool shows that one key resource, the annual sexual assault prevention training, has been well attended. But if incidents are going up, the training may not be having the desired impact.
The Triaging Among Problems Tool shows that, after reviewing the problem within the tool, the team at Joint Base Hensonburg believes it can tackle the issue of worsening assault rates among junior enlisted service members. More information about this example can be found in each step of the GTO guide.
GTO uses logic models, which are pictorial summaries of your prevention activity's key elements taken from certain GTO steps. As you can see here, you can begin to fill in your logic model after Step 1. You're asked to briefly summarize the main problem using the data from the three tools in this step: the Data Catalog Tool, the Resource Assessment Tool, and the Triaging Among Problems Tool. Joint Base Hensonburg focused on service members' low ability to perceive high-risk situations as an important risk factor for its rising rates of sexual assault. We will continue to fill in our logic model as we progress through the steps. Before we move on to Step 2, now is a good time to pause this video and read how Joint Base Hensonburg or your own installation filled out the GTO Step 1 tools.
The second step is identifying goals, target populations, and desired outcomes. You now understand and have documented the level of your priority problem as a result of Step 1. In this step, you set broad and specific benchmarks for gauging progress in reducing the problem. This step directly addresses developing a comprehensive approach in the Prevention Plan of Action.
First, consider what kind of goal, or broad change, you want to see for addressing that problem and whom your effort will target—for example, all service members on your installation or just junior enlisted service members. Then, get more detailed and think through what will change, by how much, for whom, and by when. In other words, desired outcomes must be specific, measurable, achievable, realistic, and time-bound. Those terms form an acronym—SMART—and there's a tool to help you determine whether your outcomes meet those criteria. SMART is just a way to make sure that when you set up your benchmarks, you are setting them up in the best way possible, with goals and desired outcomes informed by your information on problems and resources. Why are they important? They are the only way to know whether you're making a difference for the prevention activity participants. They will enable you to tell your stakeholders what you're trying to do and how well it is going. Specifying the changes you expect and the target population helps inform what kinds of prevention activities you should select to implement. And goals and desired outcomes identify the outcome data you'll need in an evaluation. But how do you decide on a realistic goal? This is a good time to pause and look through Tip 2-1 and the Step 2 tools in the GTO guide: the SMART Desired Outcomes Tool and the Logic Model Tool.
Why do goals, audience, and desired outcomes matter? Bottom line up front: if you don't have goals and desired outcomes, you'll never know whether you've reached them. Goals are broad statements of the impact we're trying to have with our prevention activity in the target group. What difference do we expect our prevention activity to make? Sometimes practitioners say their goal is to run x activity. Not in GTO; that doesn't lead into outcome evaluation. Example goals include addressing the problem that few service members recognized one or more high-risk situations for sexual assault in the past 12 months, or continuing to address the problem that, among service members who do recognize a high-risk situation for sexual assault, most took action to reduce the risk.
Desired outcomes are specific statements of what will change, for whom, by how much, and when. Outcomes can relate to knowledge, attitudes, skills, or behaviors. SMART desired outcomes are specific, measurable, achievable, realistic, and time-bound. Tip 2-1 in the GTO guide gives more detail on each of these. You can pause here and find Tip 2-1 in Step 2, the pink section of the guide.
To help you with these tasks, Step 2 has a few tools and tips to help you complete the tools with quality. There are four tools: a goals and desired outcomes worksheet, a SMART Desired Outcomes Tool, a Prevention Activity Logic Model Tool, and a tool to help prepare a progress briefing for your leadership. Engaging leadership is an important part of the GTO process. Tip 2-1 shows you how to check that your desired outcomes are SMART.
Here's a rough overview of how to create goals and desired outcomes using a GTO Step 2 tool. For sexual assault prevention at Joint Base Hensonburg, the goal is to improve service members' ability to identify risky situations. Again, a goal describes broadly what will be better or improved. Running a prevention activity is not a goal.
Now, let's make a SMART desired outcome for this goal. "S" stands for specific and describes what will change. At Joint Base Hensonburg, what will change is the number of service members who can identify a high-risk situation. "M" stands for measurable, or how much change is expected and on what measure. For Joint Base Hensonburg, a 15% change is expected in the number of junior enlisted service members who can identify at least one high-risk behavior. The Banyard, Moynihan, and Plante measure of high-risk behaviors provides a reliable way to assess this over time. "T" stands for time-bound, or by when the change is expected. Here, change is expected within three years of program implementation.
The S, M, and T of the SMART acronym are combined to make a full desired outcome statement. The color coding here helps you see where each goes in the statement: within three years of program implementation, 50% of junior enlisted service members will report that they identified at least one high-risk situation for sexual assault in the past two months.
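To make the measurable part of this concrete, here is a minimal sketch, in Python, of how a team might check survey results against a desired outcome statement like Joint Base Hensonburg's. The data and the 50% target here are hypothetical, used only to illustrate the arithmetic; GTO itself does not prescribe any particular software.

```python
# Illustrative sketch: checking survey results against a SMART desired
# outcome. The respondent data and the 50% target are hypothetical,
# echoing the Joint Base Hensonburg example.

def outcome_met(responses, target_rate):
    """Return the observed rate and whether it meets the target.

    responses: list of booleans, True if the respondent reported
    identifying at least one high-risk situation.
    """
    if not responses:
        return 0.0, False
    rate = sum(responses) / len(responses)
    return rate, rate >= target_rate

# 60 junior enlisted respondents, 33 of whom identified a high-risk situation
survey = [True] * 33 + [False] * 27
rate, met = outcome_met(survey, target_rate=0.50)
print(f"Observed rate: {rate:.0%}, target met: {met}")  # Observed rate: 55%, target met: True
```

The same check could be repeated each year to track progress toward the three-year time bound in the desired outcome statement.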
"A" stands for achievable. Is it possible? And "R" stands for realistic. Does it make logical sense? While these don't show up in the outcome statements, they are important considerations to keep in mind when setting your desired outcomes.
Let's turn back to the Prevention Activity Planning Logic Model. What did you find? What are your priority problems? As you can see here, you can continue to fill in your logic model after Step 2. Here you are asked to briefly summarize your goal and desired outcomes. You may have more than one goal and more than one desired outcome, but each goal should have at least one desired outcome. Here you can see Joint Base Hensonburg's desired outcome. We will continue to fill in our logic model as we progress through the steps. Before we move on to Step 3, now is a good time to pause this video and read how Joint Base Hensonburg or your own installation filled out the GTO Step 2 tools.
The third step helps you identify best practices to use to develop a comprehensive approach to prevention at your installation by asking the question, "which evidence-based programs or strategies can help you reach your goals and desired outcomes?" To answer this question, you'll use the Evidence Synthesis Tool. It prompts you to look at different sources to find evidence for a prevention activity and then make sure that the prevention activity overlaps with your needs. So, this is a tool that can help you begin to narrow down the prevention activity choices. Over our many years working with all different kinds of organizations, we find that this is one of the most challenging steps and that folks often make up programs on their own or rely on prevention activities that they've been doing over and over. When you do that, you don't know what you're going to get unless you evaluate it rigorously. But when you look at evidence-based approaches, you know that other people have thought about this ahead of time and have tested it and gotten positive changes, so you can be more confident that you will achieve the outcomes you want. You can pause here and thumb through Step 3 of the GTO guide. Note the guidance on understanding and assessing evidence for different approaches. The challenge is that not all evidence is equal. Rather, it is a continuum.
An evidence-based prevention activity, sometimes called an EBP, is a scientifically valid and rigorously tested prevention activity that research has shown to make a difference. Using one increases the likelihood of achieving goals and desired outcomes, promotes confidence among leadership and stakeholders, provides users with prior experience and evaluation results, and contributes to a good reputation and value story. Even if a prevention activity is not an EBP, it may be appropriate if it incorporates best practices in the field.
Research has identified best practices for sexual assault prevention activities—for example, enhancing protective factors and reducing risk factors for sexual assault victimization or perpetration and tailoring prevention to address risks specific to your population. These are just a few of the best practices in sexual assault prevention that we know about. Others can come from online registries like the Penn State Clearinghouse for Military Family Readiness. Colleagues and peer-reviewed journals are other places to locate best practices. In Step 3 of your GTO guide, you will find a list of best practices and guidance for evaluating the evidence for a prevention activity. You'll notice that in the GTO guide we categorize prevention activities along a continuum of evidence. There are not simply good and bad activities.
To help with these tasks, Step 3 has a few tools and tips to help you complete the tools with quality. There is only one tool: an Evidence Synthesis Tool. This tool has two parts to help you document the evidence for your prevention activity and then evaluate it. Appendix C contains a list of evidence-based sexual assault prevention activities. Tip 3-1 provides some information on how to locate other EBP sexual assault prevention activities. Tip 3-2 categorizes effective prevention activities so you can select a specific type of prevention activity, such as bystander intervention training. Tip 3-3 contains the principles of effective prevention to help you assess whether your prevention activity adheres to these principles. And finally, Tip 3-4 provides more information on evaluating the level of evidence of a prevention activity using the evidence continuum.
Here is a snippet from the Evidence Synthesis Tool, Part 1. In this example, the prevention team at Joint Base Hensonburg have used three online registries to identify possible prevention activities or candidates that would help them meet their goal and desired outcome. These three sources all identified Bringing in the Bystander and Green Dot as candidates.
Here is a snippet from the Evidence Synthesis Tool, Part 2. In this example, the prevention team at Joint Base Hensonburg are considering the evidence for Bringing in the Bystander, one of their two candidate activities. Before we move on to Step 4, now is a good time to pause this video and read how Joint Base Hensonburg or your installation filled out the GTO Step 3 tools.
The fourth step helps you identify whether the prevention activities you've selected are a good fit for comprehensive prevention at your installation by asking the question, "how well does the candidate activity fit with your target population, your community, and your installation's mission?"
The latter is most important because you need to be able to implement the activity in a setting in a way that service members can take advantage of, not in a way that adds to the burden on their time. We often hear that service members' time is very valuable and that long trainings are challenging to implement. This is an example of a poor fit for most installations. Sometimes a prevention activity does not exactly fit. For example, it was designed for college students, not service members. You can modify the language of a curriculum by changing "students" to "service members." But shaving hours out of a training's content would be inappropriate, because if you change a prevention activity significantly to make it fit, you run the risk of losing the benefits it can achieve that made you choose it in the first place. Tips 4-1 and 4-2 in the guide give you information about the kinds of modifications that you can safely make and those that you should avoid. The Fit Assessment Tool is there to guide you through the process of assessing fit and planning any modifications that are needed to make a prevention activity fit better. Complete this tool for each of the candidate prevention activities you identified in Step 3. So, for Step 4, the bottom line is that it's really important to make sure that an activity fits and to make modifications in a very thoughtful way without undercutting what makes it an effective activity. Eliminate from further consideration any activities that do not fit. Pause the video here and take a look at the tips and the tools in the GTO guide for Step 4.
Prevention activity fit means that you have a good, close match between the prevention activity as designed and your application on the three dimensions shown here: the population and their needs, the community, and the installation or academy. Ask: What age or pay grade was the prevention activity designed for, and who will you use it with? What is its approach, will that work for your community, and will they value it? How well does it fit with your installation or academy's mission, other activities, and schedule? Do you feel leadership supports it?
So, why does prevention activity fit matter in GTO? Determining a prevention activity's fit with your target population and their problems, your community, and your installation is an essential step because doing so increases the chance that an activity will be accepted by and benefit the target population; reduces duplication of services; helps you avoid finding out later that the activity failed because it was a poor fit for the installation, community, or target group; and helps you select the right activity and rule out those with fit issues that can't be resolved.
How do you assess prevention activity fit? Compare what you know about your installation, community, target population, and their problems with what you know about the prevention activity being considered. Talk with stakeholders to see if they are ready to accept the candidate prevention activity. Consider whether the activity is compatible with other activities that you're doing, military culture, and your service member characteristics.
When a prevention activity is a good fit, you should decide to deliver the prevention activity as is. Don't change what isn't broken. In the unlikely case of needing to improve fit, you can make some changes or adaptations, but only if the changes don't destroy what makes the prevention activity effective. Let's look at the guide for making green-, yellow-, and red-light changes and mark it in your manual.
To help you with these tasks, Step 4 has a few tools and tips to help you complete the tools with quality. There are two tools—the Prevention Activity Fit Assessment Tool and the Culturally Appropriate Checklist Tool—to help you assess your prevention activity's fit. Tip 4-1 contains a red, yellow, and green stoplight with types of adaptations that are good to go (green), require caution (yellow), or should be avoided (red). Tip 4-2 contains a list of the types of prevention activity adaptations. Feel free to pause here to take a closer look. The blank tool and the example of a completed Step 4 tool are in the GTO guide.
The fifth step helps you identify whether you have the capacities needed to implement the prevention activities you've selected. These prevention capacities include things like data, funds for programming, and leadership support, and are part of the prevention system described in the PPOA. The Capacity Assessment Tool lists all the things that could be needed to carry out a prevention activity really well. This tool prompts you to decide, "okay, if we don't have what we need, how are we going to get it?" The idea is to be doing prevention activities that you have the capacity to carry out, given staffing, competing demands, leadership support, funding, and other considerations. Again, this assessment may show that you should turn to a different activity for which you have the capacity. For example, perhaps one that doesn't have such a large facilitator training component. You want to be completely ready to take on an activity before you start so you have the greatest chance of success. Complete the Step 5 tool for all prevention activities that remain under consideration for inclusion in your prevention plan. You can pause here and take a look at the GTO Step 5 materials, including the definitions of capacity, included in Table 5-1.
Understanding your prevention team's and leadership's readiness, or capacity, to implement the prevention activity is important because low motivation can contribute to the feeling that a new activity is yet another burden on service members and doesn't really make sense for the installation. Low capacity or resources can add burden on helping-agency staff and lead to poor activity implementation. Together, sufficient motivation, support, and capacity help you meet the goals and desired outcomes of your prevention plan.
There are different kinds of capacities. Here is a way to categorize them into staff, leadership support, technical support such as technical assistance or coaching, fiscal resources, and other partnerships. The Step 5 tool walks you through assessing numerous capacities and guides you towards addressing those that are important and insufficient. You may identify other capacity concerns to consider in addition to those on the Step 5 tool. Remember that one option is to choose another candidate prevention activity rather than try to implement an activity you don't have the capacity to implement well.
In addition to general capacity, as mentioned earlier, there may be some unique capacities you need to implement a specific prevention activity such as facilitator guides, posters, or other materials. Finally, momentum is a key part of being ready to implement. Being ready to implement includes these multiple components, all critical to success.
To help you with these tasks, Step 5 has a few tools and tips to help you complete the tools with quality. There are two tools: the Capacity Assessment Tool and another set of materials to help you prepare a second leadership briefing. Now that you've got a sense for which prevention activity you want to implement, briefing your leadership will help get leaders engaged early and bought into any prevention activities before they are implemented.
Here's what an important part of Joint Base Hensonburg's completed Capacity Assessment Tool looks like. The complete example is provided in Step 5 of the GTO guide. You might want to pause the video at this point to take a closer look at this completed tool.
The goal in this step is to develop a prevention plan that prepares your team to implement, monitor, and evaluate the prevention activities that contribute to your installation's comprehensive approach. We have a number of tools, as shown here, to guide your detailed planning process. In Step 6, you can also revisit and finalize the logic model tool that you started in Step 2. I'll talk more about the process and outcome evaluation planning in a few minutes. First, let's think about why a detailed implementation plan is important.
This shows the benefits of creating a detailed work plan. To some people, detailed plans may sometimes seem unnecessary, especially when an activity has been implemented numerous times in the past. But detailed work plans are worthwhile because they produce documentation needed for new or changing staff to take over implementation or activity supervision. And the documentation is also useful when evaluating the implementation process to see if it went according to plan.
Before you start working on your plan, make sure the final prevention activity that you're actually going to use is updated on the logic model tool that you started in Step 2 and adjust the desired outcomes if the prevention activity produces different desired outcomes than those that you initially chose. Now you're ready to use the Work Plan Tool to capture the details of your prevention activity implementation plan. Feel free to add to or delete tasks from those listed on the tool. Planning does involve the details, so try to have everyone involved take part and produce a comprehensive plan. You can do the work plan section on evaluation in tandem with the evaluation planner tools that are also part of Step 6. The Budget Tool comes next and serves as the location for all of your cost information related to implementation.
Let's turn back to the Prevention Activity Planning Logic Model. Based on the fit and capacity assessment, Joint Base Hensonburg has selected Green Dot to implement.
The prevention team at Joint Base Hensonburg assigned tasks, personnel responsible, and deadlines in the Work Plan Tool. It should be used during implementation to note in the final column when tasks are actually done, which can help planners adjust deadlines in the future. Pause here and take a closer look at Joint Base Hensonburg's example Work Plan and Budget Tool in the GTO guide. It shows the kind of detail that should be included in prevention activity work plans.
Step 6 is where you plan your evaluation. In Steps 7 and 8, you will report the evaluation results. Let's start by understanding the two different kinds of activity evaluation: process or implementation evaluation, as it is sometimes called, versus outcome or impact evaluation. As shown here and in Tip 6-1, process evaluations track several aspects of the quality of a prevention activity's implementation—like attendance, presentation quality, participant satisfaction, and staff perceptions of the process. Outcome evaluations track the change in participants attributable to the prevention activity. Let's talk about each type of evaluation in turn.
So, what is a process evaluation? At one level, it is just monitoring how implementation of the prevention activity is going. It tells you how well your plans are being put into place; whether you're on track—for example, with the components shown here. You and your team will need to decide which of these components you include in your process evaluation.
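As a rough illustration of what tracking these components can look like in practice, here is a short Python sketch that tallies attendance and participant satisfaction across sessions. The session figures and field names are invented for the example; a real process evaluation would track whichever components and measures your team selected in Step 6.

```python
# Illustrative sketch: summarizing simple process-evaluation metrics
# across sessions. All figures are hypothetical.

sessions = [
    {"attended": 42, "invited": 60, "satisfaction": 4.2},
    {"attended": 55, "invited": 60, "satisfaction": 4.5},
    {"attended": 38, "invited": 60, "satisfaction": 3.9},
]

total_attended = sum(s["attended"] for s in sessions)
total_invited = sum(s["invited"] for s in sessions)
attendance_rate = total_attended / total_invited
avg_satisfaction = sum(s["satisfaction"] for s in sessions) / len(sessions)

print(f"Attendance: {attendance_rate:.0%}")          # Attendance: 75%
print(f"Mean satisfaction: {avg_satisfaction:.2f}")  # Mean satisfaction: 4.20
```

Even a simple running tally like this makes it easy to spot a session with unusually low attendance or satisfaction and ask why.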
This shows several benefits of including a process evaluation. The last point may need some clarification. Often, prevention activities do not achieve the desired outcomes, and everyone wonders why. It may have something to do with how the activity was implemented rather than the content of the activity or the participants. For example, a poor presenter who cuts out most of the content may have diluted the potential that the activity had to change knowledge, skills, or behavior among the participants. When implementation is poor, good outcomes should not be expected. The process evaluation gives you the information you need to assess quality of implementation and helps you explain the outcome evaluation results.
Here are several typical questions people have about the process of implementation. Some or all may be important in your case. Recall that Appendix E of the guide provides an overview of process evaluation methods.
Joint Base Hensonburg's Process Evaluation Planner Tool shows what methods and data will be used to evaluate their prevention activity, Green Dot. Feel free to pause the video to see how Joint Base Hensonburg's team filled out this tool. While process evaluation is important, evaluations should not stop with a process evaluation only.
So what, then, is an outcome evaluation? An outcome evaluation answers important questions that skeptics and naysayers, as well as facilitators and other stakeholders, often ask. Does the prevention activity make a difference in the target population? Were the goals and desired outcomes actually achieved?
Outcome evaluations can be very powerful: they provide evidence that your activity worked or didn't, produce facts and numbers that support continued implementation of the activity, and identify changes that can improve it. And when implementation was high quality but the results are not, the outcome evaluation may suggest that you need a different activity—perhaps one with stronger evidence of its effectiveness when used in settings like yours.
Designing an outcome evaluation includes several key decisions in the planning process. For example, selecting one or more measures of your outcome—like the level of risk a person perceives or the skills needed to intervene as a bystander—followed by data collection, usually before and after implementation, and then analysis of the data.
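For a sense of what the analysis step might involve at its simplest, here is a small Python sketch comparing hypothetical pre- and post-implementation scores on an outcome measure. The scores are invented; a real outcome evaluation would use your chosen instrument and, ideally, appropriate statistical testing rather than a raw difference in means.

```python
# Illustrative sketch: a simple pre/post comparison for an outcome
# evaluation. Scores are hypothetical bystander-skill ratings collected
# before and after the prevention activity.

pre  = [2.1, 2.4, 1.9, 2.8, 2.2, 2.5]
post = [2.9, 3.1, 2.6, 3.4, 2.8, 3.2]

pre_mean = sum(pre) / len(pre)
post_mean = sum(post) / len(post)
change = post_mean - pre_mean

print(f"Pre mean:  {pre_mean:.2f}")   # Pre mean:  2.32
print(f"Post mean: {post_mean:.2f}")  # Post mean: 3.00
print(f"Change:    {change:+.2f}")    # Change:    +0.68
```

A positive change suggests movement toward the desired outcome, but attributing it to the activity requires the evaluation design decisions described above.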
Let's turn back to the Prevention Activity Planning Logic Model. Process evaluation measures and outcome evaluation measures are now included for the Green Dot activity.
To help you with these tasks, Step 6 has a few tools and tips to help you complete the tools with quality. There are three tools: a Prevention Activity Work Plan Tool, a Process Evaluation Planner Tool, and an Outcome Evaluation Planner Tool. Appendix E contains an overview of process and outcome evaluation methods. Tip 6-1 describes the difference between process and outcome evaluation. And Tips 6-2 and 6-3 contain sample process and outcome measures, respectively.
Joint Base Hensonburg's tool lays out what it will evaluate about Green Dot and an evidence-based way to do so. Pause here and take a look at Joint Base Hensonburg's example in the GTO guide.
As next steps, we'd advise taking the next training video that goes into more detail on the GTO Steps 7–10 and completing the GTO tools. If you want to learn more about GTO, jot down this link to GTO's home page on the RAND Corporation's website. You'll find lots of information about GTO as well as free downloadable manuals on using GTO for drug abuse, teen pregnancy, homelessness—things that might not relate to what you're doing at your installations, but that might provide some ideas for how GTO works.
You can download all the GTO materials we created specifically for sexual assault prevention in the military from the RAND website. And thank you for accessing this video, the second of three videos about using Getting To Outcomes to strengthen sexual assault prevention.
Hello and thank you for accessing this video, the third of three in this series on using Getting To Outcomes to strengthen sexual assault prevention. I'm Matthew Chinman, one of the original developers of GTO and a senior researcher at RAND and the VA, and I will be with you throughout this session. Note: These slides are copyrighted training materials for GTO. Any reproduction or distribution is prohibited.
So, welcome. Our goal with this video is to teach you to use Getting To Outcomes to evaluate and improve your sexual assault prevention activities. If you've already seen the first training video, you're familiar with the broad strokes of GTO. The second video explained the first six GTO steps, which includes planning for evaluation in GTO Step 6. This final video explains Steps 7–10, using evaluation results to improve and sustain effective programs, policies, practices, and processes—what we call prevention activities. If you have evaluation plans or, even better, have collected and analyzed evaluation data, this video will help you put the results to work. If you don't yet have an evaluation plan or haven't begun to collect data, we'll review some of Step 6 here, but we recommend that you use video number two first to help you plan and implement your evaluation.
In Videos 1 and 2, you got an overview of GTO and learned about Steps 1–6. Those steps involve understanding the problems, setting goals and desired outcomes, and picking and planning a prevention activity that's a good fit. Our primary focus in this video will be on Steps 7–10 of GTO, which include understanding your evaluation data, using your data to improve your prevention activity, and then sustaining your prevention activity.
While you're viewing this training video, you should have at least these things on hand: the GTO Guide for Strengthening Sexual Assault Prevention in the Military and an electronic copy of the GTO tools, which you can fill in and edit. These resources are all available free and online at www.rand.org. See the website address in this video for a link or Google "GTO for sexual assault prevention." If you have a prevention plan already, have it at hand too, along with any completed GTO tools, so you can review the tools as we go through the video.
As we go through the steps, we'll point you to places in the GTO guide where you can find instructions for completing the tools. You can practice using blank versions of the tools. We'll point out tips and lots of other resources, too. The GTO guide has several editions, so instead of using page numbers, just follow the color coding for each step in the guide, which matches this video. The guide is divided into ten steps of GTO and includes for each step an explanation of what it is and why it's important, instructions and blank tools or worksheets, tips and resources and definitions of key terms used in GTO to help you complete the tools, and a checklist to review what needs to be completed for each step. The guide contains examples of completed tools for a hypothetical scenario at a fictitious base that was working with GTO. You can follow their hypothetical scenario through each of the GTO steps.
Throughout the video, we also make links to DoD's Prevention Plan of Action, or PPOA. This is the model that DoD is using to organize how prevention ought to be accomplished. This model references a prevention process that GTO can help installations implement and a prevention infrastructure that can be used to support the prevention process as a part of an overall prevention system.
To refresh your memory since the first training video, we often say that GTO is not rocket science, but it's just a systematic, common-sense method to help create, implement, and evaluate high-quality prevention activities. GTO was developed to help close the gap between the way programs are developed to run in ideal settings and the way they are actually run in the real world.
First, GTO is a ten-step approach or formula for planning, implementing, and evaluating plans in order to achieve success. We often refer to prevention activities, which include policies, practices, and processes, as well as programs. Second, GTO is also a series of how-tos with written guides, tools, examples, training, and ongoing support. GTO has been proven to increase staff knowledge and skills and improve how they conduct the ten steps. So, it's sort of a roadmap for how you stand up and run prevention activities and get outcomes in the real world. GTO can help produce a high-quality prevention plan because it helps you think through the constraints and the contingencies upfront and get feedback through evaluation to continuously improve what you are doing.
The GTO formula, if you will, can be applied to any planning process or planning any kind of sexual assault prevention policy, program, process, or practice, or what we more broadly call activities in the GTO Guide and Handbook. The first six steps are all about planning the best prevention activity possible and then implementing the planned activity and evaluation. The last steps use results from the evaluation to improve the ongoing or next round of your prevention activities.
The remainder of this video focuses on Steps 7–10. Step 7 is focused on evaluating implementation quality. How did it go? Step 8 is focused on evaluating success in achieving desired outcomes, or effectiveness, of the prevention activity. Step 9 helps you make a plan for continuous quality improvement, and Step 10 helps you make a plan for sustaining effective prevention activities.
So, how does this fit together? Our logic model is a great way to show you how. As you can see, it's intended that there be a logical link between all the key steps from problems to goals and desired outcomes to prevention activities to evaluation. A big part of evaluation is using your evaluation data in something called continuous quality improvement, or CQI. The logic model shows how CQI involves revisiting and revising earlier GTO steps informed by your evaluation results. We will get into more detail about CQI later on in this video.
So, how do installations use GTO to evaluate and improve a prevention activity? First, as a part of Step 6, they plan their process and outcome evaluations. At this point, if you have not yet started your prevention activity, you might want to use GTO Step 6 to plan your evaluation before you continue with this video.
As shown here and in Tip 6.1 from the guide, process evaluations track several aspects of the quality of a prevention activity's implementation—like attendance, presentation quality, participant satisfaction, and staff perceptions of the process. Outcome evaluations track the change in participants attributable to the prevention activity and tell you whether your desired outcomes were achieved. Let's talk about each type of evaluation in turn.
Throughout the GTO guide and these videos, we've been presenting an example from a hypothetical base called Joint Base Hensonburg to show how GTO is used. JB Hensonburg's Process Evaluation Planner Tool shows what methods and data will be used to evaluate their prevention activity, Green Dot, a bystander intervention program. Feel free to pause the video to see how JB Hensonburg's team filled out this tool. While process evaluation is important, evaluation should not stop with a process evaluation only.
JB Hensonburg's tool lays out, in an evidence-based way, what it will evaluate about Green Dot. Pause here and take a look at JB Hensonburg's example in the GTO guide.
Now I'd like to draw your attention to several evaluation resources included in the guide. First is Appendix D in the guide, which explains both types of evaluation and describes evaluation methods. Tip D.1, found in Appendix D, provides more information about process evaluation methods, including what, how, and why each of these methods should be used and links to step-by-step instructions for conducting focus groups.
Appendix E contains more information about process evaluation measures using prior evaluations of sexual assault prevention. This appendix describes the measure, contains the specific items from the measure, and lists the measure's name and source.
Appendix D also has relevant tips about outcome evaluation. Tip D.2 describes outcome evaluation data collection methods, including the pros and cons of each and the costs associated with each method. There is also a list of other considerations in Tip D.3 to keep in mind when collecting data, such as confidentiality and anonymity.
A repository of outcome measures is also available for your use in Appendix E.
Okay. Let's get into the steps, starting with Step 7: process evaluation. Remember, a process evaluation or implementation evaluation is simply an evaluation of how well the program ran. Did you do it as planned? And how did it go? In Step 6, you planned your process evaluation. In Step 7, you're going to organize and analyze the results. GTO Step 7, or process evaluation, directly addresses key components of DoD's Prevention Plan of Action—in particular, quality implementation and continuous evaluation.
Step 7 involves evaluating how well a prevention activity was implemented. Did it run according to your plan? And how well did it go? You should ask who participated and how often versus who was targeted. What activities were actually implemented versus what was planned? Were the activities implemented as scheduled and sessions run on time? What was done well? What could be improved? A number of tools can help you answer the questions shown in this chart, including the Process Evaluation Planner Tool and others you completed in Steps 1–6, but they require that you collect the data. Then, we have a Process Evaluation Results Tool in Step 7 where you put all the results from your process evaluation together. It makes a nice summary of what happened with the implementation of your prevention activity.
Why is all of this important? Because good implementation leads to good outcomes. If you don't have good implementation, then it doesn't matter how good your prevention activity is. You're not going to get the outcomes you want. That's why it's really important to evaluate the implementation in addition to the outcomes. This is a good place to pause the training video and check out Step 7: Process Evaluation Results Tool in the GTO guide. You can look back at the Process Evaluation Planner Tool you completed during GTO Step 6 at the same time and also take a look at the process data you collected.
It can seem like a lot of work, so it's worth asking, "why do we need to evaluate the process?" The process evaluation is important because it helps, along with the outcome evaluation, determine whether the prevention activity produced the desired results. In addition, it's important to understand how implementation went in order to interpret the outcome evaluation. As I just mentioned, if the process was significantly flawed, it may explain why the desired outcome was not achieved. If the process was high quality, but the desired outcome was not achieved, it may require a different prevention activity. If the desired outcome was achieved, it's important to know how the activity was delivered so it can be replicated in the future. The results help inform the kinds of improvements that should be made for subsequent rounds of implementation—for example, recruiting more participants or adhering better to the curriculum.
As Joie mentioned earlier, this step is called process evaluation because the data collected tracked the process of the prevention activity's implementation—as opposed to the outcomes experienced by the participants, which are covered in GTO Step 8. Process evaluations of prevention activities that are programs typically track attendance of participants and adherence to the program model. They may also involve asking prevention activity participants or implementers about how well they thought the activity was delivered. A process evaluation should be planned before an activity begins and should continue while the activity is running. This is why the planning of your process evaluation took place in Step 6.
By the time you have come to this step, you should have already completed the GTO Process Evaluation Planner Tool for each prevention activity you included in your prevention plan—GTO Step 6. Also, you should have executed your process evaluation plans carefully. Failure to follow through with the data collection you planned could undermine your ability to improve the prevention activity over time. Once you have collected the data called for in your evaluation plan, complete the Process Evaluation Results Summary Tool for each activity you have implemented and consider changes needed to improve the activities for the future based on your process evaluation results. If an activity is continuously run and does not show a good process evaluation result early on—for example, poor satisfaction or poor policy awareness—you will need to identify a time when you can make a change to how you run that activity going forward. For examples of completed tools, refer to the Joint Base Hensonburg example in the GTO guide at the website shown earlier in this video.
The main way GTO helps you organize your process evaluation data is through the Process Evaluation Results Summary Tool. Let's take a look at the tool with some sample data.
The tool has seven elements, each in a different row. This is just the first row. Rows 2–7 are on the next slide. Here, the focus is on demographic information about the participants in your prevention activity. Having this information can be important because it can help you understand how well you're reaching the audience you were intending to reach. In this example, all the participants in Joint Base Hensonburg's implementation of Green Dot were junior enlisted. If they had wanted to reach a broader audience, then this data shows they missed the mark. If they wanted to home in on just junior enlisted, then the data is saying they are on track.
Next, we focus on number two in the tool, which asks how much participants actually participated. In Step 6, you would have planned a way to collect this information, perhaps an attendance sheet. This example shows different ways to look at the data, including individual prevention activity sessions and an overall average. This data shows that participation was excellent. Some prevention activities, like a policy change, might not lend themselves to collecting participation data in this way, however.
In Row 3, you're asked to look at data on how well the prevention activity was delivered. For a typical prevention training, that could involve having outside observers watch the sessions to assess how well it was delivered, as shown in the example. This example shows that it took some time for the person delivering Green Dot to get comfortable.
In this row, the tool focuses on satisfaction of the participants. While not an indication of the quality of the prevention activity, having satisfied participants can be important in order to keep people engaged. In this example, most felt satisfied, although only about half felt the program was important and would have an impact. This data could provide clues about improving the program later on.
Row 5 focuses on what staff perceive as having occurred, usually collected through interviews or a discussion group with those involved. Staff often have important insight. In this example, they were starting to wonder if leaders of the base were somehow communicating that the program was not important, a data point that syncs with what we saw in the satisfaction data. This is an example of combining different types of data to tell a complete story.
Row 6 focuses on how well the implementation followed the work plan. By simply going back to the plan and reviewing all the tasks for their timeliness and quality, you can learn a lot about how well implementation went and what can be improved the next time. You can also check to see whether implementation targets—such as the number of people participating and the amount of participation—were met. In this example, they were met. Pause the video here and take a look at Joint Base Hensonburg's filled-out tool for its bystander intervention program called Green Dot.
Now that we have completed GTO Step 7, let's move to Step 8, outcome evaluation. Here, we'll cover what you learned from your outcome evaluation about whether your activity has achieved its desired outcome. In Step 8, we're still in the same area of the Prevention Plan of Action focused on continuous evaluation.
To answer the question in Step 8, we use the tools from Steps 1–7. The Outcome Evaluation Planner Tool from Step 6 is your evaluation roadmap, and the Outcome Evaluation Results Tool in Step 8 will be a repository for all the information or results you obtain from your outcome evaluation. Depending on the kind of prevention activity you implemented, you would typically assess knowledge, skills, and perhaps behaviors before people participate in the prevention activity, then assess those same things at the end and compare the results to see if they changed in any way. You probably will have set up certain benchmarks for each outcome that you wanted to achieve, with a certain measure taken pre- and post-implementation. Those data would be your outcome evaluation results. Longer-term outcomes or goals for your entire installation that rely on installation-level data like the DEOCS survey can also be recorded on this tool, but they would not be expected in the same time frame as an outcome evaluation for a specific activity's implementation. This is a good place to pause the training video and check out the Step 8 Outcome Evaluation Results Tool in the GTO guide.
This kind of evaluation is important because it shows what difference the prevention activity made, if any. In other words, it shows whether the prevention activity was effective at reaching your desired outcomes. Combined with the results of your process evaluation—GTO Step 7—this step will begin identifying areas for improvement to help address any missed outcomes in an effort to improve the activity. Outcome evaluation results can help you demonstrate the effectiveness of your activity and make the case to military leaders and other stakeholders for continuing it. However, sharing results in simple, meaningful ways can have other useful impacts as well. For example, reporting positive results to superiors can build support to keep the selected activity running. Keep in mind that different groups of stakeholders may be interested in different types of information. A commander may simply want the bottom line up front. Members of the installation prevention team may want more details. In Tip 8.1 in the guide, we have included some different ways that information might be reported for different audiences.
Let's be clear about specifically what an outcome evaluation is and is not. Outcome evaluation can answer questions such as, "did the participants in the activity change on the desired outcomes, such as knowledge, attitudes, skills, and behaviors?" Outcome evaluation is not whether participants liked the activity or how many trainings were held. Each of your prevention activities should have an outcome evaluation plan from GTO Step 6. Outcome evaluation for each activity should be planned before the activity begins and should have specific time points for data collection, such as before and after the activity has gone through a complete cycle. This step is called outcome evaluation because the collected data on outcomes track the desired outcomes of the activity which you established in GTO Step 2 and documented on your Logic Model Tool.
By the time you've come to this step, you should have been executing your outcome evaluation data collection plan you established in GTO Step 6. Now that you have gathered your data, you need to analyze it. It may be worthwhile to consult an expert in data analysis procedures to ensure you're using appropriate techniques. For example, when using quantitative data collection methods like surveys, it's common to compare averages and frequencies. You may be comparing your results on some indicator to an established benchmark you adopted in GTO Step 2. An evaluation expert or the prevention activity's developer may be able to tell you what values are expected from the activity so that you can address whether the activity is having the intended effect. Use the Prevention Activity Outcome Evaluation Results Summary Tool to help you analyze and summarize quantitative data.
Then you'll need to interpret the results. Whatever the outcomes, you will need information from both GTO Step 7 (process evaluation) and Step 8 (outcome evaluation) to tell you what is happening with your activity. As mentioned previously, that's because in order to reach desired outcomes, the activity needs to be both implemented well (assessed by GTO Step 7) and based on good evidence (assessed by GTO Step 3).
Finally, another consideration in interpreting outcome data is reconciling the conclusions from tracking both short-term and long-term outcomes. The evaluation of short-term outcomes may show that the activity was successful. For example, service members improved their knowledge of high-risk situations that may lead to sexual assault. However, it is possible that tracking long-term outcomes—actual incidents of sexual assault on or near the installation—shows the long-term outcome is unchanged. Maybe not enough service members were exposed to the prevention activity to improve the long-term outcome. Or maybe simply improving knowledge does not translate into actual behavior. Long-term outcomes are more difficult to improve than short-term. The conclusions that you come to using the data that you collect will help you develop a plan for continuous quality improvement, discussed in more detail in GTO Step 9. Make sure you consult Tip 8.1 in the guide as you consider different audiences for the results.
This step has one main tool: the Outcome Evaluation Results Summary Tool. This tool organizes your outcome evaluation results. We will take a look at an example in a minute. The guide has two tips that could be useful. Tip 8.1 has information about how to present outcome evaluation data to different audiences. For example, leadership will want more of a high-level briefing about results compared to, say, other service members. Tip 8.2 discusses how to do what is called a retrospective pre-post evaluation. This is where you do not collect data before the prevention activity, only after. But you ask participants two sets of questions. One, ask participants to remember back to before the prevention activity. And the other, ask them to answer as they are now. Then you compare the two sets of answers to assess how much they changed. This has certain advantages because you only have to collect data one time—at the end.
Here's an example of the Prevention Activity Outcome Evaluation Results Summary Tool completed for data collected on Joint Base Hensonburg from the Green Dot program. Remember, Green Dot is a bystander program, so you can see here that all the questions that were asked of the participants are about different aspects of bystander behavior. In this example, there were survey questions on intent, efficacy, and actual bystander behavior. While asking about behavior is critical, good evaluation practice is also to measure areas that lead to behaviors—like how much you intend to do something or what your efficacy is. This way, if you do not get the behavior outcomes you want, these other measures can give you clues. For example, maybe there was low efficacy. The tool instructs you to enter pre-scores, post-scores, and calculate a percent change between the two. The formula for the percent change is in the tool. It's helpful to try to keep the pre- and post-scores on the same row so it's clear if there is any change. Finally, you add your overall interpretation of the data. In this example, there was a strong positive change on efficacy, but little change on intentions or actual behavior. It's possible that the participants were simply not convinced that this was important and were not intending to intervene.
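The percent-change arithmetic the tool walks you through can be sketched in a few lines. This is an illustration only, assuming the standard formula of (post − pre) ÷ pre × 100; the `percent_change` helper and the sample scores below are hypothetical, not part of the GTO tool itself.

```python
def percent_change(pre: float, post: float) -> float:
    """Percent change from a pre-score average to a post-score average.

    Positive values mean the average went up; negative values, down.
    Hypothetical helper -- the GTO tool builds the formula into the
    worksheet itself.
    """
    if pre == 0:
        raise ValueError("pre-score average must be nonzero")
    return (post - pre) / pre * 100

# Example: an average efficacy score rising from 3.2 to 4.0
print(round(percent_change(3.2, 4.0), 1))  # 25.0
```

Keeping the pre-score, post-score, and percent change on the same row, as the tool suggests, makes it easy to scan for which outcomes moved and by how much.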
As you saw in Step 8, percent change is one way to evaluate change from pre to post. The Data Snapshot Tool is another way. It is available on the RAND website where you get the full GTO guide, shown here in the yellow box. You click on the green zip file button, and that will download some files to your computer, including the tool itself, which is an Excel spreadsheet; the tool with fake data loaded in as an example; and a Word file with step-by-step instructions. There is detailed guidance available, but briefly, the tool has a number of evaluation measures already loaded in. You would pick the ones relevant for your prevention activity and then use the Excel form to enter pre-data and post-data.
After you are done entering the pre- and post-data, the Excel file will automatically generate a chart for each measure with data. Here is an example of a workplace incivility scale. The top of each bar is the average for the pre (the blue bar) and post (the orange bar). The black vertical lines are called confidence intervals—essentially, the possible range for the average once you take into account all the possible error. If the two confidence intervals do not overlap, like here, then you can be confident that there was a real change. In this example, the amount of workplace incivility went down.
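The non-overlap rule just described can be sketched as follows. This is a simplified illustration, assuming a conventional 95% confidence interval of mean ± 1.96 standard errors; the Data Snapshot Tool's spreadsheet may compute its intervals differently, and the incivility scores below are made up.

```python
import statistics

def mean_ci(scores, z=1.96):
    """Return (mean, lower, upper) for an approximate 95% confidence
    interval, computed as mean +/- z * standard error. Illustrative
    only; the actual spreadsheet may use a different method."""
    m = statistics.mean(scores)
    se = statistics.stdev(scores) / len(scores) ** 0.5
    return m, m - z * se, m + z * se

def intervals_overlap(a, b):
    """True if two (mean, lower, upper) intervals overlap. When they
    do not overlap, you can be fairly confident the change is real."""
    return a[1] <= b[2] and b[1] <= a[2]

# Hypothetical workplace incivility scores (lower is better)
pre = [4, 5, 4, 5, 4, 5, 4, 5, 4, 5]
post = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3]
print(intervals_overlap(mean_ci(pre), mean_ci(post)))  # False: a real drop
```

Note that overlapping intervals do not prove there was no change; they only mean the data cannot rule out chance as an explanation.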
This is a lot to cover here, so when you download the files, one of those files is a step-by-step instruction guide that provides detailed instructions. Go ahead and pause here and take a look at Joint Base Hensonburg's results reported in the Step 8 tool.
Okay. Now that we have our Steps 7 and 8 complete, Step 9—or continuous quality improvement—is about pulling together evaluation data and all the GTO steps to tell us something about the activity's effectiveness. What is continuous quality improvement, or CQI? It actually started in industry—like automaking—and has become very popular now in social services. It means a systematic effort to review what has happened, try something to get better results, and see how it goes. It is like a mentality or a habit, and that's why it is continuous. Quality means effectiveness or success, and improvement means trying to do better. Again, we are still in the continuous evaluation part of the Prevention Plan of Action.
Step 9 is important because CQI takes advantage of what you've learned over time from your process and outcome evaluations to improve the prevention activity for the future without starting over from the beginning. It puts the investment made in evaluation to work by using results to make changes and understand their effects as you continue to implement your activity. It helps all staff involved to keep your activity fresh and a good fit for your participants, your organization, and your community.
As shown here, GTO Step 9 will help you use your process and outcome evaluation data to determine what worked well, where there is room for improvement, and what changes may be needed the next time you run the prevention activity. You can see here that CQI is a process for deciding what change can be made that will result in improvement. This cycle can be found in other military performance assessment tools and throughout industry and health and social services.
A CQI review tool will prompt you to summarize your evaluation data and work back through GTO Steps 1–8 as you assess what went well and what should be improved. You will decide whether you met the goal and desired outcomes using results from your process and outcome evaluations. This will prepare you to decide whether and how to revise your goals and desired outcomes, reassess fit and capacity, and revise your work plan for the future.
GTO Step 9 has one tool, the CQI Review Tool. Also, the guide contains information for preparing a third leadership briefing to share evaluation results and next steps for prevention with your leadership.
Let's walk through an example of a completed CQI tool that builds on the prior two examples, bringing together their process and outcome evaluation data. The tool has three parts. This is part one. Here, the tool instructs you to reprint the relevant target problem and desired outcomes from earlier steps to make sure your data is aligned with these elements. Then, reviewing the data, you record whether you reached, missed, or exceeded the desired outcome. In this example, there was not a 10% increase in bystander behavior, so this desired outcome was missed. Then, based on how you did on the desired outcome, you develop a brief plan of action. If you reached the desired outcome, then the plan would be simply to continue as before. Here you can see that changes are needed because the desired outcome was missed. You would repeat part one for each desired outcome you had.
In part two, you organize your process evaluation results—including the target population, demographic data, participation, and interpretation. We have looked at this data before, but it is helpful to have it organized all in one place.
This part of part two addresses how representative your evaluation results are of the larger target population.
The third part of the CQI tool is the review of past GTO steps. The idea is that, by assessing past steps, you can find clues that help explain your results and identify areas that need to improve. The leftmost column poses the question to answer step by step. The middle column is where you record your answer, and the right-hand column asks for a brief action plan to make improvements the next time the prevention activity is attempted. Here you can see that when revisiting GTO Step 2, the base realized that both their data and the perceptions of their team indicated that mid-level leadership might not have been supportive enough, and the base has ideas for how to improve that in the future.
The tool continues with the other steps, including a brief review of the process and outcome evaluation data from GTO Steps 7 and 8. Here you can see that multiple steps point to the need to engage mid-level leadership, and the base makes plans to do that. Here is a good place to pause the video to take a closer look at the complete example of the CQI Review Tool shown in the GTO guide.
Step 10 is the last GTO step, and it raises several questions. If my prevention activity is successful, how can we keep it going? How can we pay for the prevention activity after the initial funding runs out? What else do we need to keep it going? We are still in the quality implementation/continuous evaluation part of the Prevention Plan of Action.
You can answer these questions with the Sustainability Review Tool, which lays out a series of tasks that have been proven to further sustainability. Things like writing a prevention activity into people's job descriptions, designating someone to run the activity, and having a defined line item in a budget that funds the activity are all important factors in sustainability. So, the tool prompts you to think about these issues and make a plan for sustaining an effective activity. Not only that, but it provides documentation you can share with others—for example, with leadership or colleagues—using the same activity. This is the final step in the GTO process, though as you know, it's a cycle that continues as long as the activity is in place. Pause here and check out the tool in Step 10 of the GTO guide.
Sustaining a prevention activity is important for lots of reasons. First, if the original problem still exists and your activity shows that it achieves outcomes, then there is still a need for your activity. By sustaining the activity, your installation and its service members will continue to get the benefits from the large investment in starting the activity. Sustaining effective prevention activities maintains the positive feelings your successful activity generated among installation leaders, service members, and funders, and adds to your reputation for delivering quality, evidence-based prevention. Finally, sustainment is a core function well known in the military.
Sustainability involves a deliberate effort to integrate the core elements of the prevention activity into the routine of your installation. Two important questions to consider during this step: What is working that should be sustained? And how do we sustain activities that work?
Looking across research on sustainability, there are eight key elements that make sustainability more likely, which in turn point to specific actions you should take at your setting. First, the more diverse the funding sources for an activity are, the more likely it will be sustained, because it will not be dependent on just one source. The same applies to leader support: the more leaders that support the activity, financially or otherwise, the better. Other factors that promote sustainability are training more people rather than just one, keeping up the capacity to deliver the activity as intended, and integrating the activity into existing structures. For example, having the activity written into job descriptions, planning documents, and budgets will make it more likely that the activity will be sustained. The remaining elements are demonstrating that the activity has value (that's why evaluation is critical), cultivating champions or influential personnel up and down the chain of command who support the activity, ensuring the activity has oversight, and ensuring there is ample documentation for the activity to enable its continuation if the original implementers leave.
Like GTO Step 9, GTO Step 10 involves a global or comprehensive review of, first, what you have done to date and, second, what you will do in the future to promote sustainability. You will also rely heavily on many of the tools from Steps 1–9 to guide your discussions about sustainability efforts. Using the Step 10 tool, you'll describe what you have done to date and how you want to do things in the future.
After you've gone through the evaluation and quality improvement steps, you may determine that your prevention activity isn't performing well. If it was well implemented but the outcomes weren't achieved, the program may not be effective in your setting, so it may be appropriate to discontinue the activity. While this can be disappointing and disruptive, discontinuing an activity based on sound data analysis can be the best decision. A key point is that it will be important to inform your stakeholders about how you reached your decision. If leaders understand it was a data-driven decision, they will likely be supportive. After discontinuing an activity, it will be important to restart the GTO process to choose another prevention activity that can address the problems and achieve the goals and desired outcomes you originally set.
Another way to think about sustainability is through the lens of the ten GTO steps themselves. That's because each action you take across the ten steps contributes to the overall sustainability of your activity. In Step 1, you make sure the activity meets the need. In Step 2, you make sure the goals and outcomes are valued by stakeholders. Across Steps 3–5, you make sure the activity works, is a good fit, and has the necessary capacities. And in Step 6, you make sure the activity has a good plan and is documented well enough that others can take it over if there is turnover.
Continuing on through the GTO steps, evaluation and quality improvement are also key factors in sustainability. And finally, it's important to have an actual sustainability plan in place. It will probably not surprise you that GTO has a tool to help you plan for sustainability.
Along with Tip 10.1 in the guide, GTO has the Sustainability Review Tool. Here's an example of how Joint Base Hensonburg filled out their Sustainability Review Tool. The tool has two parts. The first part, shown here, prompts personnel to document the need for the prevention activity, the strength of the results, and what justification there is for the activity going forward, all based on the most recent implementation of the activity. There is space for direct answers and next steps for each prompt. Continuing on, our examination of the current status includes questions about what needs to change about how we do the activity and who will support it.
In part two of the tool, there are prompts that ask about where materials will be stored, who will be running the activity, who the champions will be, and who will run the evaluation in the future. These prompts should look familiar, as they align with some of the best sustainability practices we reviewed earlier. Also in part two of the tool are questions about continued funding, timing for the next implementation, and maintaining training levels. An example of Joint Base Hensonburg's completed Sustainability Review Tool can be found in the GTO guide.
Congratulations. You have progressed through all ten GTO steps. Using this guidance and materials, you should be well on your way to developing strong prevention activities that can keep DoD service members and civilians safe from sexual assault and harassment.
Just a reminder: We have several GTO guidance documents that DoD personnel can download at any time to support sexual assault and harassment prevention, as shown in the GTO onboarding roadmap. Intended for new personnel who might not be familiar with GTO, the roadmap is also a handy reference for where all the GTO materials are. The materials include the GTO handbook, a shortened online guide that is a good introduction to GTO, and the full GTO guide. Both not only have PDFs of the guidance but also Word versions of the tools and other support documents.
Finally, this is the link to another GTO product, developed for the Air Force, that focuses specifically on sexual harassment. Again, although developed for the Air Force, it could easily be used by any service branch.
If you have any questions, feel free to reach out to me or my colleague Joie Acosta at the contact information shown in this video. Thank you for accessing this video, the last of three videos about using Getting To Outcomes to strengthen sexual assault prevention.