Applying the TIP Tool: A Case Example

A STRONG STAR psychotherapy training focused on prolonged exposure (PE) for PTSD.

Pilot testing of the TIP Tool suggests that the tool helps psychotherapy training leaders assess how well their trainings meet their goals. The STRONG STAR Training Initiative, one of our partners in pilot testing, shared how they used the TIP Tool to reflect on their training in Prolonged Exposure (PE) for posttraumatic stress disorder (PTSD). This case study provides an overview of how an organization may use the TIP Tool, including a four-step process for interpreting and acting on the results.


STRONG STAR’s Training in Prolonged Exposure

The STRONG STAR Training Initiative is committed to training community mental health providers who serve veteran populations in evidence-based treatments for PTSD and related mental health problems. One of the STRONG STAR psychotherapy trainings focuses on prolonged exposure (PE) for PTSD. PE is a time-limited, cognitive behavioral therapy that focuses on supporting the patient in gradually and systematically approaching and discussing trauma-related content in order to reduce PTSD symptoms. PE is a recommended treatment for PTSD according to clinical practice guidelines from the U.S. Department of Veterans Affairs/Department of Defense and the American Psychological Association.

STRONG STAR conducts its trainings using a learning community model, which includes:

  • pre-training readings and webinars,
  • in-person training,
  • group telephone case consultations, and
  • an online portal with relevant resources.

The PE training is delivered in person as a two-day didactic workshop. Reading materials are provided in advance. The didactic training covers the theoretical basis for the intervention and its core techniques, and includes discussion of case examples, interactive activities, and structured role plays. Clinicians receive resources to assist with the implementation of PE following the training, including session checklists. After the training, clinicians can receive six to twelve months of weekly telephone case consultation, monthly webinars, and organizational consultation.

Applying the Tool

RAND researchers applied the TIP Tool to the Training Initiative’s PE training. To apply the tool, researchers reviewed relevant materials, including clinician applications, a pre-training checklist, workshop agenda, workshop slides, handouts, and course evaluations. RAND researchers also held discussions with a PE consultant, a PE trainer and consultant, and the STRONG STAR Training Initiative Program Director and project coordinators. After gathering relevant information, RAND researchers completed the PDF version of the Tool and entered the scores for each item into the TIP Interactive Tool to generate the TIP Profile. We discussed the TIP Profile with the STRONG STAR Training Initiative Program Director and Director of Training. This discussion led to several ideas for potential changes to the training. Following up with the Training Initiative several months later, we learned which changes were implemented and reassessed the training.

Step 1: Review the TIP Profile

“Applying the TIP Tool allows us to think about which domains could use improvements. This is particularly useful because it makes the process feel more manageable.”

– STRONG STAR Training Initiative Leadership

The TIP Profile provides an overall score along with scores on each of the five TIP Tool domains. Scores fall into one of three categories: Raising Awareness (1.00-2.33), Skill-Building (2.34-3.66), and Supporting Competence (3.67-5.00). The overall score for the training was 3.65, at the top of the Skill-Building range. Three of the five domain scores fell into the Supporting Competence range: Didactic Training Format (4.50), Didactic Training Content (4.25), and Consultation/Supervision (4.00). The Implementation Facilitation score (3.65) was at the top of the Skill-Building range, consistent with the overall training score, and the Program Evaluation score (2.00) fell into the Raising Awareness range.
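The score cutoffs above amount to a simple classification rule. The sketch below, using the scores reported for this training, illustrates that mapping; the function name and dictionary are ours for illustration and are not part of the TIP Tool itself.

```python
def tip_band(score):
    """Map a TIP score (1.00-5.00) to its interpretive category."""
    if not 1.00 <= score <= 5.00:
        raise ValueError("TIP scores range from 1.00 to 5.00")
    if score <= 2.33:
        return "Raising Awareness"
    if score <= 3.66:
        return "Skill-Building"
    return "Supporting Competence"

# Scores reported for the STRONG STAR PE training
scores = {
    "Overall": 3.65,
    "Didactic Training Format": 4.50,
    "Didactic Training Content": 4.25,
    "Consultation/Supervision": 4.00,
    "Implementation Facilitation": 3.65,
    "Program Evaluation": 2.00,
}
for name, s in scores.items():
    print(f"{name}: {s:.2f} -> {tip_band(s)}")
```

Note that the overall score of 3.65 sits just 0.02 points below the Supporting Competence cutoff of 3.67, which is why small changes could shift the training's category.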

Step 2: Determine Whether Scores are Consistent with the Goal of the Training

In reviewing the TIP Profile with the Training Initiative’s leadership, RAND researchers learned that the program aimed to support clinicians in achieving competence in PE. “We want to be as close to competency [as possible], but scalable,” the Training Initiative’s leadership reported. The TIP Profile indicated that several aspects of the training were consistent with this goal, but also pointed to aspects that could be changed to better support it. Because the overall score was at the top end of the Skill-Building range, even small changes could move the training into the Supporting Competence range.

Step 3: Use the TIP Profile to Identify Targets for Changes in the Training

Discussion of the TIP Profile led to several ideas for potential changes to the training. Some suggested changes were potentially less resource-intensive. For example, regarding Didactic Training Format, conducting role plays in groups of three would allow one person to observe and assess skill performance using a structured coding tool (Item IC). They also considered ways to more formally engage participants in a discussion of implementation obstacles (Item VA), noting, “there are great questions about the [implementation] barriers…in the [clinician] application, but those are not used elsewhere in the training.” They suggested creating “a rubric where [clinicians] receive feedback on their pre-work to help work through barriers,” which could then be turned into an implementation plan.

Other modifications, however, had the potential to require more significant changes to the training or to the resources needed to deliver it. For example, the Training Initiative considered the feasibility of adding a knowledge test (Item IVB) or skills test (Item IVC) at the end of the in-person didactic training. However, they also discussed whether a change in skills would be expected from the two-day didactic component alone. They also discussed potential changes to the consultation component of the training, such as incorporating review of audio recordings of sessions (Item IIIB). However, they noted that it could be challenging and resource-intensive for clinicians to submit audio recordings of their treatment sessions, given the Training Initiative’s commitment to keeping the program scalable and responsive to the needs of community-based clinicians in a variety of practice settings.

Step 4: Implement Changes and Reassess the Training

During a follow-up discussion with RAND researchers, the Training Initiative leadership described changes implemented to the PE training. Based on these changes, RAND researchers re-scored the PE training using the TIP Tool. The overall score for the PE training increased to 3.94, which falls into the Supporting Competence range. The PE training is now more consistent with the Training Initiative’s goal of helping community-based clinicians reach adherence and competence in delivering PE.

The Training Initiative made two changes to the training, both related to the Program Evaluation domain. First, they developed a more systematic mechanism for sharing the results of course evaluations with trainers by creating a summary report that includes mean scores on evaluation items. This increased their score on the course evaluation item (Item IVA) from 4 to 5. They also described ways they have adjusted their trainings based on feedback, such as launching a message board as part of their provider resource portals. Second, they developed a knowledge test to be administered before and after the training. This increased their score on the knowledge test item (Item IVB) from 1 to 5. Based on these changes, the score on the Program Evaluation domain increased to 3.67, which falls into the Supporting Competence range.
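The before-and-after figures are consistent with a domain score computed as the mean of its item scores. The sketch below illustrates that arithmetic under two assumptions of ours: that the Program Evaluation domain comprises three items (IVA, IVB, and the skills test, IVC), and that the skills test item was unchanged at 1, since a skills test was considered but not added.

```python
def domain_score(item_scores):
    """Illustrative: average the 1-5 item scores within a domain."""
    return round(sum(item_scores) / len(item_scores), 2)

# Program Evaluation items: [IVA course evaluation, IVB knowledge test, IVC skills test]
before = [4, 1, 1]  # initial item scores (IVC assumed unchanged at 1)
after = [5, 5, 1]   # IVA raised 4 -> 5, IVB raised 1 -> 5

print(domain_score(before))  # 2.0, matching the initial Program Evaluation score
print(domain_score(after))   # 3.67, matching the reassessed score
```

Under this assumed averaging, the two item-level changes alone account for the domain's move from Raising Awareness (2.00) to Supporting Competence (3.67).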

[Figure: STRONG STAR TIP Profile for the Prolonged Exposure for PTSD training]

The STRONG STAR Training Initiative has continued to consider ways to incorporate more implementation facilitation content into their trainings. For example, as part of trainings they are conducting for a specific set of agencies, they are facilitating biweekly implementation meetings. Because this type of implementation facilitation support would be too time- and resource-intensive to incorporate as a standard component of all trainings, they are also piloting implementation profiles that reflect the status of trainees’ agencies, which can then be used to inform the content of the trainings. However, these efforts are still in the early stages and have not been formally incorporated into the PE training.

After describing completed and planned changes, the Training Initiative reflected on the value of the TIP Tool. They noted that it has been useful for assessing whether their trainings are achieving the goal of supporting clinician competence: “There are lots of different ways this tool is helpful for people to understand the difference between attending a webinar and what is needed to learn a technical skill.”