This article describes our assessment of SimCoach, a computer program featuring a virtual human who speaks and gestures in a video game–like interface, designed to encourage service members, especially those with signs or symptoms of posttraumatic stress disorder (PTSD) or depression, to seek help to improve their psychological health.
Background
As of December 31, 2012, more than 2.5 million service members had served in Operation Enduring Freedom (OEF) or Operation Iraqi Freedom (OIF) (Defense Manpower Data Center, 2013). Estimates suggest that approximately 18.5 percent of combat troops returning from these conflicts meet structured survey criteria for PTSD or depression (Tanielian and Jaycox, 2008). Many also experienced traumatic brain injuries (TBIs) from blasts associated with improvised explosive devices, the signature injury of the OEF and OIF conflicts. TBIs and psychological health problems often co-occur, and this comorbidity can complicate the course of both conditions (Bryan and Clemans, 2013; Barnes, Walter, and Chard, 2012; Hoge, McGurk, et al., 2008).
Despite several initiatives to address their psychological health needs, service members often experience barriers to treatment, including concerns about stigma or threats to career advancement (Hoge, Castro, et al., 2004; Pietrzak et al., 2009; Vogt, 2011; Tanielian and Jaycox, 2008). Innovative solutions are needed to increase the rates at which service members and their families appropriately access and use mental health care for psychological health conditions and TBI.
The Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE) has funded several such programs, including SimCoach. SimCoach is a virtual-reality (VR) platform that allows users to interact anonymously with virtual humans in online environments; as such, users can access helpful information about psychological health and TBI services without privacy concerns. An online VR intervention may be particularly appealing and efficacious for service members because (1) many people turn first to the Internet to answer psychological health questions, (2) the novelty of the intervention may render it engaging, and (3) it preserves users' anonymity in a context in which they may be hesitant to be identified. However, little empirical research has directly tested these hypotheses about online VR interventions for service members seeking help for psychological health concerns, and whether SimCoach itself is acceptable and effective for promoting help-seeking is unknown.
In this article, we describe our assessment of the first public release of SimCoach (SimCoach Beta). The aim of this evaluation was to document and assess SimCoach development procedures and then to test and report the program's efficacy for promoting help-seeking for signs and symptoms of PTSD and depression among service members. The evaluation design included two components. First, we conducted a formative evaluation to describe SimCoach's design and content relative to established best practices in software development and clinical intervention development. Second, we conducted a summative evaluation in which we used a randomized controlled trial (RCT) to test SimCoach Beta's efficacy in increasing participants' intention to seek treatment relative to that of controls. This article summarizes the results of the formative and summative evaluation efforts and makes recommendations to SimCoach developers and developers of similar interventions, as well as to policymakers who invest in the development of VR interventions for service members' psychological health.
Evaluation Methods
Formative Evaluation
In the formative evaluation, we collected data from interviews with SimCoach developers, from published and supporting materials provided by the SimCoach team, and from direct assessment of the SimCoach Beta interface. These data were used to compare the development process with best practices for software engineering and development of behavioral interventions. We reviewed the SimCoach Beta interface for content and features, assessing consistency with established evidence and best practices for conducting surveys among participants at risk for distress.
Summative Evaluation
Study participants interacted with the SimCoach Beta program's virtual human. Interactions with the virtual human included conversational dialogue prior to the administration of one of two adapted instruments for assessing psychological health—the PTSD Checklist (PCL) and the nine-item Patient Health Questionnaire (PHQ-9)—followed by personalized recommendations. Participants for the RCT were recruited online through Google ads and through websites and email lists targeting service members. The RCT compared help-seeking outcomes across three study arms: (1) the SimCoach arm, in which respondents completed the outcome measures after interacting with the SimCoach Beta tool; (2) a content-matched control arm, identical to the SimCoach arm except that the virtual-human interaction was replaced with conventional online text-based methods; and (3) a no-treatment control arm, in which participants completed the outcome assessments without any additional screening or recommendations. The primary outcome measure was intention to seek help for PTSD or depression, with secondary outcomes related to perceived barriers to seeking and accessing care.
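As background for readers unfamiliar with these instruments, the following sketch (in Python) illustrates how the standard, unadapted PHQ-9 is conventionally scored (nine items rated 0 to 3, summed to a total of 0 to 27 with published severity bands; Kroenke, Spitzer, and Williams, 2001) and how participants might be assigned to three arms. It is a generic illustration under those assumptions, not a reproduction of the adapted "conversational" instruments or the randomization procedure actually used in SimCoach Beta or our trial.

    import random

    # Hypothetical arm labels; the trial's actual allocation procedure is not shown here.
    ARMS = ["simcoach", "content_matched_control", "no_treatment_control"]

    def assign_arm(rng: random.Random) -> str:
        """Assign a participant to one of three arms with equal probability (illustrative only)."""
        return rng.choice(ARMS)

    def score_phq9(item_responses: list[int]) -> tuple[int, str]:
        """Score the standard PHQ-9: nine items rated 0-3, total 0-27, with conventional severity bands."""
        if len(item_responses) != 9 or any(r not in (0, 1, 2, 3) for r in item_responses):
            raise ValueError("PHQ-9 requires nine item responses coded 0-3")
        total = sum(item_responses)
        if total >= 20:
            severity = "severe"
        elif total >= 15:
            severity = "moderately severe"
        elif total >= 10:
            severity = "moderate"
        elif total >= 5:
            severity = "mild"
        else:
            severity = "minimal"
        return total, severity

    rng = random.Random(0)
    print(assign_arm(rng), score_phq9([1, 2, 0, 3, 1, 1, 0, 2, 1]))  # prints the assigned arm and (11, 'moderate')

The standard PCL follows a similar summed-score logic; because the trial used adapted versions of both measures, the cutoffs above should not be read as thresholds that SimCoach Beta itself applied.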
Evaluation Findings
Formative Evaluation
The formative evaluation showed that SimCoach Beta (the version included in the summative evaluation RCT) was reliable and that the technical development approach was consistent with software development best practices, with a scalable architecture and iterative development strategy incorporating multiple rounds of user feedback (Scott et al., 2011).
Our assessment found that development of SimCoach Beta's clinical intervention content was aligned with best practices insofar as the developers established a panel of reputable domain experts from the earliest stages to inform development. However, our review of SimCoach Beta content indicated that the personalized recommendations the program offered were less well developed. For instance, we identified cases in which the program directed users to web pages that did not mention treatment or help-seeking. In reviewing program responses to text entries indicating user distress, we also found that the list of phrases that would trigger directions to seek help was limited, such that some user expressions of distress might not trigger an appropriate referral to help or emergency services. Other technical limitations were that the program employed adapted, “conversational” versions of standardized screening measures for depression and PTSD that have not been fully validated and that it did not support the mobile platforms from which many users responding to online advertisements during pilot testing attempted to access it.
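To make the trigger-phrase limitation concrete, the hypothetical Python sketch below shows why exact matching against a fixed phrase list can miss paraphrased expressions of distress; the phrase list, function name, and matching logic are illustrative assumptions rather than SimCoach Beta's actual content or code.

    # Assumed, simplified phrase list for illustration; not SimCoach Beta's actual trigger list.
    TRIGGER_PHRASES = {"want to hurt myself", "kill myself", "end my life"}

    def needs_crisis_referral(user_text: str) -> bool:
        """Return True if any listed trigger phrase appears verbatim in the user's message."""
        text = user_text.lower()
        return any(phrase in text for phrase in TRIGGER_PHRASES)

    print(needs_crisis_referral("Sometimes I want to hurt myself"))      # True: listed phrase present
    print(needs_crisis_referral("I don't see a reason to keep going"))   # False: distress expressed, but no listed phrase matches

The second example shows the failure mode described above: a user can express distress without using any listed phrase, so purely list-based referral logic never fires.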
The main finding from the formative evaluation is that SimCoach Beta testing had a much greater focus on user experiences than on the outcome of interest for our study—the program's efficacy in improving users' intentions to seek help. This focus on user experience was reflected in our findings from the summative evaluation.
Summative Evaluation
Our summative evaluation assessed help-seeking outcomes, perceptions of barriers to help-seeking, and user experience among SimCoach Beta users compared with controls who received a text-based version of the SimCoach Beta intervention (content-matched but with no virtual-human interaction) and with a no-treatment control consisting of outcome assessments only. Of the 1,362 users who accessed the initial screener, 557 were found eligible and randomized to one of the three arms. Of those who were randomized, 280 completed the full trial and survey. We did not detect a significant effect of the SimCoach Beta intervention on help-seeking intentions relative to the no-treatment control.
Recommendations
Our evaluation was based on one version, the first public release (SimCoach Beta), which was intended to be improved upon and extended. As a result, we provide two sets of recommendations: one for SimCoach developers and one for DCoE, which funded SimCoach development.
We recommend that SimCoach developers do the following:
- Implement best practices for the development of help-seeking interventions.
- Given user reluctance to interact with a virtual human, consider new approaches to SimCoach marketing that promote the value of interacting with a virtual human.
- Use validated screening instruments when possible to ensure that these instruments have sufficient reliability, sensitivity, and specificity.
- Consider using an outcome-oriented, iterative development process during subsequent improvements to the SimCoach intervention.
- Continue to design new dialogue and content to meet SimCoach goals of reaching a target audience of service members, veterans, and family members.
- If future versions are found to be effective, develop versions of SimCoach that are compatible with mobile devices and web browsers.
- Consider using SimCoach in other cases in which potentially sensitive questionnaires and information may be delivered.
Our results do not provide conclusive evidence of SimCoach Beta's efficacy. In that light, we offer the following recommendations to guide further decisions about DCoE's involvement with SimCoach and similar programs:
- Consider changing funding models to motivate best practices in intervention development. For example, DCoE could consider requiring submission of pilot data prior to funding larger technology-development projects or requiring the use of validated instruments.
- Support pilot evaluations and dissemination approaches in different contexts.
- Consider investing in strategies to guide the development of technology-based clinical interventions.
- Consider playing an active role in the design and monitoring of outcome-oriented progress metrics for technology-development projects.
Conclusions
Technology-driven behavioral interventions, such as SimCoach, are being widely and rapidly developed and disseminated, yet there is no established set of best practices that marries technology development with the development of interventions to improve psychological health. Although SimCoach Beta software development was consistent with DoD best practices (Scott et al., 2011) and the results of the study suggest that users had a satisfactory experience with SimCoach Beta, participants in the RCT did not show greater intentions to seek help than participants who did not receive any questionnaires or recommendations. In this particular domain, an outcome-oriented approach to developing software for behavioral change may be preferable to an approach oriented primarily toward user experience. Stakeholders in SimCoach and other technology-driven behavioral interventions might consider coordinating a consensus process to create best practices and principles for future reference.
References
Barnes, Sean M., Kristen H. Walter, and Kathleen M. Chard, “Does a History of Mild Traumatic Brain Injury Increase Suicide Risk in Veterans with PTSD?” Rehabilitation Psychology, Vol. 57, No. 1, February 2012, pp. 18–26.
Bryan, Craig J., and Tracy A. Clemans, “Repetitive Traumatic Brain Injury, Psychological Symptoms, and Suicide Risk in a Clinical Sample of Deployed Military Personnel,” JAMA Psychiatry, Vol. 70, No. 7, 2013, pp. 686–691.
Hoge, Charles W., Carl A. Castro, Stephen C. Messer, Dennis McGurk, Dave I. Cotting, and Robert L. Koffman, “Combat Duty in Iraq and Afghanistan, Mental Health Problems, and Barriers to Care,” New England Journal of Medicine, Vol. 351, No. 1, July 1, 2004, pp. 13–22.
Hoge, Charles W., Dennis McGurk, Jeffrey L. Thomas, Anthony L. Cox, Charles C. Engel, and Carl A. Castro, “Mild Traumatic Brain Injury in U.S. Soldiers Returning from Iraq,” New England Journal of Medicine, Vol. 358, No. 5, January 31, 2008, pp. 453–463.
Kroenke, Kurt, Robert L. Spitzer, and Janet B. Williams, “The PHQ-9: Validity of a Brief Depression Severity Measure,” Journal of General Internal Medicine, Vol. 16, No. 9, September 2001, pp. 606–613.
Lang, Ariel J., K. Wilkins, P. P. Roy-Byrne, D. Golinelli, D. Chavira, C. Sherbourne, R. D. Rose, A. Bystritsky, G. Sullivan, M. G. Craske, and M. B. Stein, “Abbreviated PTSD Checklist (PCL) as a Guide to Clinical Response,” General Hospital Psychiatry, Vol. 34, No. 4, July–August 2012, pp. 332–338.
Pietrzak, Robert H., Douglas C. Johnson, Marc B. Goldstein, James C. Malley, and Steven M. Southwick, “Perceived Stigma and Barriers to Mental Health Care Utilization Among OEF-OIF Veterans,” Psychiatric Services, Vol. 60, No. 8, August 2009, pp. 1118–1122.
Scott, John M., David A. Wheeler, Mark Lucas, and J. C. Herz, Open Technology Development (OTD): Lessons Learned and Best Practices for Military Software, Washington, D.C.: U.S. Department of Defense, May 16, 2011. As of August 6, 2014:
http://dodcio.defense.gov/Portals/0/Documents/FOSS/OTD-lessons-learned-military-signed.pdf
Tanielian, Terri, and Lisa H. Jaycox, eds., Invisible Wounds of War: Psychological and Cognitive Injuries, Their Consequences, and Services to Assist Recovery, Santa Monica, Calif.: RAND Corporation, MG-720-CCF, 2008. As of August 6, 2014:
http://www.rand.org/pubs/monographs/MG720.html
Vogt, Dawne, “Mental Health–Related Beliefs as a Barrier to Service Use for Military Personnel and Veterans: A Review,” Psychiatric Services, Vol. 62, No. 2, February 2011, pp. 135–142.
This research was sponsored by DCoE and conducted within the Forces and Resources Policy Center of the RAND National Defense Research Institute, a federally funded research and development center sponsored by the Office of the Secretary of Defense, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community.