Artificial intelligence technology combats suicide in veterans
Mobile and social networking technology monitors big data from messages to detect suicide risk in military veterans
By Chris Poulin and Gregory Peterson Posted on 11 November 2015
Chris Poulin is a contributing author to the book Artificial Intelligence in Behavioral and Mental Health Care, recently published by Elsevier. He was Director and Principal Investigator of the Durkheim Project, a nonprofit big-data collaboration that developed the technology he describes in the book and this story. Gregory Peterson was a board member of the Durkheim Project.
As Veterans Day is observed throughout the United States, the country faces a persistent crisis of veteran suicide: an average of 22 veterans take their own lives every day. Concern over the rise in veteran suicides has brought renewed attention from military personnel, elected officials, concerned families and medical professionals.
Among the key questions is “how can we improve options for monitoring at-risk veterans and effectively facilitate suicide interventions?”
A new technology that uses big data to predict suicide risk in real time could help medical professionals and social workers intervene and prevent suicide. This artificial intelligence technology is featured in the book Artificial Intelligence in Behavioral and Mental Health Care, recently published by Elsevier.
A frustrating and tragic problem
US Army records on suicide rates only go back to the early 1980s, so no one has historical data to provide a long-term perspective on this tragic problem. But we do know that America’s military leaders have long recognized the dire threat posed by suicide — and the inability to address this formidable enemy using traditional methods.
In 2012 former Secretary of Defense Leon Panetta called military suicide “perhaps the most frustrating challenge that I've come across since becoming Secretary of Defense.” And when the Armed Services Committee’s Subcommittee on Military Personnel held hearings in October 2015, testimonies from America’s military leaders confirmed that the nation’s suicide problem is persistent.
If these hearings are a fair indicator, America's military leaders are recognizing the limitations of current medical approaches to suicide prevention. The military narrative also seems to be acknowledging a need to supplement traditional mental health treatment with innovative, if not well defined, approaches.
In recent Congressional testimony, Dr. Keita Franklin, Director of the Defense Suicide Prevention Office (DSPO), said:
Strong data and surveillance methodologies help us identify our most at‐risk populations.
And Lt. General James C. McConville of the US Army said:
We are committed to reviewing the ‘how’ and ‘why’ from every case to learn from it.
Both of these quotes are, in fact, accurate descriptions of how technology should be used in the fight against suicide. Currently, traditional ‘off-line’ approaches are favored, such as the Columbia-Suicide Severity Rating Scale, an assessment used to determine suicide risk.
Unfortunately, there’s a wide gap between current military practices and the potential of innovative approaches that could be used to protect those who are most at-risk. America’s Department of Defense (DoD) needs a more innovative and effective approach.
A new approach to an old problem
The suicide crisis in veterans requires the marriage of traditional medicine with emerging technologies that have proved efficacious, both in suicide intervention and for mental health issues more generally. In stark contrast to the standalone, traditional medical approaches are the successful R&D efforts in opt‐in mass surveillance technology. The noteworthy example – one that has led to the new suicide risk predicting technology – is “The Durkheim Project,” sponsored by the Defense Advanced Research Projects Agency (DARPA).
The Durkheim Project analyzes unstructured linguistic data from social and mobile sources to predict mental health risk. This is facilitated by a predictive analytics engine (Predictus). The predictive engine is integrated with the latest big data technologies (e.g., Hadoop and other open-source technology). This combination facilitates a clinical dashboard so timely interventions can be taken to save those at risk.
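To make the pipeline concrete, here is a minimal illustrative sketch of linguistic risk scoring. The Durkheim Project's actual Predictus engine is far more sophisticated (and proprietary); the phrase weights, threshold, and function names below are invented purely for demonstration and have no clinical validity.

```python
# Toy sketch of text-based risk flagging, NOT the Durkheim Project's
# actual model. Weights and threshold are hypothetical illustrations.

# Hypothetical weights for risk-associated terms (not clinically derived).
RISK_WEIGHTS = {
    "hopeless": 0.4,
    "burden": 0.3,
    "alone": 0.2,
    "goodbye": 0.5,
}

ALERT_THRESHOLD = 0.6  # hypothetical cutoff for surfacing a message


def risk_score(text: str) -> float:
    """Sum the weights of risk-associated terms found in a message."""
    words = text.lower().split()
    return sum(RISK_WEIGHTS.get(w, 0.0) for w in words)


def flag_for_review(text: str) -> bool:
    """Return True if a message crosses the alert threshold,
    signalling a clinical dashboard to surface it for human review."""
    return risk_score(text) >= ALERT_THRESHOLD


if __name__ == "__main__":
    print(flag_for_review("i feel hopeless and alone tonight"))  # True
    print(flag_for_review("great day at the lake"))              # False
```

In a real system of this kind, the keyword lookup would be replaced by a statistical classifier trained on labeled clinical text, and the flag would route to a human clinician rather than triggering any automated action.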
The Durkheim Project was a nonprofit research effort that ran from late 2011 to early 2015, focusing on using big data to inform knowledge on suicide. The initiative was named in honor of Emile Durkheim, the sociologist who pioneered the quantitative study of human behavior, including suicide, in his 1897 book Suicide.
The project was run by a multidisciplinary team of artificial intelligence and medical experts from the Geisel School of Medicine at Dartmouth, the U.S. Department of Veterans Affairs, Patterns and Predictions and big data firm Cloudera. Together these professionals formed a team dedicated to applying big data research on suicide at the intersection of artificial intelligence and medicine.
The project completed its last testing phase earlier this year. Initially, the project team successfully identified suicidal intent better than state‐of‐the‐art medical approaches by analyzing text derived from medical records. They then built a large-scale opt‐in network to help medical professionals and social workers intervene – the new artificial intelligence solution. It is garnering bipartisan Congressional attention for its potential as a “game-changing” aid in isolating risk in the veteran suicide crisis.
Protecting US veterans
In the wake of this fall’s Armed Services Committee Hearings, Congressman Duncan Hunter, a former Marine who served in both Iraq and Afghanistan, was prompted to take action. In late October, Rep. Hunter issued an urgent letter calling on Secretary of Defense Ashton Carter to bolster the DoD’s core strategies for suicide risk reduction among veterans, arguing that veterans would be better protected with a national strategy “...which would provide a more accurate indicator of at-risk service members and allow for successful intervention and treatment.”
Rep. Hunter also advocated the use of artificial intelligence technology, which has proven more effective than traditional medical methods in health monitoring and the detection of suicide risk among at-risk veterans. Specifically, the Congressman cited research findings from The Durkheim Project. In the letter, he wrote:
“It has come to my attention that the Defense Advance Research Project Agency (DARPA) funded a pilot study using big data to identify and predict mental health and suicide risk. This study, known as the Durkheim Project, took advantage of new data sources, better technologies, and advances in predictive analytics to successfully isolate suicidal intent better than state-of-the-art medical practice.”
In the letter, Rep. Hunter requested Secretary Carter’s answers to key questions in three important areas related to veterans' suicide:
- How has suicide risk been reduced? Has there been a quantifiable reduction in the number of active duty and veteran service members who have taken their lives?
- What are the latest efforts in risk detection? Is the military isolating those who are truly at risk versus those who are seemingly at risk in enough time to reduce the likelihood of the event?
- Should the DoD employ “precise medicine” to isolate at-risk individuals for interventions/treatment rather than employing broad-based efforts?
In closing, Rep. Hunter wrote:
I respectfully request your response to the questions raised in this letter and ask that you address the DOD's core strategy for a risk reduction plan – in particular how DoD can employ the tools used in The Durkheim Project to reduce the high rate of suicide among our active duty personnel and our veterans.
Congressman Hunter’s initiative is a welcome new voice in the fight to protect America’s veterans against suicide. And out of The Durkheim Project has emerged a new team, which is now proposing an “end-to-end” national strategy that incorporates technological advances integrated with medical resources.
This program, now under development, recommends four essential elements for reducing suicide among veterans:
- Outreach: Sign up hundreds of thousands of veterans (rather than the small, isolated “pilot” programs currently in place).
- Digital Risk Assessment: Use state-of-the-art data mining to isolate mental health risk factors for participating veterans.
- Resource Allocation: Maximize available resources – through enabling interventions by both medical professionals and peers.
- Effective Intervention: Both quantify and qualify the efficacy of risk-remediation efforts.
The proposed new system will be built in partnership between Patterns and Predictions and Cloudera, both participants in the original Durkheim work.
America’s active duty and veteran soldiers are facing challenging mental health issues. With The Durkheim Project, the research team demonstrated that technology and medical innovation can play an important role in providing that desperately needed help. The question now is not one of technological prowess, but of our military and political leaders’ readiness to recognize veterans’ needs, and to embrace proven methods to provide the protection they deserve.
About the book
Artificial Intelligence in Behavioral and Mental Health Care (Elsevier, 2015) summarizes recent advances in artificial intelligence as it applies to mental health clinical practice. Each chapter provides a technical description of the advance, review of application in clinical practice, and empirical data on clinical efficacy.
In addition, each chapter includes a discussion of practical issues in clinical settings, ethical considerations, and limitations of use. The book encompasses AI-based advances in decision-making, assessment and treatment, client education, robot-assisted task completion, and the use of AI for research and data gathering.
The editor is Dr. David D. Luxton, a Research Health Scientist at the Naval Health Research Center in San Diego and an Affiliate Associate Professor in the Department of Psychiatry and Behavioral Sciences at the University of Washington School of Medicine in Seattle. He previously served as a Research Psychologist and Program Manager for the US Army and as a Secure Communications Systems Technician in the US Air Force.
Dr. Luxton's research and writing is focused in the areas of military and veterans’ health, telehealth, mobile health, artificial intelligence, and emergent technology applications in healthcare. He serves on various national committees and workgroups and he provides training and consultation regarding the use and development of technology in healthcare. He is a licensed clinical psychologist in the State of Washington.
Elsevier Connect Contributors
Chris Poulin is a contributing author to this book; he led the research team’s work described in Chapter 9: "Public Health Surveillance: Predictive Analytics and Big Data." He is currently Principal Partner of Patterns and Predictions, a big data prediction company, and was Director of The Durkheim Project, a nonprofit big data collaboration with the U.S. VA and Facebook, Inc. Previously, he was Co-Director of the Dartmouth Metalearning Working Group at Dartmouth College, working on large-scale machine learning. He has also lectured in artificial intelligence at the US Naval War College.
Gregory Peterson (@gpeterson) is a communications consultant and writer. His company is Archetype Communications – a Boston Public Relations firm. Gregory holds degrees in law, public administration and communications. He was a Fellow at the Harvard Kennedy School and the Humphrey Institute of Public Affairs, and served as an adjunct lecturer at Boston University’s College of Communication. He was Project Director at TED Conferences, VP of Public Affairs at Biogen, Communications Director for the Governor of Minnesota, and a staff member at Minnesota Public Radio. You can read more about him on his website and Contently.