Machine Learning in Mental Health at Mass General
In This Article
- Jordan Smoller, MD, and colleagues at Massachusetts General Hospital are applying big data and machine learning to better understand suicide risk in patients
- Many of the risk factors identified were already well known, but others, such as certain kinds of infections and a history of specific orthopedic fractures, sparked further interest
- The team is working to turn the algorithm into a tool to supplement and inform clinical judgment
In his work in mental health and psychiatry, Jordan Smoller, MD, ScD, with the help of colleagues Ben Reis, PhD, and Matthew Nock, PhD, is tapping into the immense potential at the junction of big data and machine learning.
His particular data sets are vast collections of electronic health records (EHRs). The ultimate goal is a clinical decision support tool that could be used on demand and in real time when a patient visits a physician to determine whether he or she is at increased risk of suicide, and to inform or motivate more comprehensive screening and potential intervention by the clinician.
Dr. Smoller is well on his way to achieving that aim. He and his team first began working with large longitudinal EHR databases in 2009 and, thanks to support from the Tommy Fuss Fund, were able to use the data to develop a risk prediction algorithm, or what Dr. Smoller calls a suicide "early warning alert system." It incorporates more than 30,000 potential predictors, or variables, from the EHR, each contributing to an overall patient risk score. In the algorithm's initial test within the Partners HealthCare system, it detected around 45% of suicide attempts with 90% specificity, an average of two to three years in advance.
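The article does not detail the underlying model, but the general shape of such an early-warning system can be sketched: thousands of coded EHR variables each contribute, through a learned weight, to a single risk score, and the alert threshold is chosen to hit a target specificity. The Python sketch below is purely illustrative; the simulated data, feature counts and off-the-shelf logistic regression are assumptions, not the team's actual algorithm.

```python
# Illustrative sketch only: a generic EHR-based risk score with an alert
# threshold tuned for 90% specificity. The simulated data, feature counts,
# and logistic-regression model are assumptions, not the team's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated patient histories: many coded EHR indicators
# (diagnoses, medications, procedures), one row per patient.
n_patients, n_features = 10_000, 1_000
X = rng.binomial(1, 0.02, size=(n_patients, n_features)).astype(np.float32)

# A small subset of variables carries signal; the outcome is rare.
weights = rng.normal(0, 1.5, size=n_features) * (rng.random(n_features) < 0.02)
logit = X @ weights - 4.0
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = later suicide attempt (simulated)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Each variable contributes, via its learned weight, to one overall risk score.
model = LogisticRegression(max_iter=2000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# Set the alert threshold so 90% of non-cases fall below it (90% specificity),
# then measure what fraction of true cases the alert would catch.
threshold = np.quantile(scores[y_test == 0], 0.90)
sensitivity = (scores[y_test == 1] >= threshold).mean()
print(f"Sensitivity at 90% specificity: {sensitivity:.2f}")
```

Fixing specificity and reporting the resulting sensitivity mirrors how the 45%-detection-at-90%-specificity result above is framed.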
"The sad fact is that suicide is one of the leading causes of death in this country and is the second-leading cause of death among young people," says Dr. Smoller. What's more, he notes, instances of suicide are on the rise, yet health care providers in his shoes don't currently have an effective way to forecast risk.
"Clinicians essentially do no better than chance at making accurate predictions," he says. "This is a tremendous opportunity, we think, to use big data for real-world benefit."
Of the 30,000 variables involved in the algorithm, many of those that emerged in its initial test as significant predictors of risk are factors clinicians could feasibly have identified on their own, including:
- Mental health conditions
- Substance use issues
- The use of psychiatric medications
But surprising risk markers surfaced as well, including certain kinds of infections and a history of specific orthopedic fractures.
"Those are the things that are interesting," Dr. Smoller says, "because no human being could process all of that information simultaneously in a clinical encounter."
He further explains: "With big data and machine learning, you have the ability to potentially find patterns or predictive profiles that may incorporate indicators of risk you would never have thought of."
He cautions, though, that while factors may emerge as important, that doesn't mean they're causally related to suicide.
Thus, the next steps for Dr. Smoller's team include grappling with how, as developers of the algorithm and forthcoming tool, to effectively communicate risk to a clinician while also making clear that it doesn't offer a perfect prediction and that false positives are likely.
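One way to see why false positives are unavoidable is simple arithmetic: suicide attempts are rare, so even at 90% specificity most alerts will be raised for patients who never go on to attempt. In the back-of-the-envelope calculation below, the 1% prevalence figure is an assumption for illustration only; the sensitivity and specificity echo the figures reported above.

```python
# Back-of-the-envelope positive predictive value under assumed numbers.
prevalence = 0.01    # assumed for illustration: 1% of screened patients later attempt
sensitivity = 0.45   # ~45% of attempts flagged (figure reported above)
specificity = 0.90   # 90% of non-cases correctly not flagged (reported above)

flagged_and_at_risk = prevalence * sensitivity               # fraction of all patients
flagged_not_at_risk = (1 - prevalence) * (1 - specificity)   # fraction of all patients
ppv = flagged_and_at_risk / (flagged_and_at_risk + flagged_not_at_risk)
print(f"Share of alerts that are true positives: {ppv:.1%}")  # roughly 4% under these assumptions
```

Under those assumed numbers, the large majority of alerts would be false positives, which is why the tool is framed as a prompt for further screening rather than a verdict.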
"It's really meant to inform rather than replace clinical judgment," Dr. Smoller says.
He and his team recently began prototyping the app that will turn their algorithm into a clinical tool, but Dr. Smoller acknowledges it's not ready for prime time. They're continuing to refine and validate the predictive algorithm through performance tests in other health care systems across the country. Preliminary data indicate it performs as well in those systems as it did at Partners. Eventually, the tool will go through a formal clinical trial so that Dr. Smoller can confidently say whether the information it provides will make a positive difference for medical practitioners.
These verification steps may take several years, but he's willing to wait.
"The need for and the clinical importance of this is something that I face regularly in trying to care for patients," Dr. Smoller says. "Until this opportunity of big data came along, the idea of a real clinical decision support tool wasn't something I even anticipated because it wasn't feasible."
Learn more about the Department of Psychiatry