
Why Mass General Neurologists Are Animated About Big Data

In This Article

  • At the Massachusetts General Hospital Clinical Data Animation Center (CDAC), neurologists are using big data to better understand neurologic illness
  • The CDAC is using big data to train machine learning models that help make diagnostic inferences
  • Neurologists will work together with computers, artificial intelligence and machine learning to make better-informed decisions about patient care

At the Massachusetts General Hospital Clinical Data Animation Center (CDAC), they aren't busy sketching cartoons and making movies. Rather, they're bringing big data to life for the benefit of patients.

Prior to co-founding CDAC in 2016 with their colleagues Matt Bianchi, MD, PhD, and Eric Rosenthal, MD, neurologists Sydney Cash, MD, PhD, and Brandon Westover, MD, PhD, were used to analyzing very specific kinds of data: information from specialized microelectrode recordings (Dr. Cash) and multiple day-long EEGs in patients in the ICU (Dr. Westover).

It's their individual interests in better understanding and diagnosing epilepsy and caring for the brains of patients with neurologic illness that brought them together. Dr. Westover sought Dr. Cash as a mentor, and they began working with much larger data sets than they had previously.

In helping each other with their research, the two quickly realized there was a wealth of information beyond neurological recordings that could be useful in their work, from temperature measurements to blood pressure readings to ventilator settings. What's more, they realized this information could have broad-reaching impact for Mass General and its patients, yet no one was really combining and analyzing data from such various streams. So, with the support of the Neurology Chief, Merit Cudkowicz, MD, MSc, they set out to do something, well, big.

Animating Better Care

"The real power, we feel, is in being able to combine multiple modes of information to get at questions of real significance for our patients and for how the hospital runs," says Dr. Cash, "how we can deliver care better and more effectively."

To find answers to those questions, CDAC's team not only integrates but also animates big data, meaningfully organizing and displaying it, Dr. Cash explains, so that it actually has impact. In their hands, the data becomes more than just numbers on a screen. Statistics are transformed into dynamic graphs and charts with vivid colors, symbols and illustrations.

"It's not just a collection of data sitting in a computer," Dr. Cash expounds. "It does something useful. It lets you predict who will have trouble and how you can intervene."

For example, CDAC is working with Erica Shenoy, MD, PhD, in Mass General's Infectious Diseases Division to better understand why patients come down with hospital-acquired infections—chiefly pneumonia. They've already developed ways to capture data from all monitors in the ICU room and to archive the sizable amounts of streaming, continuous content. Soon, they will send regular automated reports that show when and why patients develop pneumonia, so trends can be spotted to ultimately help lessen the number of pneumonia cases. Until now, ventilator data was displayed in a 10-second window at the bedside and then vanished.

They're also collaborating with Taylor Kimberly, MD, PhD, associate director of the Neurosciences Intensive Care Unit, to track when patients are ready to leave neurocritical care and when they're actually moved. They've presented the data—as a sophisticated scatter plot—to Dr. Kimberly for root-cause analysis. The goal was to help him pinpoint the hours of the day and the problems most likely to cause delays, and then improve transfer times, so that patients who need critical attention have a bed and patients leaving the ICU can get the different kinds of care they require sooner.

On the Cutting Edge: Where Big Data Meets Artificial Intelligence

The place where big data really takes on a life of its own, Dr. Westover feels, is at its intersection with machine learning, more specifically deep learning. Machine learning is just what it sounds like: the ability of computers to learn from data without being explicitly programmed. Deep learning, which relies on many-layered neural networks, brings machine learning closer to artificial intelligence (AI).

In other words, "Not only do you have all this data," Dr. Westover shares, "but you train machines to make smart inferences."

For Dr. Westover, involvement in AI territory began with a career quest to make computers able to read EEGs and, potentially, accurately diagnose epilepsy. It's this aim that led him to Dr. Cash and big data in the first place.

"To try to teach computers how to do the things that doctors do when they're interpreting brainwaves, we realized that we really needed large data sets," he notes. "The impressive thing that experts can do is they can sift through all the noise to detect patterns that are the same across patients, even though the background and the details are very different. To teach a computer to do that, you need what a human gets during years of medical training: lots and lots and lots of examples to learn from."

He went out and got those examples. CDAC has 300-plus terabytes of data, with more coming in nearly every second, including tens of thousands of clinical EEGs performed at Mass General over the last decade.

Thanks to those recordings, Dr. Westover has, in fact, been able to teach computers to effectively and intelligently read many aspects of EEGs. He and the team at CDAC developed an algorithm to distinguish abnormalities—some very subtle—that occur in patients with epilepsy. Using it, a computer can now recognize epileptic spikes about as well as humans can. And thanks to 10,000-plus EEG recordings and other sensor data from the Mass General Sleep Laboratory, where Dr. Westover serves as director of sleep informatics, computers can now analyze sleep stages, breathing problems (apneas), and limb movements as indicators of neurologic disease nearly as accurately as a human expert.
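
To make the idea concrete, here is a minimal sketch of spike detection framed as supervised classification: short EEG windows are labeled spike or non-spike, a handful of features are computed per window, and a classifier learns to separate the two. The synthetic data, the feature choices and the model are illustrative assumptions, not CDAC's actual algorithm.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    fs = 200                        # assumed sampling rate (Hz)
    n_windows, win_len = 2000, fs   # one-second EEG windows

    X_raw = rng.normal(0, 10, size=(n_windows, win_len))  # synthetic background EEG
    y = rng.integers(0, 2, size=n_windows)                # label 1: window contains a spike

    # Inject a sharp transient into the "spike" windows
    for i in np.where(y == 1)[0]:
        center = rng.integers(20, win_len - 20)
        X_raw[i, center - 5:center + 5] += np.hanning(10) * 80

    def features(window):
        """A few simple per-window features a spike detector might use."""
        diff = np.diff(window)
        kurt = ((window - window.mean()) ** 4).mean() / (window.var() ** 2 + 1e-9)
        return [window.std(), np.abs(diff).max(), np.ptp(window), kurt]

    X = np.array([features(w) for w in X_raw])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("Held-out AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))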

Dr. Westover's group used AI-based techniques to train a computer to reproduce the work that sleep technicians and physicians do in a laboratory when they analyze a patient's data to make a clinical diagnosis. He's excited by what this algorithm might mean for labs and patients.
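
One way to picture that kind of automated scoring: divide the night into 30-second epochs, compute spectral band-power features for each epoch, and train a classifier against the technician's stage labels. The sketch below follows that pattern on synthetic data; the signal model, features and classifier are assumptions for illustration, not the CDAC system.

    import numpy as np
    from scipy.signal import welch
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    fs, epoch_sec, n_epochs = 100, 30, 600
    stages = ["W", "N1", "N2", "N3", "REM"]          # class indices 0-4
    y = rng.integers(0, len(stages), size=n_epochs)  # pretend technician labels

    def band_power(freqs, psd, lo, hi):
        """Total spectral power between lo and hi Hz."""
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].sum()

    X = []
    for label in y:
        # Synthetic epoch: toy assumption that "N3" has delta and "W" has alpha
        t = np.arange(fs * epoch_sec) / fs
        sig = rng.normal(0, 1, fs * epoch_sec)
        sig += (label == 3) * 3 * np.sin(2 * np.pi * 1.0 * t)   # delta in "N3"
        sig += (label == 0) * 2 * np.sin(2 * np.pi * 10.0 * t)  # alpha in "W"
        freqs, psd = welch(sig, fs=fs, nperseg=fs * 4)
        X.append([band_power(freqs, psd, lo, hi)
                  for lo, hi in [(0.5, 4), (4, 8), (8, 12), (12, 16), (16, 30)]])
    X = np.array(X)

    clf = GradientBoostingClassifier(random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=3).mean())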

"The time and manpower savings for labs is potentially pretty substantial," he shares, noting that it typically takes a technician two hours to review and annotate a sleep study and then a physician roughly 30-60 minutes to follow-up. "That can all be reduced to five or 10 minutes total, the whole three hours. So, labs should be able to serve more patients."

Now that the computer knows what it's doing, he's working out the human part of the equation. He and Kosta Stojanovic, MD, a current Mass General fellow in healthcare innovation and CDAC researcher, are launching a software pilot at Mass General. The pilot is assessing how best to animate the data (the right colors and layout) so that the information the algorithm provides is easy for practitioners to understand and use. Ultimately, the software will generate a clear final report much like the ones doctors write now.
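
As a rough illustration of what "animating" scored sleep data can mean, the sketch below renders a night of stage labels as a color-coded hypnogram. The stage sequence, colors and layout here are placeholders, not the pilot software's design.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    stages = ["W", "REM", "N1", "N2", "N3"]          # plotted top to bottom
    night = rng.choice(len(stages), size=960, p=[0.1, 0.2, 0.1, 0.45, 0.15])
    hours = np.arange(night.size) * 30 / 3600        # 30-second epochs to hours

    fig, ax = plt.subplots(figsize=(10, 3))
    ax.step(hours, night, where="post", color="steelblue", linewidth=1)
    ax.scatter(hours[night == 1], night[night == 1], s=4, color="crimson",
               label="REM", zorder=3)                # highlight REM epochs
    ax.set_yticks(range(len(stages)))
    ax.set_yticklabels(stages)
    ax.invert_yaxis()                                # wake at top, deep sleep at bottom
    ax.set_xlabel("Hours since lights out")
    ax.set_title("Hypnogram (illustrative)")
    ax.legend(loc="upper right")
    plt.tight_layout()
    plt.show()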

Artificial Intelligence and the Brain Age Index

When it comes to sleep EEG data in particular, Dr. Westover and his team at CDAC aren't stopping at the success of this algorithm and their future plans for its clinical implementation.

After developing that algorithm, they wondered whether they could predict how old someone is solely from an overnight recording of their brain activity. Knowing that EEGs recorded at different stages of a person's life show a systematic progression of brain changes, Haoqi Sun, PhD, a CDAC research fellow, mined the sleep study data again—the largest clinical sleep data set anywhere—in an attempt to work backwards.

Sure enough, they found that a computer can predict a patient's age through brain activity with an 85% correlation to their true age. But that didn't turn out to be their most interesting discovery. The computer's errors did.

"We never get it exactly right," Dr. Sun explains. "Sometimes, someone is 40 but the computer says their brain looks like someone's who is 50. Or, the computer thinks their brain looks like that of someone who is 30."

He continues, "It occurred to us that if your brain looks like that of a 50-year-old and you're only 40, this might be something to worry about." Dr. Westover and his team termed the difference between how old one's brain looks to the computer and one's actual chronologic age the "brain age index" (BAI), concluding, "The BAI seems to be a measure of brain health."

In fact, patients who have a low or negative BAI—whose brains look younger than they actually are—have a better life expectancy than those whose brains look older than they really are. Using information from the Sleep Heart Health Study, Drs. Westover and Sun, along with Luis De Carvalho Paixao, MD, MSc, a CDAC research fellow, determined that people with BAIs in the lower quartile have a 26% higher chance of living to age 87 than those with BAIs in the upper quartile.

Dr. Westover believes this finding may be especially important for people with chronic diseases like hypertension or diabetes. He notes that it's routine, say, for physicians to monitor a diabetic's eyes to ensure diabetes isn't affecting their ocular blood vessels, which could result in blindness.

"But we've never had a way to measure the disease's effect on the brain," he reflects. "We think the BAI could become a standard part of checking up on your health and getting a more serious warning signal if things are looking worse." It could even help detect early signs of dementia. After all, he says, "The potential treatments that we have or are considering are probably not going to work once it's too late."

Equally, the BAI could be used by those who are healthy and who simply want to know how their brain is aging as they do. "This could become the Fitbit of the brain," Dr. Westover hopes, revealing a person's neurological health through the rhythms of their sleep.

Machines vs. Men

The CDAC team has even used the same principles of deep learning and AI to mitigate an extremely difficult scenario neurologists face: being asked to predict whether patients who've gone into a coma after cardiac arrest, and experienced some level of brain damage or injury, will wake up.

"Until now, it's been pretty subjective," shares Dr. Westover. "Not all neurologists have expertise in this area, and even the ones who do, there's still some judgment involved."

To get enough data to train a computer to make this decision, they formed a consortium, gathering continuous EEG recordings and cardiac arrest patient outcomes from as far away as Belgium and the Netherlands and closer to home, from Mass General as well as Yale, Beth Israel Deaconess Medical Center and Brigham and Women's Hospital. They now have the largest set of this data in the world.

With help from CDAC members MIT graduate student Mohammad Ghassemi, BSc, MPhil, and Mass General Neurocritical Care Fellow Ed Amorim, MD, they created a model that can very accurately forecast, at any given time following cardiac arrest, a probability that a patient will eventually regain consciousness.
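
At its core, a model like this maps features extracted from the continuous EEG at a given time point to a probability of eventually waking up. The sketch below does that with a logistic regression on synthetic features; the feature names, labels and model are assumptions for illustration, not the consortium's published model.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n_patients = 800

    # Toy features one might extract from continuous EEG at, say, 24 hours post-arrest:
    # burst-suppression ratio, amplitude, reactivity score (all hypothetical here)
    X = rng.normal(size=(n_patients, 3))
    logits = 1.5 * X[:, 2] - 1.0 * X[:, 0] + 0.5 * X[:, 1]
    awakened = (logits + rng.normal(scale=1.0, size=n_patients)) > 0  # True: regained consciousness

    X_tr, X_te, y_tr, y_te = train_test_split(X, awakened, test_size=0.25, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)

    p_awaken = model.predict_proba(X_te[:1])[0, 1]
    print(f"Estimated probability of regaining consciousness: {p_awaken:.2f}")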

Drs. Westover and Cash stress they aren't ready to resign and let computers do their jobs. But they are ready to see providers put their deep-learning findings into practice. Once CDAC's cardiac arrest research is published, likely within the next year, cardiologists can begin asking machines for predictions. And they hope that, in the near future, hospitals smaller than Mass General that have only one epilepsy specialist, or none at all, can begin to utilize their epilepsy algorithm to diagnose seizures and sidestep the need for a neurologist to spend 30-60 minutes reviewing an EEG recording.

They also want to see the same for their work with sleep data, though they acknowledge that much needs to be done for these discoveries to make their way into labs and clinics. Chiefly, it needs to be easy for a person to administer their own sleep study at home by placing just one or two sensors on their head before drifting off in their own bed. The technology already exists. But for that scenario to become possible, their algorithm and BAI research would need to be commercialized: a company that already makes sleep-measurement equipment could integrate the AI into its software, or a new company focused on AI and sleep equipment could form.

Intriguingly, there are more CDAC projects in the pipeline, and they welcome potential collaborations. "We want to make our capabilities available to as many people as possible," says Dr. Cash.

He promises, though, that they won't get too big with their big data, in order to ensure top-quality information and continued successful outcomes for hospitals and patients alike.

Learn more about the Department of Neurology at Mass General

Learn more about Neurology Research at Mass General
