Heart disease kills more people in the U.S. than any other cause, and researchers within the UPMC health enterprise are using machine learning to broaden and speed up electrocardiogram (ECG) readings, helping identify heart attack patients that health care providers might otherwise miss.

The developmental algorithm outperformed current gold standards for detecting heart attacks, according to a study published in the peer-reviewed journal Nature Medicine.

Work is underway to incorporate the algorithm into a digital dashboard for ambulance providers and emergency department staff to more quickly triage patients suffering the most severe heart attacks, those caused by total blockage of coronary arteries, the type most difficult to detect from an ECG and most urgently in need of cardiac catheterization.

“We’re doing things with this technology that human beings cannot do,” said Dr. Christian Martin-Gill, UPMC’s Chief of the Division of Emergency Medical Services (EMS).

The machine-learning tool uses advanced math, computation and computer engineering to identify more than 700 different features on an ECG, he said.

“When a human is looking at an (ECG) tracing, we’re probably looking at several dozen different things we may recognize. … A computer is able to look at hundreds and it’s able to do that within minutes,” Martin-Gill said.

Artificial intelligence (AI) and its subsets such as machine learning are changing how first responders do their jobs, whether through robotics, speech recognition or advanced algorithms.

Firefighters in California partnered with the University of California San Diego’s ALERTCalifornia program, using artificial intelligence to scan a network of 1,032 cameras to detect fires and other abnormalities and notify first responders for further investigation.

A video analytics system powered by artificial intelligence is now in use at the Michigan State Capitol to detect whether someone is smuggling a gun into the building, with images shared instantly to trained staff to determine if there’s a legitimate threat. The software can share detailed alerts, including imagery and suspect location with internal and external law enforcement in as little as three seconds.

Facial recognition technology is increasingly used within law enforcement, a development raising concerns among civil liberty and privacy advocates.

The U.S. Government Accountability Office released a report last year finding that in 2020, 18 of 24 federal agencies with law enforcement officers used facial recognition technology, largely for building surveillance and computer access.

In a separate survey, 14 of 42 such agencies said the technology was being used in criminal investigations.

On GAO’s recommendation, three of the agencies have since implemented tracking systems.

Ten others were conducting research on facial recognition technology, including the Department of Justice, which was conducting applied research on the relationship between skin tone and false match rates.

‘Call after call’

“911 is now a tech industry,” said Anthony Mignogna, Chief of Communications at Delaware County Emergency Services in southeast Pennsylvania.

Artificial intelligence is incorporated into the Delaware County 911 Center, helping dispatchers handle about 800,000 emergency calls placed to the center.

The center uses Prepared AI to transcribe the calls, having rolled out the technology in mid-October. It serves to verify what a caller is saying, Mignogna said, allowing dispatchers to marry what they hear with what they’re reading.

“We want to know when someone says ‘gun,’ ‘shot,’ ‘not breathing,’ ‘car.’ It flags that,” said Mignogna, who sits on the Prepared company’s inaugural customer advisory board along with other first responders. “It helps us expedite our call processing time.”

It also helps, he said, when the center is short-staffed and experiencing heavy call volume.

More than half of 911 centers in the U.S. are facing staffing emergencies, according to a study released in February by the International Academies of Emergency Dispatch and the National Association of State 911 Administrators. Programs like the one used in Delaware County are becoming increasingly common.

As explained by the National Urban Security Technology Laboratory of the Department of Homeland Security, AI technology picks up on direct conversations and background noise. It compares call information to thousands of past data points, according to the federal agency, and can suggest relevant questions for the dispatcher to improve call efficiency and emergency response.

Mignogna said a year-end analysis will determine just how well AI is functioning in the Delaware County 911 Center. He’s already looking toward broadening the use of the technology including foreign language translation, imaging, video and GPS.

“We’re looking for tools like this to help and take some stress off (dispatchers). It’s call after call after call for 12 hours. It makes life a little easier,” Mignogna said.

Helpful tech, but be cautious

Professor Soundar Kumara, director of Penn State University’s Center for Applications of Artificial Intelligence and Machine Learning to Industry, said Generative AI is becoming more commonplace within emergency response.

“Computers are not good at understanding context,” Kumara said. “To understand context, Generative AI can help.”

Generative AI can analyze information about past incidents alongside incoming data about a current emergency to quickly determine where specific resources would best be dispatched, he said, adding that it would be particularly helpful to boost emergency response as volunteerism in fire and emergency medical services continues to shrink.

“A huge problem is listening to people, translating, identifying what are the important variables that would increase the need for intervention immediately so you can do better resource allocation leading to better outcomes,” Kumara said.

The use of predictive AI in law enforcement presents ethical challenges, Kumara said. He likened it to the “pre-crime” police program in the Tom Cruise movie “Minority Report.”

The technology can build on historical data to determine geographic areas that tend to be more violent than others, he said, perhaps leading to increased police patrols. But such technology can also lead to dangerous stereotypes and unintended consequences when data is limited to certain ethnic groups or races, Kumara warned.

“In order to avoid this, you need to sample the population properly. When you do that you have better understanding of the nuances of how these patterns among people are changing,” Kumara said.

“For all this, you need a lot of data from the past. When you don’t have the data your guess is as good as an algorithm’s guess.” 

Eric Scicchitano is the CNHI Pennsylvania statehouse reporter. Follow him on Twitter @ericshick11.
