Zeyun Yu (left) and Sandeep Gopalakrishnan

Health professionals treating non-healing wounds rely heavily on comparing photos taken at each patient visit to track the healing trajectory. But 2D pictures offer limited information about the wound, and care is interrupted when patients skip or cannot make regular appointments.

“When I ask colleagues who work with these patients, ‘What are the tools that you use to characterize the wound?’ they say they use manual methods to collect wound data using a ruler and Q-tip,” said Sandeep Gopalakrishnan, an assistant professor in the UWM College of Nursing.

Gopalakrishnan and Zeyun Yu in the College of Engineering & Applied Science teamed up to develop a smartphone-based digital platform that could help clinicians improve treatment and accelerate healing.

First, they created an app that patients and caregivers can use to capture wound photos themselves at home. Algorithms using artificial intelligence (AI) could then process the images and provide clinicians with accurate information on the healing characteristics.

I-Corps program helps

To determine whether to commercialize their app, Gopalakrishnan and Yu participated in the UWM-administered I-Corps Program, which teaches academic researchers how to turn discoveries in the lab into products and startups. Through I-Corps, which is supported by the National Science Foundation, the two interviewed scores of potential users of the technology to zero in on clinicians’ top needs.

In the process, they met Milwaukee physician Jeffrey Niezgoda, a recognized wound care expert at AZH Wound and Vascular Centers. Niezgoda suggested ways to expand their initial business idea.

Photos are a rich source of data. With enough of them, machine-learning algorithms running in the cloud could help health care providers precisely monitor a wound’s status.

“Not all clinicians identify the types of wounds correctly to begin with, and the kind of interventions that are needed are all very different,” said Yu, a professor of computer science and biomedical engineering. “So, our platform ideally will be able to process images to classify and segment wounds, while at the same time, provide a tool that gives clinicians predictability of the healing potential.”

Since completing the I-Corps training last year, the three have formed a startup company, MegaPerceptron, to take the system to market.

Training the program

Yu used a sizable, varied set of wound images from Niezgoda’s practice to “train” the AI program, which supports the prediction and analysis functions. Machine-learning algorithms are those that can detect patterns and improve their predictive performance with increasing amounts of data.

“For the wound-classification problem, we have used thousands of wound images,” Yu said. “With this amount of data to train the AI algorithm, we are able to classify the wound types with more than 90% accuracy.”
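The article does not describe the team’s actual model, but the idea of training a classifier on labeled images and then measuring its accuracy on held-out examples can be sketched in miniature. The snippet below is illustrative only: it uses a toy nearest-centroid classifier on made-up two-dimensional feature vectors (standing in for features extracted from wound photos), with hypothetical wound-type labels.

```python
# Illustrative only: a tiny nearest-centroid classifier on made-up
# feature vectors, standing in for the team's (unspecified) AI model.

def train_centroids(samples):
    """Average the feature vectors for each wound-type label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, features):
    """Predict the label whose centroid is nearest (squared distance)."""
    return min(centroids,
               key=lambda lb: sum((a - b) ** 2
                                  for a, b in zip(centroids[lb], features)))

def accuracy(centroids, samples):
    """Fraction of labeled samples the classifier predicts correctly."""
    hits = sum(classify(centroids, feats) == lb for feats, lb in samples)
    return hits / len(samples)

# Made-up 2-D features (e.g., redness, depth) for three wound types.
train = [([0.9, 0.1], "venous"), ([0.8, 0.2], "venous"),
         ([0.2, 0.9], "pressure"), ([0.3, 0.8], "pressure"),
         ([0.5, 0.5], "diabetic"), ([0.6, 0.4], "diabetic")]
test = [([0.85, 0.15], "venous"), ([0.25, 0.85], "pressure")]

model = train_centroids(train)
print(accuracy(model, test))  # → 1.0
```

The real system presumably uses deep networks on image pixels rather than hand-picked features, but the workflow is the same: fit on labeled examples, then report accuracy on images the model has not seen.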

Contributing even simple information makes a difference in accuracy, Gopalakrishnan said. “There are kinds of wounds that are found only on certain parts of the body,” he said. “Once you’ve diagnosed the type, then that drives the algorithms. That makes our platform robust.”
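The idea that body location can drive the algorithms can be sketched as a simple prior: wound types that cannot occur at the photographed site are ruled out before the model’s scores are compared. Everything here is hypothetical, including the site-to-type mapping and the score values; it only illustrates the kind of constraint Gopalakrishnan describes.

```python
# Illustrative only: combining a hypothetical model's wound-type scores
# with simple body-location knowledge. Wound types that cannot occur at
# the photographed site are removed before choosing the prediction.

# Hypothetical mapping of body sites to plausible wound types.
PLAUSIBLE = {
    "heel": {"pressure", "diabetic"},
    "lower_leg": {"venous", "arterial", "diabetic"},
    "sacrum": {"pressure"},
}

def predict_with_location(scores, body_site):
    """Drop implausible types for the site, then take the best score."""
    allowed = PLAUSIBLE.get(body_site, set(scores))
    filtered = {t: s for t, s in scores.items() if t in allowed}
    return max(filtered, key=filtered.get)

# A raw classifier might narrowly favor "venous", but on the sacrum
# only pressure injuries are plausible, so the location prior wins.
scores = {"venous": 0.40, "pressure": 0.35, "arterial": 0.25}
print(predict_with_location(scores, "sacrum"))     # → pressure
print(predict_with_location(scores, "lower_leg"))  # → venous
```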

Integrating AI and wound care is still a fledgling field, the researchers said, but since most people today have smartphones that take high-quality photos, the timing for this product is right.