“Wound care is one of today’s most expensive and overlooked threats to patients and our overall healthcare system,” said Robert Fraser of Western University and Swift Medical Inc, corresponding author of the study published in Frontiers in Medicine. “Clinicians need better tools and data to best serve their patients who are unnecessarily suffering.”
Shedding light on injuries
The scientists developed a device called the Swift Ray 1, which can be attached to a smartphone and connected to the Swift Skin and Wound software. Together, these can capture medical-grade photographs, infrared thermography images (which measure body heat), and bacterial fluorescence images (which reveal bacteria under violet light).
No single one of these imaging methods is enough to identify infection on its own. Clinical inspection has low accuracy, as does thermography, which measures heat changes caused by inflammation and infection. Bacterial fluorescence can only examine the surface of a wound, which is naturally contaminated with bacteria, so additional methods are needed to distinguish contamination from true infection.
“Research has demonstrated bacterial imaging helps guide clinicians’ work to remove nonviable tissue, yet it cannot identify infection by itself,” explained Dr Jose Ramirez-GarciaLuna of McGill University Health Centre, first author of the study. “Thermography provides insight into the inflammatory and circulatory changes happening under the skin.”
The scientists sought to combine these modalities into a method that wouldn't require multiple expensive devices, would overcome the weaknesses of each imaging technique, and could provide an objective measure of wound healing.
To test their device, they recruited 66 patients with wounds. The wounds showed no signs of spreading infection, contained no foreign bodies, and had not previously been treated with antibiotics or growth factors. The patients' wounds were uncovered, cleaned, and dried before imaging, then cared for as usual afterwards.
A picture of health
The images were reviewed by a researcher who wasn’t present for the wound care process. Four patterns were identified.
Wounds that were no warmer than healthy skin and showed no bacterial fluorescence were considered 'non-inflamed', while wounds that were slightly warmer than healthy skin and showed no or slight bacterial fluorescence were considered 'inflamed'. The last two patterns, wounds that were substantially warmer with or without bacterial fluorescence, were both designated 'infected', because all the clinicians who had examined these wounds had judged them infected.
Out of the 66 wounds, 20 were considered non-inflamed, 26 were inflamed, and 20 were infected.
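To make the categorization concrete, the decision rule described above can be written as a short function. The sketch below is purely illustrative: the temperature thresholds, input fields, and 'indeterminate' fallback are hypothetical assumptions, not values or rules reported in the study.

```python
# Illustrative only: thresholds and field names are hypothetical,
# not values reported in the study.
def classify_wound(delta_temp_c: float, fluorescence: str) -> str:
    """Classify a wound from its temperature difference vs. healthy skin
    (delta_temp_c, in degrees Celsius) and its bacterial fluorescence
    level ('none', 'slight', or 'strong')."""
    SLIGHT_WARMTH = 1.0       # hypothetical cutoff for 'slightly warmer'
    SUBSTANTIAL_WARMTH = 2.0  # hypothetical cutoff for 'substantially warmer'

    if delta_temp_c >= SUBSTANTIAL_WARMTH:
        # Substantially warmer wounds were labeled infected,
        # with or without bacterial fluorescence.
        return "infected"
    if delta_temp_c >= SLIGHT_WARMTH and fluorescence in ("none", "slight"):
        return "inflamed"
    if delta_temp_c < SLIGHT_WARMTH and fluorescence == "none":
        return "non-inflamed"
    # Combinations outside the three patterns would need clinical review
    # rather than an automatic label.
    return "indeterminate"
```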
The researchers performed principal component analysis and used a k-nearest neighbor clustering algorithm to see whether a machine learning model could accurately identify these different categories of wound. The model distinguished all three categories with an overall accuracy of 74%. When differentiating infected from non-infected wounds, it correctly identified 100% of infected wounds and 91% of non-infected wounds.
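The kind of pipeline described here, principal component analysis followed by a k-nearest neighbor classifier, can be sketched with scikit-learn. The feature columns, number of components, neighbor count, and placeholder data below are assumptions for illustration, not the study's actual code or dataset.

```python
# Illustrative sketch: reduce imaging-derived features with PCA, then
# classify with k-nearest neighbors. All settings here are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per wound, with hypothetical features such as temperature
# difference vs. healthy skin and a fluorescence intensity score.
# y: labels 0 = non-inflamed, 1 = inflamed, 2 = infected.
rng = np.random.default_rng(0)
X = rng.normal(size=(66, 6))        # placeholder: 66 wounds, 6 features
y = rng.integers(0, 3, size=66)     # placeholder labels

model = make_pipeline(
    StandardScaler(),                     # put features on a common scale
    PCA(n_components=2),                  # project onto principal components
    KNeighborsClassifier(n_neighbors=5),  # classify by nearest neighbors
)

# Cross-validated accuracy on the placeholder data; with the real imaging
# features, this is the kind of overall accuracy the study reports (74%).
scores = cross_val_score(model, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```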
A new tool in the box
The researchers pointed out that the images should always be considered in their medical context. For instance, a wound that is cool enough to be categorized as non-inflamed could have a limited blood supply, compromising healing. But because the Swift Ray 1, paired with the Swift Skin and Wound software, lets doctors draw on multiple imaging modalities to identify infection, it expands the tools available to them without requiring several expensive devices. In the future it could make it possible to secure a rapid, accurate diagnosis for every wounded patient and enable more effective telemedicine assessments.
“This was a pilot study and follow-up studies are planned,” cautioned Fraser. “In the future, studies of patient populations with more wound types are required to validate the approach across populations.”