Teaching Machines to Read Color: How Researchers are Advancing the Future of Digital Diagnostics
Park and Kim are leading a research effort that rethinks how machines understand color in biomedical images. While computer vision has made leaps in analyzing shapes and patterns, it has long overlooked a surprisingly critical element: color.
“In digital diagnostics, subtle changes in color can distinguish between normal and abnormal,” Park and Kim explain in their latest work. However, traditional tools for capturing and interpreting color, such as the 50-year-old Macbeth ColorChecker, often fall short in real-world digital health settings, where variations in lighting and device characteristics can degrade accuracy.
To solve this, Kim’s lab developed HemaChrome, a specialized color reference chart tailored to capture hemoglobin-related shades in biological tissue. Combined with a machine learning approach called one-shot learning, the platform can recover accurate color information from photos taken under vastly different conditions, whether indoors, outdoors, or on different smartphone models.
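To see why a reference chart helps, consider the classical baseline it improves upon: fitting a simple color-correction transform from the chart patches visible in a photo. The sketch below is illustrative only; the patch values are made up, and the function names are not from the team's work.

```python
import numpy as np

def fit_color_correction(observed, reference):
    """Fit an affine color-correction transform by least squares.

    observed:  (N, 3) RGB values of chart patches as captured by the camera
    reference: (N, 3) known ground-truth RGB values of the same patches
    """
    # Augment with a constant term so the fit can absorb brightness offsets.
    X = np.hstack([observed, np.ones((observed.shape[0], 1))])
    # Solve X @ M ~ reference for the (4, 3) transform M.
    M, *_ = np.linalg.lstsq(X, reference, rcond=None)
    return M

def apply_correction(image, M):
    """Apply the fitted transform to every pixel of an (H, W, 3) image."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3)
    X = np.hstack([pixels, np.ones((pixels.shape[0], 1))])
    return np.clip(X @ M, 0.0, 1.0).reshape(h, w, 3)

# Illustrative usage with made-up patch values (normalized RGB in [0, 1]):
observed = np.array([[0.42, 0.30, 0.28], [0.60, 0.52, 0.50], [0.25, 0.18, 0.17]])
reference = np.array([[0.45, 0.26, 0.24], [0.68, 0.55, 0.52], [0.22, 0.12, 0.11]])
M = fit_color_correction(observed, reference)
```

A single affine transform like this is exactly what struggles when lighting and camera processing vary nonlinearly, which is the gap the team's approach targets.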
But the real innovation lies in how the system learns. Unlike conventional AI models that rely on massive datasets, this neural network is trained on each individual photo, using the HemaChrome chart included in the frame as its reference. This one-shot learning approach gives the system both flexibility and precision, making it well suited to mobile health and telemedicine.
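A rough sketch of what such a per-photo training loop could look like follows, assuming a small neural network that learns to map the observed chart-patch colors to their known reference values and is then applied pixel-wise to the whole photo. The architecture, optimizer, and hyperparameters here are assumptions for illustration, not the authors' published design.

```python
import torch
import torch.nn as nn

def one_shot_correct(image, chart_observed, chart_reference, steps=500):
    """Train a tiny per-photo color mapper on the in-frame chart patches.

    image:           (H, W, 3) float tensor, values in [0, 1]
    chart_observed:  (N, 3) patch colors as captured in this photo
    chart_reference: (N, 3) known true colors of the same patches
    """
    # A small MLP can capture nonlinear lighting/device distortions
    # that a single linear correction matrix cannot.
    net = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 3))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(chart_observed), chart_reference)
        loss.backward()
        opt.step()
    # Apply the learned mapping to every pixel of the photo.
    with torch.no_grad():
        corrected = net(image.reshape(-1, 3)).reshape(image.shape)
    return corrected.clamp(0.0, 1.0)
```

Because the network is retrained for every photo, there is no fixed training set to go stale: each image carries its own ground truth in the form of the chart.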
To demonstrate its versatility, the team tested the platform across three distinct challenges. In one study, they tracked inflammation in mice exposed to UVB radiation to simulate skin cancer. In another, they analyzed smartphone photos of patients’ inner eyelids to estimate blood hemoglobin levels, offering a noninvasive option for people with conditions like sickle cell disease. They also validated the platform’s ability to perform machine-readable hemoglobin quantification, using WHO-standard Tallquist scale samples as a self-test.
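The Tallquist scale works by matching a blood sample's color against a row of reference shades, each tied to a hemoglobin level. Once colors are corrected, making that match machine-readable can reduce to a nearest-shade lookup, as in this hypothetical sketch; the shade values and levels below are placeholders, not the actual WHO scale.

```python
import numpy as np

# Hypothetical reference shades (normalized RGB) paired with hemoglobin
# levels in g/dL; real values would come from a calibrated Tallquist chart.
TALLQUIST = [
    (np.array([0.85, 0.55, 0.55]), 6.0),
    (np.array([0.75, 0.40, 0.40]), 8.0),
    (np.array([0.65, 0.28, 0.28]), 10.0),
    (np.array([0.55, 0.18, 0.18]), 12.0),
    (np.array([0.45, 0.10, 0.10]), 14.0),
]

def estimate_hemoglobin(sample_rgb):
    """Return the hemoglobin level of the nearest reference shade."""
    distances = [np.linalg.norm(sample_rgb - shade) for shade, _ in TALLQUIST]
    return TALLQUIST[int(np.argmin(distances))][1]

print(estimate_hemoglobin(np.array([0.62, 0.25, 0.30])))  # -> 10.0
```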
Across the board, their approach outperformed traditional color correction techniques, offering more consistent results and greater accuracy regardless of how or where a photo was taken.
In a world that’s moving rapidly toward digital health, it’s not just about seeing better. It’s about helping machines see what really matters.