Researchers from Stanford University have developed an artificial intelligence (AI) algorithm that is as accurate as doctors at identifying skin cancer from images, and they hope eventually to bring it to everyday smartphones.
Once the system was trained, the team tested its ability to classify skin cancer by presenting it with just under 2,000 previously unseen images of skin lesions whose nature had already been determined by biopsy. For almost 400 of those images, they further compared the system's results against the judgement of at least 21 dermatologists.
The research, published in the journal Nature, describes how the algorithm was able to match the performance of 21 board-certified dermatologists in diagnosing skin lesions, including melanoma, the deadliest form of skin cancer.
"I'm certain this is how melanomas are going to be identified in the future", says Richard Weller, a consultant dermatologist at the Royal Infirmary of Edinburgh in the United Kingdom, who was not involved in the work.
Dermatologists inspect skin for signs of cancer, relying on their training and experience.
After priming the algorithm over the course of a week with millions of images of ordinary items - dogs, tables, chairs and so on - the team then fed it a dataset of 129,450 clinical images of verified, biopsy-proven skin lesions.
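This two-stage recipe - pretraining on everyday images, then fine-tuning on labelled medical ones - is known as transfer learning. A minimal sketch of the idea, using a toy frozen feature extractor and synthetic data in place of the team's actual deep network (all names and numbers here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a network pretrained on millions of everyday images: a frozen
# feature extractor whose weights are never updated during fine-tuning.
W_pretrained = 0.1 * rng.normal(size=(16, 8))

def extract_features(x):
    """Frozen 'pretrained' layer: map raw inputs to feature vectors."""
    return np.tanh(x @ W_pretrained)

# Tiny synthetic dataset standing in for the labelled clinical images.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # toy benign(0)/malignant(1) labels

# Fine-tuning: train only a new classification head on the frozen features.
F = extract_features(X)
w, b = np.zeros(8), 0.0
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # predicted malignancy probability
    grad = (p - y) / len(y)                  # logistic-loss gradient
    w -= 0.5 * F.T @ grad
    b -= 0.5 * grad.sum()

accuracy = ((p >= 0.5) == y).mean()
print(f"training accuracy of fine-tuned head: {accuracy:.2f}")
```

Because only the small classification head is trained, far less labelled medical data is needed than training a whole network from scratch would require.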
The research team isn't the only group using neural networks to identify and classify images.
But while less threatening forms of skin cancer, such as basal cell carcinoma and squamous cell carcinoma, are relatively easy to identify visually, melanomas are still hard to pick out this way. Complicating matters, many of the images the researchers gathered from the internet weren't taken in a controlled clinical setting, so they varied in angle, zoom and lighting.
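The article doesn't say how the team handled that variability, but a common approach in image-classification pipelines is to augment the training data with random flips, crops and lighting changes so the model learns to ignore them. A hypothetical sketch of such a step:

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(image):
    """Return one random training-time variant of an image (H, W, 3) in [0, 1].

    Mimics the variation in internet-sourced photos: flips (angle),
    random crops (zoom/framing) and brightness scaling (lighting).
    The specific choices here are illustrative, not the paper's pipeline.
    """
    if rng.random() < 0.5:                   # random horizontal flip
        image = image[:, ::-1, :]
    h, w, _ = image.shape
    ch, cw = int(h * 0.9), int(w * 0.9)      # random 90% crop (zoom/framing)
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    image = image[top:top + ch, left:left + cw, :]
    brightness = rng.uniform(0.8, 1.2)       # lighting jitter
    return np.clip(image * brightness, 0.0, 1.0)

photo = rng.random((64, 64, 3))              # stand-in for a lesion photo
variant = augment(photo)
print(variant.shape)
```

Each call produces a slightly different version of the same lesion, so the classifier sees the underlying pathology under many viewing conditions.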
The images used to train the algorithm - representing over 2,000 different skin diseases - were gathered from the internet and vetted by dermatologists.
"Advances in computer-aided classification of benign versus malignant skin lesions could greatly assist dermatologists in improved diagnosis for challenging lesions and provide better management options for patients", said Susan Swetter, professor of dermatology and co-author of the paper. "There is therefore a possibility that if you rely on people to self-report what they are anxious about, other skin cancers - particularly in hard-to-see sites, such as the back - may be missed", she said. The algorithm's sensitivity can also be adjusted, and this ability to alter the sensitivity hints at its depth and complexity.
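Sensitivity here means the fraction of true cancers the classifier flags, and it trades off against specificity, the fraction of benign lesions it correctly clears. A minimal sketch, using made-up scores and labels, of how moving the decision threshold shifts that trade-off:

```python
def sensitivity_specificity(scores, labels, threshold):
    """Compute sensitivity and specificity at a given decision threshold.

    scores: predicted malignancy probabilities (illustrative values only).
    labels: 1 = malignant (by biopsy), 0 = benign.
    """
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    sensitivity = tp / (tp + fn)   # fraction of cancers caught
    specificity = tn / (tn + fp)   # fraction of benign lesions cleared
    return sensitivity, specificity

# Hypothetical scores for six lesions, with their biopsy labels.
scores = [0.95, 0.80, 0.60, 0.40, 0.20, 0.05]
labels = [1,    1,    0,    1,    0,    0]

# A low threshold favours sensitivity (miss fewer cancers, more false
# alarms); a high threshold favours specificity.
for t in (0.3, 0.7):
    sens, spec = sensitivity_specificity(scores, labels, t)
    print(f"threshold={t}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

For cancer screening one would typically choose a conservative (low) threshold, accepting more false alarms in exchange for missing fewer melanomas.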
Skin cancer diagnosis typically starts with a visual exam, in which a dermatologist examines the lesion in question under a dermatoscope. The Stanford researchers adapted their algorithm to differentiate between images of malignant and benign skin lesions. Although the algorithm was developed on, and currently runs on, a computer, the team believes it could be adapted for smartphones without much trouble, which would put the tool in far more hands.

One caveat: until researchers train the algorithm to work on people with darker skin, by showing it more examples of lesions on darker skin, it will only be useful for a segment of the global population.

Co-author Helen Blau is also the director of the Baxter Laboratory for Stem Cell Biology and a member of Stanford Bio-X, the Stanford Cardiovascular Institute, the Child Health Research Institute and the Stanford Cancer Institute.