HEIDELBERG, Germany — In another iteration of the age-old battle between humans and the machines we create, an artificial intelligence program trained to identify and diagnose skin cancer proved more accurate at the task than dermatologists.

Researchers from the U.S., Germany, and France trained a deep-learning system known as a convolutional neural network (CNN) to detect skin cancer by showing it more than 100,000 images of malignant melanomas and benign moles. They then compared the AI's diagnostic performance with that of 58 international dermatologists. They found that the CNN detected more melanomas and correctly identified more benign lesions than the dermatologists.


Using the same concepts that a human brain uses for vision, the researchers were able to train the CNN to learn from the images it “sees” and teach itself to improve its performance.

“The CNN works like the brain of a child. To train it, we showed the CNN more than 100,000 images of malignant and benign skin cancers and moles and indicated the diagnosis for each image. With each training image, the CNN improved its ability to differentiate between benign and malignant lesions,” explains lead author Holger Haenssle, a professor at the University of Heidelberg, in a release by the European Society for Medical Oncology.
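The training scheme Haenssle describes is standard supervised learning: show the model a labeled example, nudge its parameters toward the correct answer, and repeat at scale. A minimal sketch in plain Python illustrates the idea — here a one-feature logistic model stands in for the CNN's millions of parameters, and the toy data (a single "lesion darkness" score per image) are invented purely for illustration:

```python
import math
import random

# Toy stand-in for supervised training: each "image" is reduced to one
# feature (say, lesion darkness in [0, 1]); label 1 = malignant, 0 = benign.
# These numbers are invented for illustration, not real clinical data.
random.seed(0)
data = [(random.uniform(0.6, 1.0), 1) for _ in range(200)] + \
       [(random.uniform(0.0, 0.4), 0) for _ in range(200)]

w, b, lr = 0.0, 0.0, 0.5  # weight, bias, learning rate

def predict(x):
    # Sigmoid turns the score w*x + b into a probability of malignancy.
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# One update per "training image", moving the parameters toward the label —
# the same correct-after-each-example loop a real CNN follows at scale.
for epoch in range(50):
    random.shuffle(data)
    for x, y in data:
        p = predict(x)
        w -= lr * (p - y) * x   # gradient of log-loss w.r.t. the weight
        b -= lr * (p - y)       # gradient w.r.t. the bias

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(round(accuracy, 2))
```

Because the two toy classes are cleanly separated, the model reaches perfect accuracy here; the point is only the training loop itself, not the difficulty of real dermoscopic images.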

“After finishing the training, we created two test sets of images from the Heidelberg library that had never been used for training and therefore were unknown to the CNN,” adds Haenssle. “One set of 300 images was built to solely test the performance of the CNN. Before doing so, 100 of the most difficult lesions were selected to test real dermatologists in comparison to the results of the CNN.”

The 58 dermatologists came from 17 different countries. Thirty of them had at least five years of experience. In one experiment, the doctors were shown just the image of the skin growth and asked to diagnose it as either a malignant melanoma or benign mole, and to decide what action, if any, should be taken. A second experiment had the doctors examine close-up images of the growths, but this time they were also given information about the patients. Again they were asked to make a diagnosis and a decision on follow-up care.

Even though the dermatologists accurately detected about 87% of malignant melanoma cases on average in the first experiment and 89% in the second, the artificial intelligence proved superior, correctly diagnosing 95% of cases.

“The CNN missed fewer melanomas, meaning it had a higher sensitivity than the dermatologists, and it misdiagnosed fewer benign moles as malignant melanoma, which means it had a higher specificity; this would result in less unnecessary surgery,” says Haenssle.
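The two measures Haenssle cites come straight from the counts in a confusion matrix: sensitivity is the share of true melanomas caught, specificity the share of benign moles correctly cleared. A quick sketch (the counts below are invented for illustration and are not the study's raw data):

```python
def sensitivity(tp, fn):
    """Fraction of actual melanomas the classifier catches (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of benign moles correctly left alone (true negative rate)."""
    return tn / (tn + fp)

# Invented example counts for a hypothetical 100-lesion test set:
# 20 melanomas (19 caught, 1 missed), 80 benign (64 cleared, 16 flagged).
print(sensitivity(tp=19, fn=1))    # melanoma detection rate
print(specificity(tn=64, fp=16))   # benign clearance rate
```

Higher specificity is what translates into fewer unnecessary surgeries: every false positive (`fp`) is a benign lesion flagged for possible excision.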

But the study authors don’t believe doctors should panic about their jobs being taken over by robots or computers any time soon. Instead, they see the technology as complementary to medical experts.

“This CNN may serve physicians involved in skin cancer screening as an aid in their decision whether to biopsy a lesion or not. Most dermatologists already use digital dermoscopy systems to image and store lesions for documentation and follow-up. The CNN can then easily and rapidly evaluate the stored image for an ‘expert opinion’ on the probability of melanoma,” says Haenssle. “We are currently planning prospective studies to assess the real-life impact of the CNN for physicians and patients.”

They also note that AI systems still need further development to better identify cancers in harder-to-image areas such as the fingers, toes, and scalp.

The full study was published in the May 28, 2018 edition of the journal Annals of Oncology.

About Ben Renner

Writer, editor, curator, and social media manager based in Denver, Colorado. View my writing at
