It's official – Google AI gives you cancer ...diagnosis in real time: Neural net can spot breast, prostate tumors

Boffins spill beans on super 'scope machine-learning tech

Google Health's so-called augmented-reality microscope has proven surprisingly accurate at detecting and diagnosing cancerous tumors in real time.

The device is essentially a standard microscope decked out with two extra components: a camera, and a computer running AI software with an Nvidia Titan Xp GPU to accelerate the number crunching. The camera continuously snaps images of body tissue placed under the microscope, and passes them to a convolutional neural network on the computer to analyze. In return, the neural net spits out, in real time allegedly, a heatmap of the cells in the image, labeling areas as benign or abnormal on the screen for doctors to inspect.
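
Google hasn't released the ARM's software, but the capture-and-overlay loop it describes is straightforward to picture. Here's a minimal sketch in Python with TensorFlow, where grab_frame() and the toy one-layer "segmenter" are hypothetical stand-ins for the real camera feed and Google's trained network:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for Google's trained network: one conv layer emitting a
# per-pixel "tumor probability". The real ARM model is far deeper.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, None, 3)),
    tf.keras.layers.Conv2D(1, 3, padding="same", activation="sigmoid"),
])

def grab_frame():
    """Stand-in for the microscope camera feed (an HxWx3 uint8 image)."""
    return np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)

for _ in range(3):  # the real device loops for as long as the scope is in use
    frame = grab_frame()
    x = frame[np.newaxis].astype(np.float32) / 255.0  # batch of one, normalized
    probs = model.predict(x, verbose=0)[0, :, :, 0]   # per-pixel tumor probability
    overlay = probs > 0.5                             # benign vs abnormal mask
    # The ARM projects a map like this into the eyepiece; here it would be
    # handed to whatever is drawing the on-screen heatmap.
```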

Google's eggheads tried using the device to detect the presence of cancer in samples of breast and prostate cells. The algorithms notched up a performance score of 0.92 when detecting breast cancer that had spread to lymph nodes, and 0.93 for prostate cancer, with one being a perfect score, so it’s not too bad for what they describe as a proof of concept.
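
The write-up doesn't spell out the metric, but 0-to-1 scores of this kind are usually area under the ROC curve (AUC), which measures how well a model ranks cancerous samples above benign ones. A minimal illustration with scikit-learn, using made-up labels and scores:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Made-up example data: 1 = cancerous, 0 = benign, plus the model's
# confidence score for each sample. These are not Google's numbers.
labels = np.array([0, 0, 1, 1, 0, 1])
scores = np.array([0.1, 0.7, 0.8, 0.9, 0.3, 0.6])

# AUC is the probability that a randomly chosen cancerous sample outscores
# a randomly chosen benign one; 1.0 means a perfect ranking.
print(roc_auc_score(labels, scores))  # ~0.89
```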

Details of the microscope system have been described in a paper published in Nature Medicine this week. The training data for breast cancer was taken from here, and here for prostate cancer. A portion of that data was held back to test the trained models.

The device is a pretty challenging system to build: it requires a processing pipeline that can handle, on the fly, microscope snaps that are high resolution enough to capture details at the cellular level. The images used in this experiment measure 5,120 × 5,120 pixels. That’s much larger than what today's deep learning algorithms typically handle: networks with millions of parameters already require billions of floating-point operations just to process images as small as 300 by 300 pixels.

To cope with these larger images, the convolutional neural network, which is based on Google's Inception V3 architecture, breaks them up into little patches that are analysed individually; a rough sketch of that tiling step is below. It also takes time to train the technology, with the help of humans, to detect and classify cancerous cells from pictures of varying quality. All of this then has to work in real time during the inference stage for it to be useful to doctors: they'd like to know as soon as possible, not hours or days later.
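
Here's how that tiling might look in Python, carving a field of view into 299×299 patches (Inception V3's native input size) and scoring each one; the grid of per-patch scores is what becomes the heatmap. The randomly initialized network is a stand-in for Google's cancer-trained model, so the scores here are meaningless:

```python
import numpy as np
import tensorflow as tf

PATCH = 299  # Inception V3's native input size

# Randomly initialized network standing in for Google's trained model
# (weights=None avoids the ImageNet download).
net = tf.keras.applications.InceptionV3(weights=None)

def heatmap_for(field):
    """Tile an HxWx3 field of view into patches and score each one."""
    rows, cols = field.shape[0] // PATCH, field.shape[1] // PATCH
    heat = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = field[i * PATCH:(i + 1) * PATCH,
                          j * PATCH:(j + 1) * PATCH]
            x = tf.keras.applications.inception_v3.preprocess_input(
                patch[np.newaxis].astype(np.float32))
            # A cancer model would emit P(tumor); this toy just takes the
            # strongest class activation as a placeholder score.
            heat[i, j] = net.predict(x, verbose=0).max()
    return heat

# Small demo field; the ARM's real field of view is 5,120 x 5,120 pixels,
# which tiles into a 17 x 17 grid of patches.
field = np.random.randint(0, 256, (598, 598, 3), dtype=np.uint8)
print(heatmap_for(field))  # 2 x 2 grid of per-patch scores
```

Scoring patches independently keeps each inference at the roughly 300-pixel scale the network was designed for, at the price of running hundreds of inferences per field of view, which is where that Titan Xp earns its keep.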

“The latency quantifies the absolute computational performance of the ARM [augmented-reality microscope] system,” the researchers wrote. Although they used it to study cancer, they believe the device might prove useful for other applications, too.

“Beyond the clinic, the ARM could potentially be useful as a teaching tool by leveraging reverse image search tools that can help trainees quickly search reference resources and answer the question ‘what is this histologic feature that I am looking at?’ More experienced doctors could also leverage the ARM for clinical research to prospectively validate AI algorithms not previously approved for patient care, such as mutational status or microsatellite instability predictions.”

ARM is also promising for another reason. It’s apparently cheaper than “conventional whole-slide scanners” by one or two orders of magnitude. We have asked Google for more comment. ®
