Google has developed an artificial intelligence-powered tool that can help people understand skin conditions.
The Mountain View, CA-based company debuted the web-based application earlier this week during I/O, its annual developer conference. In a blog post, Google noted that the tool isn't available yet and hasn't been evaluated by the FDA.
Here's how it works. Once the tool is launched, a person uses their phone's camera to take three images of the skin, hair, or nail concern from different angles. They are then asked about their skin type, how long they've had the issue, and other symptoms that help the tool narrow down the possibilities. The AI model analyzes this information, drawing on its knowledge of 288 conditions to produce a list of possible matches that the person can then research further.
Google said the model accounts for factors like age, sex, race, and skin type.
With this project, Google is wading deeper into the healthcare space. The company had stepped back from healthcare somewhat after its 2015 restructuring, when Alphabet became the parent company of Google and several subsidiaries. One of those was Verily Life Sciences, which took over many of Google's healthcare-focused projects.
However, Google has still had a hand in healthcare. Last year, the company joined forces with Apple, its tech rival, to develop the Exposure Notification technology, which alerts people if they may have come in contact with someone who has been diagnosed with COVID-19.
And last month, Varian announced it would work with Google Cloud AI to create AI models for organ segmentation – a crucial and labor-intensive step in radiation oncology that can be a bottleneck in the cancer treatment clinical workflow.