Google on Monday released a free artificial intelligence tool to help companies and organizations identify images of child sexual abuse on the internet.
Google's Content Safety API is a developer toolkit that uses deep neural networks to scan and prioritize images, reducing the number of human reviewers who must be exposed to them. Google said the technique can help reviewers identify 700 percent more child abuse content than before.
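Google hasn't published the internals of the Content Safety API, but the triage idea it describes -- a neural network scores incoming images so the likeliest abuse material reaches human reviewers first -- can be sketched generically. The Python below is a hypothetical illustration: `score_image` is a stub standing in for a trained classifier, and `build_review_queue` orders a review queue by that score. None of these names come from Google's toolkit.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class ReviewItem:
    # Store the negated score so the highest-risk images surface first
    # when popped from Python's min-heap.
    priority: float
    image_id: str = field(compare=False)

def score_image(image_bytes: bytes) -> float:
    """Stand-in for a deep neural network classifier; returns a 0-1 risk score."""
    # A real system would run a trained model here; this stub is illustrative only.
    return (len(image_bytes) % 100) / 100.0

def build_review_queue(images: dict[str, bytes]) -> list[str]:
    """Return image IDs ordered so human reviewers see the riskiest items first."""
    heap = [ReviewItem(-score_image(data), image_id) for image_id, data in images.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap).image_id for _ in range(len(heap))]

if __name__ == "__main__":
    batch = {"img_001": b"\x00" * 37, "img_002": b"\x00" * 42, "img_003": b"\x00" * 88}
    print(build_review_queue(batch))  # ['img_003', 'img_002', 'img_001']
```

The point of the design is queue ordering rather than automated takedown: a human still reviews everything, but the model ensures the most urgent material is seen first, which is how fewer reviewers can process more content.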
"Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse," engineering lead Nikola Todorovic and product manager Abhi Chaudhuri wrote in a company blog post Monday. "We're making this available for free to NGOs and industry partners via our Content Safety API, a toolkit to increase the capacity to review content in a way that requires fewer people to be exposed to it."
The use of AI is spreading rapidly across the tech industry for everything from speech recognition to spam filtering. The term generally refers to machine learning technology, such as neural networks, that's loosely modeled on the human brain. Once a neural network has been trained on real-world data, it can, for example, learn to spot a spam email, transcribe your spoken words into a text message or recognize a cat.
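As a concrete example of the training step described above, here is a minimal sketch that fits a small scikit-learn neural network (`MLPClassifier`) on a toy set of labeled emails and uses it to flag spam. The dataset and model settings are purely illustrative, not drawn from any production system.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

# Toy training data: 1 = spam, 0 = legitimate mail.
emails = [
    "win a free prize now", "claim your free money", "cheap pills online",
    "meeting at noon tomorrow", "please review the attached report", "lunch on friday?",
]
labels = [1, 1, 1, 0, 0, 0]

# Convert each email into a bag-of-words feature vector.
vectorizer = CountVectorizer()
features = vectorizer.fit_transform(emails)

# A small feed-forward neural network learns which word patterns mark spam.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(features, labels)

test = vectorizer.transform(["free prize waiting", "see you at the meeting"])
print(model.predict(test))  # Typically prints [1 0]: spam, then legitimate
```

The same pattern scales up: swap the toy emails for millions of labeled examples and the tiny network for a deep one, and the classifier can learn far subtler distinctions, which is the principle behind image-classification systems like the one Google describes.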
The Internet Watch Foundation, which works to minimize the availability of child sexual abuse images online, applauded the tool's development, saying it will make the internet safer.
"We, and in particular our expert analysts, are excited about the development of an artificial intelligence tool which could help our human experts review material to an even greater scale and keep up with offenders, by targeting imagery that hasn't previously been marked as illegal material," Susie Hargreaves, CEO of the UK-based charity, said in a statement. "By sharing this new technology, the identification of images could be speeded up, which in turn could make the internet a safer place for both survivors and users."