
Godfather of AI quits Google to warn us of the dangers of products like ChatGPT

Published May 1st, 2023 11:19AM EDT
Customer service robot. Image: phonlamaiphoto/Adobe


Regarded as the godfather of AI, Geoffrey Hinton left Google last week so that he could speak freely about the dangers of generative AI products like OpenAI’s ChatGPT, Google’s Bard, and others. The University of Toronto professor pioneered the neural network technology that companies use to train AI products like ChatGPT. Now, he is no longer as excited about the future of AI as he once was.

In an interview, Hinton said he worries about both the immediate and the more distant dangers that AI can pose to society.

Speaking with Hinton on the heels of his resignation from Google, The New York Times briefly recapped the professor’s illustrious career.

Hinton began working on neural networks in 1972 as a graduate student at the University of Edinburgh. In the 1980s, he was a professor at Carnegie Mellon University, but he traded the US and the Pentagon’s AI research money for Canada because he wanted to avoid having AI tech involved in weapons.

In 2012, Hinton and two of his students created a neural network that could analyze thousands of photos and learn to identify common objects. Those students were Ilya Sutskever and Alex Krizhevsky, with the former becoming the chief scientist at OpenAI, the company that created ChatGPT, in 2018.

Google spent $44 million to purchase the company that Hinton and his two students started. And Hinton spent more than a decade at Google perfecting AI products.

OpenAI’s ChatGPT start page. Image source: Jonathan S. Geller

The abrupt arrival of ChatGPT and Microsoft’s rapid deployment of the chatbot in Bing kickstarted a new race with Google. Hinton did not appreciate this competition, but he chose not to speak about the dangers of unregulated AI while he was still a Google employee.

Hinton believes that tech giants are in a new AI arms race that might be impossible to stop. His immediate concern is that regular people will “not be able to know what is true anymore,” as generative photos, videos, and text from AI products flood the web.

Next, AI might replace humans in jobs that involve repetitive tasks. Further down the line, Hinton worries that AI will be allowed to generate and run its own code, which could be dangerous for humanity.

“The idea that this stuff could actually get smarter than people — a few people believed that,” the former Google employee said. “But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that.”

Hinton clarified on Twitter that he didn’t leave Google to criticize the company he worked at until last week. He says that Google “has acted very responsibly” on AI matters so far.

Hinton hopes that tech companies will act responsibly and prevent AI from becoming uncontrollable, he told The Times. But regulating the AI space might be easier said than done, as companies might be working on the tech behind closed doors.

The former Googler said in the interview that he consoles himself with the “normal excuse: If I hadn’t done it, somebody else would have.” Hinton also used to paraphrase Robert Oppenheimer when asked how he could have worked on technology that could be so dangerous: “When you see something that is technically sweet, you go ahead and do it.”

But he doesn’t say that anymore. The Times’ full interview is available at this link.

Chris Smith, Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.