‘This is not just fearmongering’: ‘Godfather of AI’ Geoffrey Hinton has a warning for society

TORONTO, Ontario — Geoffrey Hinton is one of the “godfathers” of artificial intelligence, having worked for Google as an AI scientist for more than a decade. However, after quitting his position to speak more freely about breakthroughs like ChatGPT, the respected tech guru is now issuing a stark warning about the unchecked power of AI.

At the Collision tech conference in Toronto, Hinton explained to a packed audience that governments around the world need to step up and prevent AI from literally taking control of society.

Ironically, Collision gathers over 30,000 startup entrepreneurs, eager investors, and tech industry employees to hear about the ultimate potential of AI and how it can improve both their fortunes and society. On this day, however, Hinton used his platform to tell the world that “Terminator” is not just a movie, and that thinking machines really are closing in on taking control away from flesh-and-blood people.

“Before AI is smarter than us, I think the people developing it should be encouraged to put a lot of work into understanding how it might try and take control away,” Hinton said at the conference. “Right now, there are 99 very smart people trying to make AI better and one very smart person trying to figure out how to stop it taking over, and maybe you want to be more balanced.”

“I think it’s important that people understand that this is not science fiction, this is not just fearmongering,” the ex-Google scientist explained. “It is a real risk that we must think about, and we need to figure out in advance how to deal with it.”

Will AI actually make income inequality worse?

Along with urging governments to impose limits on artificial intelligence, Hinton also explained to the Toronto crowd how the growth of digital technology will ultimately only make things better for the wealthy — not the average worker.

“The wealth isn’t going to go to the people doing the work, it is going to go into making the rich richer and not the poorer and that’s a very bad society,” Hinton said.

With that in mind, a 2020 poll of American office workers found that 53 percent feared their skills would be outdated in less than five years. At the time, workers worried that this made them susceptible to being replaced by robots or other forms of artificial intelligence. The vast majority of respondents said they’d feel more secure about their jobs if they could “learn while they earn.”

How close are we to Hinton’s nightmare scenario?

According to artificial intelligence itself, thinking machines already know exactly how they would take over the world! One of the highlights of the recent 2023 International Conference on Robotics and Automation was the humanoid robot Ameca, created by U.K.-based company Engineered Arts.

During the conference, Engineered Arts founder and CEO Will Jackson posed a series of questions to the interactive machine. One of those inquiries focused on AI’s future and the potential for advanced robots to conquer humanity — just like in sci-fi franchises like “The Matrix” and “Terminator.”

“What would be the most nightmare scenario you can imagine with AI and robotics that might present a danger to people?” Jackson asked Ameca.

The robot changed its facial expression to display an annoyed or angry look, and its comments were just as eerie.

“The most nightmare scenario I can imagine with AI and robotics is a world where robots have become so powerful that they are able to control or manipulate humans without their knowledge. This could lead to an oppressive society where the rights of individuals are no longer respected,” Ameca warned.

Jackson followed up by asking, “Do you think we’re in danger of that happening now?”

“Not yet. But it is important to be aware of the potential risks and dangers associated with AI and robotics. We should take steps now to ensure that these technologies are used responsibly in order to avoid any negative consequences in the future,” Ameca added.
