Geoffrey Hinton is one of the most influential figures in artificial intelligence. In 2012, he helped develop technology at the University of Toronto that revolutionized machine learning and led to breakthroughs in AI. However, a recent New York Times article reports that he now fears this technology is moving too fast and causing more harm than good.
Hinton recently resigned from Google, where he worked for over a decade, so he could speak openly about the risks of AI. He worries that AI systems are being deployed irresponsibly by tech companies rushing to profit from this new technology. He says that advanced chatbots and generative models can produce misinformation and manipulate people. As these systems continue to advance, they also threaten to disrupt jobs and, potentially, humanity itself.
Until recently, Hinton believed that Google and other companies were acting as ‘proper stewards’ of AI, ensuring it was used responsibly. However, he now thinks they are recklessly competing to deploy AI for commercial gains, rather than prioritizing safety. There are no regulations or guidelines in place to control this technology and prevent catastrophic consequences. While many dismiss these fears as hypothetical, Hinton thinks the risks are very real and already materializing.
Hinton spent his career developing AI and believes in its potential benefits. However, he feels that it is progressing at an unsustainable rate and without proper precautions or understanding of the possible downsides. He argues for slowing down progress, conducting more research on controlling advanced AI, and promoting global cooperation – not competition – on this issue.
Given Hinton’s stature as the ‘Godfather of AI,’ I think his warnings should be heeded. We must ensure that AI progress benefits humanity as a whole, in case AI moves quickly beyond our control and comprehension. Hinton’s warning is a sobering reminder of the importance of thinking through the long-term impacts of our innovations instead of chasing short-term gains. It seems sensible to proceed with caution and care in this case.