Los Angeles: Canadian filmmaker James Cameron, who won Oscars for “Titanic” and “Avatar”, has expressed fear about an AI-driven arms race, calling it a dangerous thing.
In an interview with Rolling Stone to promote the release of the book “Ghosts of Hiroshima,” which he plans to adapt into a movie, he said he is aware that his “Terminator” franchise could very much become a reality if AI falls into the wrong hands.
“I do think there’s still a danger of a ‘Terminator’-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defence counterstrike, all that stuff,” he was quoted as saying by Variety.
“Because the theatre of operations is so rapid, the decision windows are so fast, it would take a super-intelligence to be able to process it, and maybe we’ll be smart and keep a human in the loop. But humans are fallible, and there have been a lot of mistakes made that have put us right on the brink of international incidents that could have led to nuclear war. So, I don’t know,” he said, expressing his apprehension.
“I feel like we’re at this cusp in human development where you’ve got the three existential threats: climate and our overall degradation of the natural world, nuclear weapons, and super-intelligence,” he continued.
“They’re all sort of manifesting and peaking at the same time. Maybe the super-intelligence is the answer. I don’t know. I’m not predicting that, but it might be,” he added.
“The Terminator” franchise, which kicked off in 1984 with Arnold Schwarzenegger in the lead role, is set in a world where an artificially intelligent defence network known as Skynet has become self-aware and turned on humanity.
Asked about AI’s role in Hollywood, Cameron raised doubts about whether AI could ever come up with a “good story” that could replace screenwriters, though he said it could assist in directing blockbuster movies.