Is Stephen Fry’s warning about an AI ripoff of his voice just the beginning?

Artificial Intelligence (AI) has been a controversial topic in Hollywood lately, and actor and author Stephen Fry has become one of its victims. At the CogX Festival in London, Fry revealed that an AI version of his voice had been used in a documentary without his consent or knowledge.

Fry played a clip of the AI system mimicking his voice and narrating a historical documentary. “I said not one word of that—it was a machine. Yes, it shocked me,” Fry said. The AI system used his readings of the Harry Potter books to create a voice that accurately imitated his own.

As the narrator of the UK Harry Potter audiobooks, Fry was understandably taken aback by the AI’s ability to replicate his voice. “What you heard was not the result of a mash-up; this is from a flexible artificial voice, where the words are modulated to fit the meaning of each sentence,” Fry explained.

More concerning to Fry is the potential misuse of this technology. He warned his agents that this is just the beginning. “It won’t be long until full deepfake videos are just as convincing,” Fry emphasized.

Fry, who has acted in notable films such as Gosford Park and V for Vendetta, sees the current Hollywood strikes as an attempt to address this potential problem for actors. He compared AI to the first automobile, acknowledging its impressiveness but highlighting that it is still a work in progress. “What we have now will advance at a faster rate than any technology we have seen,” Fry stated.

Fry’s experience with the AI ripoff of his voice serves as a warning about the potential dangers of AI technology. While it can be used in creative and innovative ways, there is a real risk of misuse and infringement on an individual’s rights. As the technology continues to evolve at an unprecedented rate, we are, as Fry put it, living in a “f***ing weird time.”