
If you use the phrases ‘hey Siri’ and ‘ok Google’ at least once a day, then it’s safe to say you already have a ‘relationship’ with AI.

 

Even if you are just asking it basic things (see how I didn’t use any pronouns there) – to set an alarm, to check the weather, or to find out who invented the question mark – you will by now have become familiar with those voices. In small doses, and for functional tasks and simple information exchanges, I think most people are comfortable listening to a voice that isn’t real.

 

 

But what if that voice were reading you a story?

 

If you are, like me, one of the millions who subscribe to audiobook services such as Audible, then you will appreciate how the quality of the listening experience relies as much on the person reading the book as on the book itself.

 

And when I say person, I am not just talking about their ‘voice’ in terms of its technical or audio qualities, I am talking about the fact that there is a person behind it, and that person helps create an emotional connection with the words being spoken.

 

Artificial Intelligence can read words. People tell stories.

 

But does everyone feel the same?

 

Perhaps not, or at least that’s the fear of voice actors all around the world (the recent strike in Hollywood was largely about the threat of people being replaced by AI, especially voice actors).

 

The Australian Association of Voice Actors (AAVA) recently told a parliamentary committee investigating AI that the jobs of an estimated 5,000 local voice actors are already in danger, with the group pointing to one national radio network actively investing in technology to replace human voice actors.

 

One area particularly at risk is audiobooks, previously a rich source of work. With AI ‘clones’ of voices becoming so realistic, human voice actors are at real risk of being replaced, as they slowly are in radio and corporate roles. And the sheer volume of work in audiobooks means publishers may be attracted to the large potential savings.

 

While Scarlett Johansson was able to force OpenAI to stop using an eerily accurate clone of her voice in ChatGPT, the vast majority of humble, hardworking voice actors are unlikely to carry as much clout.

 

 

Stories about voices being ripped off are now legion.

 

Vocal cloning company Voicify claims to offer over 3,000 deepfake voices that replicate those of artists, including Adele, Justin Bieber, Phil Collins, Eminem, Ariana Grande, Michael Jackson, Bruno Mars, George Michael, Elvis Presley, Prince, Tupac Shakur, Ed Sheeran, Taylor Swift, and Amy Winehouse. Needless to say, they’ve been sued a few times.

 

 

And it’s not just songs or ads or videos that clones are being used to voice. Think of what else a faked voice can be used for. In one high-profile case, an artificially generated voice purporting to be that of a well-known BBC presenter was used to fraudulently ‘grant’ her permission for her likeness to be used in advertisements for insect repellent!

 

Of course, some companies that use AI voices are taking an ethical approach, paying voice actors a fee to license their voice and then paying royalties when clones of it are used. But one can’t help feeling the royalties being paid are just a fraction of what would previously have been the case.

 

There is also a much darker side to AI voice cloning, with fake voices powering countless scams, causing political unrest and even being used as a ransom tool (in a kidnapping hoax).

 

 

AI is the hottest conversation in advice

In a segue so clunky it proves this piece was written by a human, this brings me onto the topic of AI more broadly, and where it currently fits in advisor conversations.

 

AI is, without question, the hottest topic for advisors right now. Whether it be how to use it in an advice practice, or how to invest in it, most content about AI rates its socks off. I say ‘most’. The downside of topics being hot is that everyone wants to join the conversation, and when you get more voices, the conversation can be a little crowded, which makes it harder to be heard.

 

Blue Ocean conversation mapping

At Ensombl, we see this phenomenon unfold in our blue ocean conversation mapping. By measuring both the frequency of, and engagement with, various topics being discussed by users of our platform, we can see which conversations are interesting but crowded, and those where there is a little more space. Understanding this can help you drive up the ROI of your efforts by zeroing in on those topics where content will be highly appealing, and not have to work as hard to stand out.

 

We’ve mapped over four years of advisor conversations this way. The results are fascinating, and were published in our most recent ‘What Advisors Want’ report.

 

If you want help making your advisor-facing content more relevant, and thus better received, read the report, then get in touch. We’d love to help.

 

More from Richard Dunkerley