The new technology uses AI to break sentences down into sounds and tones. According to Five9, it will bring companies significant savings on personnel costs.
The key here is using a human voice to train the AI. Callan Scheballa, a project manager with Five9, explained:
“At the end of the day people are going to understand that they are talking to a machine, they are going to understand that they are talking to software.”
He added:
“Yet, the voice that it uses, there’s no reason for it to not sound great. Now, the more lifelike that it can sound, at least in our experience, the better the reception of the customers that are going to be talking to it.”
To find its latest voice, Five9 auditioned actors in London and chose Joseph Vaughn to record a series of scripts for the company.
That audio was then broken down into sounds and tones rather than words, which enables the AI program to recreate not only sentences but also distinct moods.
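Working at the level of sounds and tones means extracting frame-by-frame acoustic features such as pitch and energy rather than whole words. The sketch below is illustrative only, not Five9’s or WellSaid Labs’ actual pipeline; it substitutes a synthetic signal for studio narration and uses the open-source librosa library to show the kind of sub-word features involved.

```python
# Illustrative sketch only, not Five9's or WellSaid Labs' actual pipeline.
# A synthetic 150 Hz signal stands in for studio narration; librosa extracts
# frame-level pitch (tone) and energy, the kind of sub-word features a
# neural voice model learns from instead of whole words.
import numpy as np
import librosa

sr = 22050
t = np.linspace(0, 2.0, int(sr * 2.0), endpoint=False)
y = 0.1 * np.sin(2 * np.pi * 150 * t)  # stand-in for a recorded voice

# Pitch contour per frame via probabilistic YIN; NaN marks unvoiced frames.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)

# Root-mean-square energy per frame, a rough proxy for loudness and emphasis.
rms = librosa.feature.rms(y=y)[0]

print(f"{np.count_nonzero(voiced_flag)} voiced frames, "
      f"median pitch ~{np.nanmedian(f0):.0f} Hz, "
      f"mean energy {rms.mean():.4f}")
```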
The software is also trained to detect word combinations and tones in the caller’s voice so that it can respond to the caller’s emotional state. Rhyan Johnson, an engineer at WellSaid Labs who is involved in the project, said:
“We are capturing all of the audio data and all of the combination of frequencies and vibrations that are inherent to a voice, that as a human we would recognise is a voice, but the machine just is guessing sounds.”
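The article does not describe how the system actually classifies a caller’s emotional state, but combining word combinations with vocal tone could look something like the hypothetical sketch below; the phrase lists, thresholds, and labels are assumptions made purely for illustration.

```python
# Hypothetical illustration; the phrase lists, thresholds, and labels are
# assumptions, not details from Five9 or WellSaid Labs.
from dataclasses import dataclass

FRUSTRATION_PHRASES = {"not working", "third time", "speak to a manager", "cancel"}
CALM_PHRASES = {"thank you", "no problem", "that works"}

@dataclass
class ToneFeatures:
    pitch_variance: float  # how widely the caller's pitch swings
    mean_energy: float     # average loudness of the caller's speech

def estimate_caller_state(transcript: str, tone: ToneFeatures) -> str:
    """Combine word combinations with vocal tone to guess the caller's state."""
    text = transcript.lower()
    word_score = (sum(p in text for p in FRUSTRATION_PHRASES)
                  - sum(p in text for p in CALM_PHRASES))
    # Raised, highly variable pitch and louder speech often accompany frustration.
    tone_score = (tone.pitch_variance > 900.0) + (tone.mean_energy > 0.08)
    return "frustrated" if word_score + tone_score >= 2 else "neutral"

print(estimate_caller_state(
    "This is the third time I'm calling and it's still not working",
    ToneFeatures(pitch_variance=1200.0, mean_energy=0.1),
))  # prints "frustrated"
```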
The company says its AI agents have already answered more than 82 million calls for healthcare providers, large retailers, insurance companies, banks, local businesses, and state and local governments.
The new Virtual Voiceover tech will be available next year.