Alexa Learns to Talk Like a Human

techcrunch.com

Sarah Perez:

These new tools were provided to Alexa app developers in the form of a standardized markup language called Speech Synthesis Markup Language, or SSML, which will let them code Alexa’s speech patterns into their applications. This will allow for the creation of voice apps – “Skills” on the Alexa platform – where developers can control the pronunciation, intonation, timing and emotion of their Skill’s text responses.
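To make that concrete, here's a minimal sketch (not from the article) of what an SSML-driven Skill response might look like. The `<speak>`, `<break>`, `<prosody>`, and `<say-as>` tags are standard SSML; the surrounding JSON envelope follows the Alexa Skills Kit response format, and the example text is invented for illustration.

```python
import json

# Illustrative sketch: an Alexa Skill response whose outputSpeech uses SSML
# rather than plain text, letting the developer shape pauses, pacing, pitch,
# and pronunciation of the spoken reply.
ssml = (
    "<speak>"
    "Your package has shipped. "
    '<break time="500ms"/>'
    '<prosody rate="slow" pitch="+10%">It should arrive on Friday.</prosody> '
    'The tracking code is <say-as interpret-as="spell-out">XK42</say-as>.'
    "</speak>"
)

# Alexa Skills Kit response envelope with an SSML outputSpeech object.
response = {
    "version": "1.0",
    "response": {
        "outputSpeech": {"type": "SSML", "ssml": ssml},
        "shouldEndSession": True,
    },
}

print(json.dumps(response, indent=2))
```

Here `<break>` inserts a pause, `<prosody>` adjusts rate and pitch, and `<say-as>` forces the tracking code to be spelled out letter by letter instead of read as a word.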

A clever and important step towards more useful, natural, and ubiquitous vocal computing. Amazon continues to lead the pack here...

Tip: if you're not using the "Alex" voice for speech in iOS, you should be.

