LAS VEGAS—This week at CES, a MacBook analyzed my speech and told me I was feeling happy, confident, and convincing. I'd like to think that was accurate. Minutes later, when one of my TechHive colleagues and a developer got into an opinionated discussion, it picked up feelings of anger and assertiveness—which was probably also accurate.
We were trying out Beyond Verbal, a cloud-based platform that reads emotions in real time by analyzing the intonation of a speaker's voice. Its Web engine, called Moodies, can extract more than 400 variants of moods from voice alone. If you allow Moodies to access your computer's microphone, it will listen to you talk in 20-second spurts, and then present a breakdown of what's really going through your head.
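To make that flow a little more concrete, here's a rough sketch of what a browser app like Moodies might do under the hood: request microphone access, record in roughly 20-second chunks, and send each chunk to a cloud service for analysis. This is purely illustrative; the endpoint URL and payload format below are placeholders, not Beyond Verbal's actual API.

```typescript
// Hypothetical sketch of a Moodies-style capture loop. The endpoint and
// payload format are placeholders, not Beyond Verbal's real API.
async function startMoodCapture(): Promise<void> {
  // Ask the browser for microphone access (the permission prompt mentioned above).
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const recorder = new MediaRecorder(stream);

  recorder.ondataavailable = async (event: BlobEvent) => {
    if (event.data.size === 0) return;

    // Ship the ~20-second audio chunk to a (placeholder) cloud endpoint
    // for emotion analysis.
    const response = await fetch("https://example.com/analyze-voice", {
      method: "POST",
      headers: { "Content-Type": "audio/webm" },
      body: event.data,
    });

    const analysis = await response.json();
    console.log("Mood breakdown:", analysis);
  };

  // Deliver a chunk roughly every 20 seconds (timeslice is in milliseconds).
  recorder.start(20_000);
}
```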
The system breaks each reading down into two categories: your primary mood, which is the most strongly expressed emotion, and your secondary mood, which is less expressed but still present, kind of like the hidden intention behind what you're saying. What's especially interesting is that it's language-agnostic: it focuses on tone, inflection, and intonation rather than on the words themselves.
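Conceptually, each 20-second reading could be modeled as a pair of scored moods, something like the hypothetical shape below (the field names are mine, not Beyond Verbal's).

```typescript
// Hypothetical shape for a single 20-second reading; names are illustrative only.
interface Mood {
  label: string;      // e.g. "happy", "assertive"
  intensity: number;  // relative strength of the detected emotion
}

interface MoodReading {
  primary: Mood;    // the most strongly expressed emotion
  secondary: Mood;  // a weaker emotion still present in the voice
}

const example: MoodReading = {
  primary: { label: "happy", intensity: 0.8 },
  secondary: { label: "confident", intensity: 0.4 },
};
```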