Smartphones are getting smarter. Just as we're getting used to telling them what to do with our voices, now they can read our facial expressions and work out what we're feeling, opening up a world of possibilities.
"We're all about bringing emotional intelligence to our digital world," says Rana el Kaliouby.
Dr el Kaliouby and the team at MIT spin-off Affectiva recorded the facial expressions of more than 3 million people in 75 countries to develop emotion-recognition software.
"We are able to read about 15 facial expressions," she says. "These combine to create or portray eight emotional states: happy, sad, fear, anger, disgust, contempt, confusion, surprise."
So far, the app's primary moneymaker is audience testing of commercials, programmes and movie trailers.
Some 1,400 brands use it to find out, frame by frame, what's funny or sad or spellbinding or boring.
Dr el Kaliouby is working to expand the app into mental health applications, such as detecting depression, and to bring emotional interactivity to everyday life.
"We envision a world where all our devices have an emotional chip," she says. "Things like your car, your fridge, your mirror – where they're all emotionally aware and they adapt to your emotions in real time.
"I do recognise that there's going to be abuses of this technology but I do believe that the good that can come out of this outweighs the abuses and we're pushing on the good."