2013-08-05

Co.Labs

Emotion-Sniffing Is The Next Bizarre Trick Your Phone Is Learning

While hardware can already do a good job of interpreting human emotions through facial recognition, it turns out that your smartphone may be great at tracking them long term. How can you use that data? And do you really want it?



Rice University scientists, working with Microsoft Research, have created something rather surprising--a long-term mood-detecting system that can ascertain a user's emotions with up to 93% accuracy. What's more, there's no EEG headset in use here, and no blood pressure sensor. It's all down to simple smartphone data analysis.

The MoodScope system is a two-parter, with a smartphone acting as a data-collection device and a cloud service hosting the mood model. Since we take our smartphones everywhere and increasingly use them to surf the web, engage in social media, use apps, and so on, they are a fabulous tool for this purpose. MoodScope takes note of how users contact their friends, where they are habitually located, which apps they use, and what sites they visit. It turns out that phone calls are most strongly correlated with mood (often positive moods), with app usage the second strongest signal.
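The article doesn't publish MoodScope's schema, but the idea of turning raw usage events into a daily feature vector can be sketched roughly like this (the event types and log format here are illustrative assumptions, not the actual MoodScope data model):

```python
from collections import Counter

# Hypothetical usage-log records for one day: (event_type, detail) pairs
# of the general kind MoodScope is described as collecting. The field
# names and categories are assumptions for illustration only.
log = [
    ("call", "alice"), ("call", "bob"), ("call", "alice"),
    ("app", "email"), ("app", "browser"), ("app", "email"),
    ("sms", "carol"),
]

def usage_features(events):
    """Turn a day's raw events into per-category frequency fractions --
    the kind of feature vector a mood model could be trained on."""
    counts = Counter(kind for kind, _ in events)
    total = len(events) or 1
    # Normalize so days with different overall activity are comparable.
    return {kind: n / total for kind, n in counts.items()}

features = usage_features(log)
print(features)  # fractions of the day's events per category
```

Normalizing to fractions rather than raw counts is one simple way to keep a quiet day and a busy day on the same scale; the real system would presumably use a richer feature set (location, websites, contact histograms).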

The app runs only intermittently so it doesn't eat up too much power, and builds an approximately 1MB log file that gets uploaded every night.

The cloud aspect of the app builds up a mood model of the user's habits, adjusting it based on incoming data and the user's own self-reported mood states. The locally installed app can make these assessments too, but they wouldn't necessarily be calculated using the most up-to-date algorithm. Early on, the estimates of the user's mood are about 66% accurate, and accuracy rises as the algorithm is refined--eventually pegging whether the user is in one of several states like "relaxed," "bored," or "excited."
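The paper behind MoodScope describes a personalized regression model; as a toy stand-in, here is a minimal sketch of the general pattern--a nearest-centroid classifier over daily usage-feature vectors that is retrained each time the user supplies a self-reported mood label. Everything here (the class, the labels, the data) is synthetic and illustrative, not MoodScope's actual algorithm:

```python
# Toy personalized mood model: predictions should improve as more
# user-labeled days are folded in, mirroring the 66% -> 93% trajectory
# described in the article. All data below is synthetic.

class MoodModel:
    def __init__(self):
        self.sums = {}    # mood label -> summed feature vector
        self.counts = {}  # mood label -> number of labeled days

    def report(self, features, mood):
        """Fold one user-reported (features, mood) pair into the model."""
        s = self.sums.setdefault(mood, [0.0] * len(features))
        self.sums[mood] = [a + b for a, b in zip(s, features)]
        self.counts[mood] = self.counts.get(mood, 0) + 1

    def predict(self, features):
        """Guess the mood whose centroid is closest to today's features."""
        def dist(mood):
            centroid = [v / self.counts[mood] for v in self.sums[mood]]
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self.sums, key=dist)

model = MoodModel()
# Synthetic habit: heavy calling on "happy" days, heavy app use on "bored" days.
model.report([0.8, 0.2], "happy")
model.report([0.7, 0.3], "happy")
model.report([0.1, 0.9], "bored")
print(model.predict([0.75, 0.25]))  # → happy
```

The split the article describes--nightly uploads, model refinement in the cloud, stale copies on the handset--falls out naturally from a design like this: `report` runs server-side on fresh labels, while `predict` can run anywhere a (possibly outdated) copy of the model lives.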

The team behind the app notes that it's not perfect, and everyday life is jammed with nearly unmeasurable and unpredictable events that can influence mood beyond the detection limits of the app. But the researchers are confident that in time they will be able to improve the accuracy of the mood-sensing algorithm, particularly by including extra measurements like monitoring audio for stress or other vocal fingerprints. This would probably be battery intensive, but there are already a number of devices that will listen to your every word in order to watch for a command phrase.

But what exactly is this data for? At heart it's simply an extension of the quantified self, expanding measurements from strictly physical properties like exercise or location, which are pretty simple to collect, to more esoteric properties.

The Kickstarter project Melon is one great example of a less tangible data collection system--it's designed to measure the "focus" of its wearer by correlating a user's attention with the activity they're taking part in via a simple EEG sensor. The idea is that by keeping track of their "focus" as assessed by the sensor and its associated app, the user will become aware of how their activity is helping them pay attention. It's easy to imagine a system like MoodScope being used for similar self-awareness assessments of all sorts--bearing in mind that during an unexpected event a user is less likely to be good at assessing their own mood than an automated system that has learned their habits.

There's also the intriguing idea that automated mood tags could become an interesting new shared label on social media services like Twitter or Facebook. Considering how we all use amusing hashtags on Twitter already, and even Instagram our dinner or important life events and share them with the world, this isn't too difficult or outlandish a scenario to imagine. This is another novel use of smartphone-based mood-sensing devices, although it's hard to imagine that it would be easy to monetize.

More monetizable is the idea of mood-adapting services or third-party apps. Existing apps like Spotify already try to offer mood-appropriate playlists based on the whim of the user, and this sort of service could improve by adapting in real time to a sensed mood. Gaming is another obvious area where automated mood sensing could be very powerful, with games introducing different elements based on what the user is feeling (and this is something Microsoft is looking at for its next-gen Kinect).

The practical upshot: Over the next year or so, be prepared to think about how users are feeling when they use your app, and to work out how you can use this data to your advantage. Right now this is a research topic, but all it would take is a player like Apple or Samsung (which loves bolting extra features onto its devices) to push smartphone mood sensing into the limelight.

[Image: Flickr user Tauno Tõhk]







1 Comment

  • vint

    I hadn't thought of applying it to games yet, which is odd, as I once spent quite a while trying to get a pipeline from webcam facial recognition, to emotion deduction, to my avatar's facial animations working. I can even imagine a game being able to change my mood for the positive: a dark, gloomy, cloud-packed scene where suddenly a few rays of light break through. All of a sudden the sun is shining and there are nice flowers. Or, more Black & White style, your mood information gets sent to the NPCs, and if you are grumpy, they'll be less friendly too.