The Guardian - UK
Technology
Jack Schofield

What if your laptop knew how you felt?

The Christian Science Monitor has a report about ongoing research by Rosalind Picard's Affective Computing Group at MIT:

"Mind Reader" uses input from a video camera to perform real-time analysis of facial expressions. Using color-coded graphics, it reports whether you seem "interested" or "agreeing" or if you're "confused" about what you've just heard. (You can read more about Picard and postdoc researcher Rana el Kaliouby's project in detail on MIT's website).

The system was developed to help people with autism read emotions, as they have difficulty decoding when others are bored, angry, or flirting. Their lack of responsiveness makes them seem insensitive to others. Ms. Picard's team uses cameras worn around the neck or on baseball caps to record faces, which the software can then decode.

The second page of the story looks at other systems…
