Intel’s RealSense project now enables its tech to feel
You had a bad day, the camera don't lie. And before we sing any more of the song, here's why.
Intel has been talking about its RealSense technology for a few months now, so you should already know that it’s capable of recognising individual fingers, depth, and facial expressions.
But a new development suggests that RealSense 3D cameras can, among other things, read human emotions. Now, we’re not sure about you, but what first springs to our minds is Sonny from I, Robot (the android that could feel). Imagine Intel computers doing the same.
BetaBeat claimed that by analysing the shape of the user’s eyes, lips, and cheeks, the camera would be able to determine whether you’re happy, disappointed, or even aggravated.
RealSense recognises more than it should
“People have 3D sensing, so the robot should have 3D sensing like us. It will recognise you, read your emotions. ‘Why are you sad today? Should I sing you a song?’ The future is crazy,” Achin Bhowmik, Intel’s CTO of perceptual computing, was quoted as saying.
This is a massive step up from what we initially thought it was – an attempt at plugging an Xbox Kinect-style sensor into a desktop and toying with it. The project now takes 3D in a totally different direction and pushes the boundaries of what can be achieved on a desktop.
In addition to getting all touchy-feely, the technology can also remove backgrounds, recognise 10-finger gestures, and perform 3D scans, thanks to its 1080p camera and high-quality depth sensors.
If you’re as intrigued by it as we are, check out Intel’s dedicated RealSense YouTube channel.