Google Glass can now detect emotions with this app
If you can’t tell that a person is happy from the smile on their face or unhappy from the tears streaming down their cheeks, here’s Google Glass to the rescue. It now has a new app that helps you detect a person’s emotional state.
This new Google Glass app is named Shore, and besides raising the question of whether we now need tech to tell us how someone feels, it performs real-time face detection and analysis.
Created by Fraunhofer IIS, the app uses analysis software that works in tandem with Google Glass. It also estimates the person’s age and gender, among other attributes, though it does not identify the person.
According to its creators, though, the app isn’t aimed at everyone. Fraunhofer IIS stated on its website that it is intended for people who have trouble reading emotions from facial expressions, such as those with autism or similar disorders.
Shore is functional.
“It is the first emotion recognition software in the world to function in real-time with Google Glass. This opens up an entire spectrum of new smart eyewear applications, including communication aids for people with disorders,” it claimed.
This communication aid displays the real-time analysis in the wearer’s field of vision on Google Glass, and a visually impaired user can choose to have the information read out via supplementary audio instead.
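The kind of anonymized, per-face readout described above could be sketched roughly like this. Note this is a minimal illustration only: the field names, scores, and caption format are hypothetical and do not reflect Shore's actual API.

```python
from dataclasses import dataclass

@dataclass
class FaceAnalysis:
    # Hypothetical per-face result; emotion scores are in [0, 1].
    happy: float
    sad: float
    surprised: float
    angry: float
    est_age: int   # estimated age in years
    gender: str    # estimated gender; note that no identity is stored

def overlay_text(face: FaceAnalysis) -> str:
    """Build the short caption a heads-up display (or audio readout) could use."""
    emotions = {"happy": face.happy, "sad": face.sad,
                "surprised": face.surprised, "angry": face.angry}
    dominant = max(emotions, key=emotions.get)
    return f"{dominant} ({emotions[dominant]:.0%}), {face.gender}, ~{face.est_age} yrs"

print(overlay_text(FaceAnalysis(0.82, 0.05, 0.10, 0.03, 31, "female")))
# → happy (82%), female, ~31 yrs
```

The same string could be rendered on the Glass display or passed to a text-to-speech engine for the audio option mentioned above.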
Fraunhofer IIS hopes the application will have a positive impact on people, and highlights that the technology can also be integrated into other applications beyond Google Glass, such as market research analysis or even interactive games.
[Source and image: Fraunhofer IIS]