By Bill Bennett, Enterprise Architect
Applications may soon be better able to “get it” — that is, understand how you are feeling. Microsoft recently announced a new group of technologies called Project Oxford (www.projectoxford.ai/). What makes the announcement interesting is how it takes machine learning a step further: developers will be able to create smarter apps that help users get their point across efficiently and subtly, just as they would in a face-to-face conversation.
Plenty of today’s technology is similar to, though not quite as sophisticated as, emotion recognition. Face-recognition technology can already distinguish faces in photos, group faces that look alike and determine when two faces belong to the same person. This tech has been popping up everywhere, from websites that guess your age from a photograph (www.how-old.net) to sites that recognize and rate your facial hair in an effort to raise charitable donations. Facebook has also put facial recognition to use: people’s faces are now automatically picked out of the crowd in uploaded photos, making tagging easier. The Windows Hello feature of the new Windows 10 operating system is the latest practical application of these capabilities. With it, users can log in with an image of their face instead of having to remember that old username-and-password combination.
This type of technology isn’t limited to facial recognition. Speech-processing assistants such as Apple’s Siri and Microsoft’s Cortana recognize speech, translating it into text and back again. Language-interpretation technology, like Apple’s Autocorrect or Amazon’s Echo, lets applications understand what users mean when they say or type something in natural, everyday language. With experience, these systems get better at predicting what the user wants and figuring out what to do next.
Project Oxford aims to take things one step further with technology that can recognize emotion. The idea is that any application could identify eight emotional states—anger, contempt, disgust, fear, happiness, neutrality, sadness and surprise—based on the universal facial expressions that reflect those feelings, and then gauge people’s reactions to a situation.
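To make that concrete, here is a minimal sketch of how an app might consume that kind of output. It assumes a response shaped like an emotion-recognition service’s JSON — one entry per detected face, with a 0-to-1 score for each of the eight emotions — and picks the strongest emotion per face. The field names and sample values here are illustrative assumptions, not an official API contract.

```python
import json

# Illustrative sample response (assumed shape, not an official schema):
# one object per detected face, each with a score per emotion.
SAMPLE_RESPONSE = json.dumps([
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
        "scores": {
            "anger": 0.01, "contempt": 0.02, "disgust": 0.01, "fear": 0.01,
            "happiness": 0.89, "neutral": 0.04, "sadness": 0.01,
            "surprise": 0.01,
        },
    }
])

def dominant_emotions(response_json):
    """Return the highest-scoring emotion for each detected face."""
    faces = json.loads(response_json)
    return [max(face["scores"], key=face["scores"].get) for face in faces]

print(dominant_emotions(SAMPLE_RESPONSE))  # prints ['happiness']
```

An app could use a result like this to adapt on the fly — for example, offering help when it sees frustration rather than waiting for the user to ask.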
Microsoft is not alone in starting a conversation along these lines, but the effort can only help spark innovation in using technology to deliver better, faster and less expensive services. Any of the examples above could prove a hit or a miss as the technology matures and faces the critique of the marketplace. Only time will tell how this idea evolves and what new in-demand products will bring even more value and efficiency to modern life.
*The content provided in this blog consists of the opinions and ideas of the author alone and should be used for informational purposes only. VyStar Credit Union disclaims any liability for decisions you make based on the information provided.