Wearables and Artificial Intelligence Extend Enterprise Capabilities

[Image: Wearable gesture control]

How are new technologies in wearables and artificial intelligence bringing automation that lifts our productivity? This article explores the advancement and application of artificial intelligence (AI) technologies in the enterprise: machine learning, deep learning, natural language processing (NLP), robotics, and affective computing.

According to research compiled by Atlassian, based on data from EmailStatCounter, University of California, Microsoft Research, BBC News, EffectiveMeetings, and Salary.com, excessive emails, meetings, and interruptions sap our productivity at work. The average employee checks their email 36 times every hour, and it takes 16 minutes to refocus after handling an incoming email. Employees attend an average of 62 meetings per month, half of which are considered wasted time. The average employee is interrupted 56 times per day and spends roughly two hours per day recovering from those distractions. All of this means that 60 percent or less of work time is actually spent productively.

Weren’t enterprise resource planning (ERP) systems supposed to make our lives easier? Yet even with all the system integrations, data warehousing, and abstraction layers, employees still have to keep potentially hundreds of fields up to date.

The office is about to get smarter. Cloud ERP systems will become even smarter, enabling higher productivity, organizational efficiency, and revenue generation. AI-based ERP systems, working symbiotically with smart wearables, will help identify business opportunities, profile meeting participants, take notes, update opportunities, and create action items on behalf of users. Employees will use intuitive gesture-control rings and eye-tracking VR headsets to access information and to generate and manipulate content in the air. Meanwhile, computer vision, voice analysis, and physiological sensor data relayed to the ERP’s machine learning system will evaluate participants’ facial gestures, body movements, voices, and biometrics for non-verbal signals and for congruency between what was said and how participants really felt.
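As a rough illustration, the sketch below (in Python) shows how such a system might fuse multimodal signals into a single congruency estimate. The signal names, weights, and scoring function are hypothetical assumptions for illustration, not a description of any particular vendor’s implementation.

# Hypothetical sketch: fuse facial and vocal signals into a "congruency"
# score comparing stated sentiment with non-verbal sentiment.
# All field names and weights are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class MeetingSignals:
    transcript_sentiment: float   # -1.0 (negative) .. +1.0 (positive), from NLP
    facial_valence: float         # from computer vision on facial expressions
    vocal_valence: float          # from voice/prosody analysis

def nonverbal_valence(s: MeetingSignals) -> float:
    """Weighted blend of the non-verbal channels (weights are assumptions)."""
    return 0.6 * s.facial_valence + 0.4 * s.vocal_valence

def congruency(s: MeetingSignals) -> float:
    """1.0 means words and non-verbal signals agree; 0.0 means they fully contradict."""
    return 1.0 - abs(s.transcript_sentiment - nonverbal_valence(s)) / 2.0

signals = MeetingSignals(transcript_sentiment=0.8, facial_valence=-0.2, vocal_valence=-0.1)
print(f"congruency: {congruency(signals):.2f}")  # a low value flags a possible mismatch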

 

Office Productivity

VR Headsets

Wearables offer hands-free, ambient computing. Virtual reality (VR) headsets with eye-tracking capabilities are becoming a reality. Microsoft HoloLens blends the virtual world with the real world. It showcases how we are shifting away from being chained to our computers and desks toward performing work anywhere, tether-free and hands-free. In the same productivity category are Atheer Labs’s AiR Smart Glasses and the Meta 1 developer kit.

Gesture-Based Control Rings

Ring ZERO by Logbar is a gesture-control ring that lets you control presentations and smart devices (garage doors, light bulbs), take pictures, send tweets, and play gesture-based games. Competitors include Nod, Fin, and 16Lab.

Brain-Computer Interface / EEG Headbands

Emotiv’s EPOC and EPOC+ EEG systems and the NeuroSky MindWave offer brain performance and emotional metrics, measuring and tracking focus, engagement, interest, excitement, affinity, relaxation, and stress levels. Mental commands such as push, pull, levitate, and rotate, and even hard-to-visualize commands such as disappear, are feasible. EEG headbands can also detect facial expressions such as blinks, surprise, and smiles.
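To make this concrete, here is a minimal sketch of deriving a focus-style metric from EEG band powers. The beta / (alpha + theta) ratio is a commonly cited engagement index, but the sample values and threshold below are illustrative assumptions, not Emotiv’s or NeuroSky’s actual algorithms.

# Hypothetical sketch: derive a simple "focus/engagement" score from EEG band powers.
# The numbers and threshold are illustrative, not any vendor's real algorithm.

def engagement_index(theta: float, alpha: float, beta: float) -> float:
    """Higher values suggest greater task engagement."""
    return beta / (alpha + theta)

# Example band powers (arbitrary units) averaged over a short time window.
samples = [
    {"theta": 4.1, "alpha": 5.0, "beta": 6.3},   # focused reading
    {"theta": 6.8, "alpha": 7.2, "beta": 3.9},   # drowsy or distracted
]

for s in samples:
    score = engagement_index(**s)
    state = "engaged" if score > 0.6 else "losing focus"
    print(f"index={score:.2f} -> {state}")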

Biometric Authentication Security

With Nymi, there are no more PINs, passwords, or keys. The Nymi band supports persistent identity authentication using the heart’s unique signature.

Better Posture

Lumo Lift tracks posture and gently vibrates when you need to sit or stand tall. It addresses major white-collar, work-related hazards such as herniated discs, back pain, and carpal tunnel syndrome brought on by poor posture and extended periods of sitting.

 

Artificial Intelligence / Cognitive Science

CRM

In the CRM category, Salesforce.com, with its $390 million acquisition of RelateIQ, is working to integrate artificial intelligence into its platform to add predictive capabilities and to sort through big data. RelateIQ uses machine learning and deep learning to automatically pull and summarize information from email accounts, calendars, contact books, and marketing automation systems so salespeople can communicate easily with potential customers. The system highlights which prospects most need your attention so you can close more deals. Startups with similar capabilities include Infer and Clari.
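For a sense of what predictive lead scoring looks like under the hood, here is a minimal sketch using logistic regression over simple engagement features. The features and data are made up for illustration; this is not RelateIQ’s, Infer’s, or Clari’s actual model.

# Hypothetical sketch: score which prospects most need attention, using
# logistic regression over simple engagement features.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per prospect: [emails_exchanged, days_since_last_reply, meetings_held]
X_train = np.array([
    [12, 2, 3], [3, 30, 0], [8, 5, 2], [1, 60, 0], [15, 1, 4], [2, 45, 0],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = deal closed, 0 = deal lost

model = LogisticRegression().fit(X_train, y_train)

prospects = {"Acme Corp": [9, 4, 2], "Globex": [2, 40, 0]}
for name, feats in prospects.items():
    p = model.predict_proba([feats])[0, 1]
    print(f"{name}: {p:.0%} likely to close")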

Task Automation

x.ai is an AI-based personal assistant that schedules meetings for you. All you have to do is include amy@x.ai on your emails, and it will sort out people’s availability and schedule a meeting. What’s interesting is that when people receive an email from amy@x.ai, they respond with all the usual pleasantries, thinking that she’s human.

EasilyDo Assistant saves time by handling administrative tasks, such as reminding you when to leave for an appointment, automatically dialing into conference calls with PINs, merging duplicate contacts, and alerting you to emails from specific people.

Robotics

Robots will play a bigger role at work. Already widely used in manufacturing and heavy industry, they are making their way into homes and, soon, offices. The Japanese robot Pepper can read and respond to emotions. Honda’s ASIMO, marketed as the world’s most advanced humanoid, is capable of opening things, holding objects, turning on a light switch, pushing a cart, carrying a tray, and kicking a soccer ball.

 

Natural Language Processing

Social Media

Facebook has acquired Wit.ai, a speech recognition startup that provides an API for building voice-activated interfaces and has over 9,500 developers on its platform. Wit.ai voice control will likely be integrated into Facebook’s native app or added as a voice-to-text input for Messenger. Other examples include Siri, Google Now, and Microsoft Cortana, which let you use your voice to send messages, make calls, and set reminders.
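As a rough sketch of what building on such an API looks like, the snippet below sends a transcribed utterance to Wit.ai’s message endpoint and prints whatever intents and entities come back. The token and version date are placeholders, and the response schema has changed across API versions, so the result is printed rather than parsed.

# Rough sketch: send a transcribed utterance to Wit.ai's /message endpoint.
# Token and version date are placeholders.

import requests

WIT_TOKEN = "YOUR_SERVER_ACCESS_TOKEN"  # placeholder

resp = requests.get(
    "https://api.wit.ai/message",
    params={"q": "remind me to call the supplier at 3pm", "v": "20150523"},
    headers={"Authorization": f"Bearer {WIT_TOKEN}"},
)
resp.raise_for_status()
print(resp.json())  # inspect the intents/entities returned for the utterance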

Enterprise Computing

IBM Watson is the AI and natural language processing platform that won Jeopardy! against human champions. IBM recently acquired AlchemyAPI. The acquisition gives Watson critical machine learning technology and access to a community of over 40,000 developers who are building cognitive apps. AlchemyAPI augments Watson’s ability to identify information hierarchies and to understand relationships between people, places, and things across structured and unstructured data. It also provides a key missing capability for Watson: visual recognition technology that can automatically detect, label, and extract important details from images.
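For a feel of the kind of entity extraction AlchemyAPI offered, here is a minimal sketch against its legacy REST endpoint. The endpoint path, parameters, and response fields are recalled from the old pre-Watson documentation and should be treated as assumptions; the service was later folded into IBM’s Watson offerings.

# Rough sketch: extract ranked named entities from text via the legacy
# AlchemyAPI REST endpoint. Endpoint, parameters, and response fields are
# assumptions based on the old documentation.

import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # placeholder
TEXT = ("Salesforce.com acquired RelateIQ for $390 million to add "
        "predictive capabilities to its CRM platform.")

resp = requests.post(
    "https://access.alchemyapi.com/calls/text/TextGetRankedNamedEntities",
    data={"apikey": API_KEY, "text": TEXT, "outputMode": "json"},
)
resp.raise_for_status()
for entity in resp.json().get("entities", []):
    print(entity.get("type"), "-", entity.get("text"), "-", entity.get("relevance"))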

3D Avatars

Geppetto Avatars brings virtual avatars to life to enable human-computer interaction using natural language processing. The platform combines 3D avatar animation, artificial intelligence, natural language processing through IBM Watson, and sound and image analysis to provide avatars capable of listening, holding contextual conversations, and responding with gestures. A current vertical application is Sophie, an NLP avatar that triages your health care needs: she listens to your health concerns and searches her knowledge repository for helpful answers.

 

Affective Computing

Human beings are sentient: we are able to perceive, experience, and feel. Whether we are talking about health, retail, or marketing, facts and figures alone do not drive decisions. How we feel is often the driver behind our decisions, behaviors, habits, and attitudes. Amyx+ will debut its affective computing platform in the fall. It is capable of detecting and classifying emotions, cognitive load, context, and stimuli to present meaningful, actionable insights for users and brands.
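To make the idea concrete, here is a minimal sketch of classifying emotional state from wearable-derived physiological features. The features, labels, and classifier choice are illustrative assumptions and do not describe the Amyx+ platform’s actual methods.

# Hypothetical sketch: classify emotional state from wearable-derived
# physiological features. Features, labels, and classifier are illustrative.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Features per time window: [heart_rate, heart_rate_variability, skin_conductance]
X_train = np.array([
    [62, 80, 2.1], [65, 75, 2.4],   # calm
    [95, 35, 7.8], [99, 30, 8.5],   # stressed
    [78, 55, 5.0], [81, 52, 5.3],   # excited
])
y_train = np.array(["calm", "calm", "stressed", "stressed", "excited", "excited"])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

window = np.array([[92, 38, 7.1]])  # new reading from a wearable
print("estimated state:", clf.predict(window)[0])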

There are numerous applications, ranging from campaign management, brand engagement, live events, retail, and customer service to healthcare.

Campaigns

Campaign triggers can now be based on emotions, in addition to geofence tripping, social media activity, or other common methods.

Live Events

When an attendee registers and opts in at a live event, sponsors and organizers can send campaign messages and unlock rewards, promotions, or ads based on emotions. For instance, at a car show, the AI-based affective computing system can tell when an attendee responds favorably to a particular make and model. That means during or after the event, the organizer can target the right car to the right person.

Retail

From Louis Vuitton to Gap, when customers log into the store’s Wi-Fi and accept the privacy terms, we can determine when someone responds emotionally to a particular handbag or article of clothing. This could then trigger a special message or promotion tailored to that shopper. It’s not only about knowing what they love, but what they love that they can also afford.

Hospitality & Customer Service

Be it hotels, restaurants, cruises, airlines, or any service provider where customer service is important, the emotion-sensing platform can determine whether someone is having a positive or negative customer service experience. Rather than sending a JD Edwards customer satisfaction survey once a year, any negative customer service interaction can be brought to the staff’s attention in real time for immediate correction and resolution. It also helps the organization determine the optimal product and service offerings to shore up these potentially low areas of customer satisfaction.
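A minimal sketch of that real-time escalation rule follows; the threshold, event fields, and notification hook are hypothetical.

# Hypothetical sketch: flag negative customer-service interactions in real
# time so staff can intervene. Threshold, event fields, and the notification
# hook are illustrative assumptions.

from typing import Iterable

NEGATIVE_THRESHOLD = 0.7  # assumed cut-off for the negative-emotion score

def notify_staff(event: dict) -> None:
    # In practice this might page the floor manager or open a service ticket.
    print(f"ALERT: guest {event['guest_id']} at {event['location']} "
          f"(score {event['negative_emotion_score']:.2f}) needs attention")

def monitor(events: Iterable[dict]) -> None:
    for event in events:
        if event["negative_emotion_score"] >= NEGATIVE_THRESHOLD:
            notify_staff(event)

monitor([
    {"guest_id": "G-1042", "location": "front desk", "negative_emotion_score": 0.85},
    {"guest_id": "G-2177", "location": "restaurant", "negative_emotion_score": 0.30},
])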

Healthcare

In patient care assessment, from hospital visits to blood draws, the cognitive science platform can automatically determine patients’ emotional reactions to understand their level of satisfaction.

The system can work symbiotically with drug compliance technologies such as GlowCap and eCap to determine whether a patient hates taking a particular medication.

For medical advice, the intelligent system can measure a patient’s emotional reaction to the doctor’s recommendations. For instance, suppose the doctor suggests eating more raw, organic vegetables in lieu of red meat and starting a cholesterol-lowering drug. The data analytics platform can measure how that advice was received emotionally by the patient and determine how likely that person is to comply with it.

Wearables working in conjunction with Amyx+’s neuroscience platform inform wearers about their anxiety and stress levels, mental focus, and general productivity. The system can prompt the user to take specific steps to optimize their cognitive load and emotional mood, e.g., caffeine, exercise, or neurochemical supplements.

 

Conclusion

We are at an exciting time. From the office and enterprise systems to customer interactions, wearables and AI are bringing incredible new capabilities to market. Smart systems will help automate many manual processes and tasks to increase worker productivity.

What are the practical applications of these technologies in the near term? Enterprise systems with AI, deep learning, and NLP will be able to take notes, identify which leads are most likely to convert, update the CRM for you, and create action items. We’ll spend less time inputting data and performing manual tasks.

AI, machine learning, and deep learning will continue to become the bedrock of decision support and actionable intelligence.

The field of affective computing is enabling us to measure the emotional reactions of customers, with horizontal applications across sectors and use cases.

Companies like IBM, Facebook, Salesforce.com, and others are spending millions of dollars to acquire these new technologies. They’re not waiting. The reality of smart enterprise computing isn’t a distant dream; it’s happening now. You can play it safe, wait until everyone else adopts, and miss the boat entirely, or you can take an active role in shaping the future. The decision is yours.

Originally published on Examiner on May 23, 2015. Author Scott Amyx.