Fast and Furious: Why Millisecond Wearable Interactions Will Have You Wanting More

Wearable computing will fundamentally change the user experience. Future wearable apps will become more anticipatory and less user-initiated. Rather than requiring input from users, these apps will be aware of situations and states, presenting contextually useful, timely information in the least intrusive manner.

Moreover, app interactions will shrink from minutes to seconds: a flash of information, then the interaction is over until the next event. The app will deliver high-value information in snippets as you go about your day. It will always be on in the background, listening and making sense of your context and activities.

Mobile and web design best practices won’t be enough to build compelling wearable apps. Developers will need to think entirely differently, from wearable app design to the low-energy app architecture that supports always-on operation. App designers can no longer assume that the de facto user experience starts with a user launching an app and then proceeding through a pre-defined UI workflow.

Because of small form factors, locomotion, task orientation, immediacy, expectations of smarter systems (greater automation, integration, and computing power), and multi-device platform interactions, users will come to expect their wearable apps to anticipate their needs and to proactively present relevant information to them.

To better grasp why wearable interactions will come in short bursts, lasting but a few seconds, let’s explore a couple of use cases. (Note that some gaming, entertainment, and productivity apps will continue to be on-demand, with longer engagement segments.)

You’re window-shopping at a mall. Based on your geolocation and proximity to stores, you receive in-store deals on your smartglasses via an iBeacon-like app. Based on your eye movements and the length of your gaze at a particular fashion item, product highlights appear in your field of vision. The app might even offer complementary pieces to go with the outfit. At no point did you have to stop to pull out your smartphone and launch an app; the wearable experience is seamlessly integrated into your shopping experience with minimal distraction.
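
To make that trigger concrete, here is a minimal sketch of the decide-to-interrupt logic. The thresholds, the beacon and gaze data shapes, and the function names are all invented for illustration; they are not any vendor’s SDK.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative thresholds; real values would be tuned per device.
GAZE_DWELL_THRESHOLD = 2.0   # seconds of sustained gaze before reacting
NEAR_RSSI_THRESHOLD = -60    # dBm; stronger (less negative) means closer

@dataclass
class Beacon:
    store: str
    rssi: int                # received signal strength, in dBm

@dataclass
class Gaze:
    item: Optional[str]      # what the eye tracker reports the user is viewing
    dwell: float             # seconds of sustained gaze on that item

def card_to_show(beacons: List[Beacon], gaze: Gaze) -> Optional[str]:
    """Return a message to flash on the display, or None to stay silent."""
    near = [b for b in beacons if b.rssi >= NEAR_RSSI_THRESHOLD]
    if not near or gaze.item is None or gaze.dwell < GAZE_DWELL_THRESHOLD:
        return None
    closest = max(near, key=lambda b: b.rssi)
    return f"{closest.store}: deal on {gaze.item}"

# Example: near one storefront, three seconds of gaze on a jacket.
print(card_to_show([Beacon("Outfitter", -52)], Gaze("jacket", 3.1)))
```

Note that the default is silence: the app interrupts only when both proximity and attention signals agree, which is what keeps the interaction unintrusive.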

For a weekend getaway, you decide to travel to Carmel-by-the-Sea near Pebble Beach. You and your significant other stroll through downtown, checking out the eateries, retail shops and art studios. Your smartglasses app notices that you have stopped in front of a restaurant to read the menu. It automatically flashes the restaurant’s reviews and ratings with photos on your screen. A few minutes later, the system recognizes that you’re walking away from the restaurant. Leveraging your past dining history and preferences, the app suggests an Italian restaurant around the corner with a higher rating that happens to be famous for its Scialatiella alla Pescatore, one of your all-time favorites.

In these examples, you never launched an app or initiated a command. Though the app interaction is brief, behind the suggestion is a powerful recommendation engine.

Wearable apps of tomorrow will handle data analytics using BigQuery, Hadoop, Monte Carlo simulations and machine learning models to turn terabytes of sensor-generated data into meaningful, actionable insights. What you, the user, experience is relevant, timely information at the zenith of its value. As the system learns over time, its recommendations become more relevant and on target.
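
As a toy illustration of the scoring behind a suggestion like the restaurant pick above, consider a simple content-based ranker. The preference weights, favorite-dish list, and restaurant data here are fabricated for the sketch; a production engine would learn them from history rather than hard-code them.

```python
# Toy content-based scorer: rank nearby restaurants against a user
# profile learned from past dining history. All data here is invented.

user_profile = {
    "cuisine_weights": {"italian": 0.9, "seafood": 0.7, "steakhouse": 0.2},
    "favorite_dishes": {"scialatiella alla pescatore"},
}

nearby = [
    {"name": "Trattoria Corner", "cuisine": "italian", "rating": 4.6,
     "menu": {"scialatiella alla pescatore", "osso buco"}},
    {"name": "Grill House", "cuisine": "steakhouse", "rating": 4.8,
     "menu": {"ribeye", "porterhouse"}},
]

def score(restaurant):
    s = user_profile["cuisine_weights"].get(restaurant["cuisine"], 0.1)
    s += 0.5 * len(user_profile["favorite_dishes"] & restaurant["menu"])
    s += 0.1 * restaurant["rating"]
    return s

best = max(nearby, key=score)
print(best["name"])  # the Italian place wins despite the lower raw rating
```

The point of the sketch: the ranking is personal, not global, so the “best” restaurant is the best for this user at this moment.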

Under the Hood of Tomorrow’s Wearable Apps

Data collection. Wearables are not one device but an ecosystem of wearable computers with embedded processors and multi-sensor-laden devices (SLDs). Together they form a nervous system: the smart computing units are the central nervous system, the brain, while the SLDs are the peripheral nervous system, coordinating voluntary and involuntary actions and transmitting signals between the parts.

The adage that good data drives good analytics is amplified with wearables. Depending on the specs of the wearable hardware your app is built on, you have access to a myriad of sensors: geolocation, accelerometer, gyroscope, magnetometer, pressure, altimeter, temperature, electrodermal response, and security and health biometrics. Your wearable app has to take these disparate data points from sensors, as well as structured and unstructured data from other services, to capture holistic data on users and construct their behavioral, cognitive, social, and health profiles. These models will then drive your predictive recommendation engine.
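
One way to sketch that fusion step: normalize every sensor’s readings into a common event record, then fold them into a rolling user profile. The field names and the simple running-average update rule below are assumptions for illustration, not any platform’s actual schema.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class SensorEvent:
    source: str       # e.g. "accelerometer", "gps", "electrodermal"
    kind: str         # normalized event type, e.g. "steps", "stress"
    value: float
    timestamp: float  # seconds since epoch

@dataclass
class UserProfile:
    # Rolling aggregates per event kind; a real system would keep far
    # richer state (histograms, time-of-day patterns, embeddings).
    totals: dict = field(default_factory=lambda: defaultdict(float))
    counts: dict = field(default_factory=lambda: defaultdict(int))

    def ingest(self, event: SensorEvent) -> None:
        self.totals[event.kind] += event.value
        self.counts[event.kind] += 1

    def mean(self, kind: str) -> float:
        return self.totals[kind] / max(self.counts[kind], 1)

profile = UserProfile()
for e in [SensorEvent("accelerometer", "steps", 412, 0.0),
          SensorEvent("accelerometer", "steps", 388, 60.0),
          SensorEvent("electrodermal", "stress", 0.31, 60.0)]:
    profile.ingest(e)
print(profile.mean("steps"))  # 400.0
```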

A step in the right direction is Samsung Architecture for Multimodal Interactions (SAMI), an open sensor data platform that aggregates health data from Samsung (Simband) and non-Samsung SLDs and gives developers API access to granular sensor data for running analytics. On the heels of this news, Apple is expected to announce HealthBook, a health-tracking platform, next week.

Big Data with Machine Learning

The core IP of your wearable app will be the actionable intelligence derived from big data. Fortunately, you have at your disposal powerful, scalable managed cloud services and open-source tools like Hadoop and Spark to control, configure and run deep analytics on both streaming and historical data, yielding insights into patterns and anomalies. Yet your secret sauce won’t be the tools, but rather your methodology, refined over time.
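
For a flavor of what that looks like in practice, here is a minimal PySpark job that rolls raw readings up into per-user, per-sensor averages. The input path and the one-reading-per-line `user_id,sensor,value` log format are made up for the example.

```python
from pyspark import SparkContext

sc = SparkContext(appName="SensorRollup")

# Made-up input format: one "user_id,sensor,value" line per reading.
lines = sc.textFile("hdfs:///wearables/sensor_logs/*.csv")

def parse(line):
    user_id, sensor, value = line.split(",")
    return ((user_id, sensor), (float(value), 1))

# Sum values and counts per (user, sensor), then divide for the mean.
means = (lines.map(parse)
              .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))
              .mapValues(lambda s: s[0] / s[1]))

means.saveAsTextFile("hdfs:///wearables/sensor_means")
```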

Big data analytics with machine learning requires building predictive and prescriptive models through model training. Using statistical pattern recognition, predictive analytics can discover patterns in historical data that signal what is likely to happen next. Techniques such as neural networks, linear/logistic regression, support vector machines, scorecards, decision trees, clustering, association rules, k-nearest neighbors, and naive Bayes classifiers can be used to build predictive models, including combinations that run multiple models together.
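
As a concrete toy example, training one such model with scikit-learn’s logistic regression might look like the following. The features (hour of day, distance to venue, past acceptance rate) and labels are fabricated; a real model would be trained on the sensor-derived profiles described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fabricated training data: each row is (hour_of_day, meters_to_venue,
# past_acceptance_rate); the label is 1 if the user acted on the prompt.
X = np.array([[12, 40, 0.80], [19, 25, 0.90], [9, 300, 0.20],
              [22, 500, 0.10], [13, 60, 0.70], [18, 35, 0.85]])
y = np.array([1, 1, 0, 0, 1, 1])

model = LogisticRegression().fit(X, y)

# Score a new situation: lunchtime, 50 m away, historically receptive.
print(model.predict_proba([[12, 50, 0.75]])[0, 1])  # P(user accepts)
```

The same handful of lines generalizes: swap in a decision tree or naive Bayes classifier from the list above and compare, which is exactly the multi-model experimentation the methodology calls for.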

As the AI learns over time, the algorithms need to take an iterative approach. Lean analytics starts with a use case, which is then actualized through a methodology: source and integrate; prepare, cleanse and enrich; analyze; visualize patterns; and then deploy into production.
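
Read as code, that loop is simply a composition of stages. A skeletal sketch, with placeholder stage bodies standing in for the real work:

```python
# Skeletal lean-analytics loop; each stage body is a placeholder.

def source(raw):        return raw                       # source & integrate
def prepare(data):      return [d for d in data if d]    # cleanse & enrich
def analyze(data):      return {"pattern": len(data)}    # fit / score models
def visualize(insight): print(insight); return insight   # inspect patterns
def deploy(insight):    return insight                   # push to production

def iteration(raw):
    return deploy(visualize(analyze(prepare(source(raw)))))

# Each pass through the loop refines the models with fresh data.
iteration(["reading-1", None, "reading-2"])
```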

Your users will never see this complexity. What they will experience is the benefit from the sophistication.

In summary, UX is about to undergo a pivotal shift, away from the user-demand model of the Internet and mobile apps and toward low-viscosity app interactions powered by real-time data models. The net effect for users will be that their wearable user experience will be…

  • Contextual
  • Smart
  • Relevant
  • Helpful/actionable
  • Timely
  • Specific
  • Simple
  • Efficient
  • Welcomed

Originally published on Wired on May 29, 2014. Author: Scott Amyx.