The Matrix: A Matter of Wear and When



Shots Magazine October 2014 Issue

Scott Amyx of Amyx+ Interviewed by Adrian Pennington

In his 2013 novel The Circle, author Dave Eggers depicts a not-too-distant future in which a tech giant monopolises computing, from search to social networks, and accesses every digital beat of millions of people’s lives. ‘The Circle’ believes it is a democratising force unlocking creativity and freedom of choice, but when personal lives are broadcast online Truman Show-style, when private actions are policed by public forums, and where no record is erased, its totalitarian surveillance is unambiguously Orwellian.

Eggers’ book touches on a question much debated within Silicon Valley: who owns our personal data and what should be done with it, as more and more is collected from the micro-computers that are set to pervade our lives? In Eggers’ scenario, too much data in the wrong hands closes off choice and harms our wellbeing. The barb is aimed at Amazon, Google, Facebook and others, who contend that data can be farmed, curated and then targeted to benefit the individual rather than the corporation.

That sense of life enhancement from product is of course the same trick that advertising tries to pull off, and it’s why there’s considerable buzz around wearables, even if the jury is out on exactly what their communications potential is. “Wearables are still in digital snake-oil territory,” says Aaron Martin, head of strategic services at digital communications agency Collective London. “They have massive potential but brands are in the phase of doing things for the sake of it, rather than for the benefit of consumers. We’re asking consumers to adapt their behaviour to technology rather than respecting human behaviour and getting technology to adapt.”

Wearables are different to previous mobile technologies because they give us access to information about our physical bodies and the physical environment we inhabit in real time. As Evangeline Marzec – now co-founder of a wearable tech startup and until recently the mobile strategy specialist at Deloitte Digital – observed in her blog, wearables’ primary purpose “is to support immediate, real-world actions by providing relevant, contextual information precisely at the point of decision-making.” An example: data-driven performance analysis in pro sports kicked off with sabermetrics, in which statistical data on baseball players’ performance is gathered and analysed – famously used by Billy Beane, general manager of baseball team the Oakland A’s (immortalised in the film Moneyball). The next stage saw the British and Irish Lions rugby team have sensors stitched into players’ jerseys to help managers analyse team performance and make decisions on replacements in real time.

Let your glasses tell you what to eat

It is the contextual awareness offered by light and temperature sensors, magnetometers, gyroscopes, barometers, altimeters and accelerometers, sensors for geolocation, electrodermal (skin) response, security or health biometrics that gives “brands an opportunity to truly integrate into every facet of life and deliver high value interactions,” enthuses Scott Amyx, founder and CEO of wearables digital agency Amyx+.

Brands can insert themselves in infinitesimal, non-invasive ways into our lives. Heartbeat monitoring via wristbands may trigger dietary suggestions to your Google Glass; Microsoft’s Septimu earbuds can monitor wearers’ moods by measuring heart rate, temperature and biorhythms and, together with the app Musical Heart, choose the best type of music to play to them. Ben Jones, chief technology officer at AKQA, classifies wearables as a separate communications channel and says the key is to marry utility and message. AKQA’s work includes designing the fabled immersive Oculus Rift world for Nissan, and it’s the lead digital agency for wristband micropayment device Barclaycard bPay: “We are giving the bank’s customers the chance to simplify their lives in an immediate, relevant and positive manner,” he says.
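For the technically minded, here is a minimal sketch of how that mood-to-music mapping might work in code. It is illustrative only: the thresholds, mood buckets and playlist names are assumptions made for the example, not the actual logic of Septimu or Musical Heart.

```python
# Illustrative sketch: map biometric readings to a music choice, loosely in the
# spirit of the Septimu/Musical Heart pairing described above. Thresholds and
# playlist names are invented for the example.

def classify_mood(heart_rate_bpm, skin_temp_c):
    """Very rough mood bucket derived from two biometric signals."""
    if heart_rate_bpm > 100:
        return "stressed" if skin_temp_c > 37.0 else "energetic"
    if heart_rate_bpm < 60:
        return "relaxed"
    return "neutral"

PLAYLISTS = {
    "stressed": "ambient_calm",
    "energetic": "uptempo_mix",
    "relaxed": "acoustic_evening",
    "neutral": "daily_mix",
}

def pick_playlist(heart_rate_bpm, skin_temp_c):
    """Choose a playlist name for the current mood estimate."""
    return PLAYLISTS[classify_mood(heart_rate_bpm, skin_temp_c)]

print(pick_playlist(heart_rate_bpm=110, skin_temp_c=36.5))  # -> uptempo_mix
```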

Brands that have entered the wearables space are doing so, by and large, for the prestige of association. Media players from ELLE magazine to The New York Times piled into Glassware apps. There’s no harm in doing so, especially if treated as experimentation. CNN, for example, is exploring whether Glass enables reporters to broadcast live from the field and, down the line, whether multiple Glass-captured recordings of an event can be united to provide a new panoramic perspective on what actually happened.

The apparent popularity of devices like the Jawbone UP and Nike+ FuelBand fitness bands, the Galaxy Gear and Pebble smartwatches, and smartglasses from Vuzix, Google and Epson (Moverio) prompts wild speculation as to the market’s worth. Researcher Visiongain estimates the sector is worth $5.24bn this year, while forecasts for 2018 range from Juniper Research’s prediction of a $19bn global market to IHS’s estimate of a whopping $30bn on annual sales of more than 180 million devices. The truth is, this is such a nascent market that no one knows how big it may become.

Nonetheless, any of those stats are remarkable enough for a fledgling consumer electronics category, until you contrast them with smartphones, of which we will buy 2.3 billion worldwide in 2018, according to IDC. Wearable tech is tiny. So why all the attention?

“It feels like [wearables] should be important even if nobody’s figured out the killer app,” says Mark Avnet, dean of digital agency 360i’s education hub. He feels that genuine wearables, like haptic sensors embedded in jewellery or clothing, will provide a more seamless and fruitful set of interactions. “We sweat into them, we customise them,” he says. “There’s a physical intimacy to wearing them that feels real in an age of abstraction. It’s personal, although its potential is not yet realised.” Examples of the more intimate nature of wearables include the T.Jacket, which enables parents to ‘hug’ their kids via mobile devices; the Tactilu bracelet, which responds to and delivers touch remotely; and Fundawear, developed for Durex, via which lovers can phone in their foreplay by controlling their partner’s underwear long distance.

A more cerebral application, the Neurocam by Neurowear offers a wearable camera system that hooks a brainwave detecting headset to an iPhone to identify what the wearer is interested in and automatically records and saves the footage in five-second GIF clips. One implication of this, according to researchers PSFK, could be the creation of ‘highlight reels’ from a day or social event. Another, in a decade or so, could be the cataloguing of entire personal experiences to a ‘memory cloud’.
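A rough sketch of the trigger at the heart of a Neurocam-style system: sample an interest score from a headset, and capture a short clip whenever it crosses a threshold. The 0–100 scale, the cut-off value and the capture call are assumptions for illustration; the real headset SDK is replaced by a random stand-in.

```python
# Illustrative sketch of a Neurocam-style interest trigger. Values are assumed.

import random
import time

INTEREST_THRESHOLD = 60   # assumed cut-off on a 0-100 interest scale
CLIP_SECONDS = 5          # the article mentions five-second clips

def read_interest_score():
    """Stand-in for a headset SDK call; returns a 0-100 interest score."""
    return random.randint(0, 100)

def capture_clip(seconds):
    """Stand-in for the camera; returns the name of the saved clip."""
    return f"clip_{int(time.time())}_{seconds}s.gif"

def check_once():
    """Capture a clip only when the wearer's interest crosses the threshold."""
    score = read_interest_score()
    if score >= INTEREST_THRESHOLD:
        return capture_clip(CLIP_SECONDS)
    return None

print(check_once())  # either a clip filename or None
```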

The focus is already switching from hardware to software – the glue that unites these wearables. It’s one reason why Nike and Lark, which manufactures activity trackers, recently ditched their wearable hardware-making divisions to focus on researching and developing software that other hardware makers can integrate into their own wearables. “Over time we’ll see single-purpose devices such as sensor bands give way to multi-purpose Swiss Army Knife-style devices,” says Amyx. “Hardware will consolidate.”

AKQA created NikeFuel Guides for FuelBands and is now helping the sports brand redesign its API [application programming interface]. “The future is one of integration,” says AKQA’s Jones. “It’s about making sure that wearable devices connect seamlessly.” Promising far more innovation and potential than the individual devices themselves, adds Amyx, “are the value-added services that can be wrapped around them.”

Building the brave new world of the ecosystem

If you Google ‘wearables’ and ‘brand communication’, your search will return the keyword ‘ecosystem’ with a frequency too high to be ignored. Often a vacuous piece of marketing-speak, the word ‘ecosystem’ has come into its own as the best way to sum up the power of data aggregated from a number of wearable sources. “The success of wearables will not hinge on the hardware alone,” writes Julie Ask, Forrester analyst and co-author of the book The Mobile Mind Shift. “Success will hinge on the associated mobile apps and how effective they are in changing behaviour.” Meanwhile consultant Mark Brill, blogging for content agency River, says the growth of the technology “will lead to a unique personal ecosystem consisting of different sets of wearable and connected devices. Today’s branded content is largely thought of as content delivered through screens. In the future we’ll need to think of content not just in terms of viewing but also in terms of hearing, feeling and touching.”

Wearables fit into the massive economic and cultural transformation driven by digital, a trend that BuzzFeed editor-in-chief Ben Smith put succinctly in an interview with The New York Times: “Technology isn’t a section in the newspaper any more. It’s the culture.”

So what does this mean for brands? It’s clear that traditional advertising methods aren’t going to work for wearables. In fact, Google has already vetoed advertising on Glass in its developer guidelines. Carl Panczak, CEO of digital agency Reactive New York, says that for advertising to work on wearables it needs to harness the key elements of context, location and personalisation. Brands wanting to take advantage of these new devices need to figure out a way to integrate themselves as services, become an indispensable part of the wearable experience and thereby build a valuable relationship with the wearer. “Current ads on mobile devices are frankly irrelevant and annoying,” agrees Amyx. “You cannot transplant a piece of creative onto a wearable like a smartwatch by simply making the format smaller. With wearables, context makes all the difference. Wearables can handle something that smartphones cannot, namely a constant interaction between the computer and user that enables ambient intelligence, ubiquitous computing and biometric data tracking. This creates a unique opportunity for brands to participate in a new dimension of brand engagement. As recommendations become more relevant and on target, the premise of advertising takes on an entirely different dimension as users will come to see value in information personalised for them.”

This is not possible with devices in isolation, but it is becoming possible as sensors connect with each other – so-called machine-to-machine communication. When you couple personalised data to environmental data you create a powerful wireless sensor network which can be used “to assemble a holistic picture about the quantified self,” concludes Amyx.
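As a simple illustration of that coupling, the sketch below merges a personal wearable reading with an environmental reading into one timestamped record – the raw material of a quantified-self picture. The field names and values are invented for the example.

```python
# Minimal sketch of fusing personal and environmental sensor data into one
# record. Field names are assumptions made for illustration.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class WearableReading:
    heart_rate_bpm: int
    steps_last_hour: int

@dataclass
class EnvironmentReading:
    air_temp_c: float
    ambient_noise_db: float

def fuse(personal, environment):
    """Combine personal and environmental readings into one timestamped record."""
    record = {"timestamp": datetime.now(timezone.utc).isoformat()}
    record.update(asdict(personal))
    record.update(asdict(environment))
    return record

print(fuse(WearableReading(72, 1200), EnvironmentReading(21.5, 48.0)))
```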

Homo sapiens outnumbered, mobile devices rule

The startling statistic we should be taking note of is not the figure for wearable sales but the unfathomably vast value of the internet of things (IoT), the global network set to unify consumers with inanimate objects via the web. Network manufacturer Cisco charts the IoT as a $19 trillion opportunity for the world’s economies over the next decade, noting that by next year the number of mobile devices on the planet will be greater than the total population. Agencies that aren’t already involved in the heavy lifting of big data will be left behind, notes Amyx, although the volume of data and the collection itself is less important than knowing how to read it and apply it. Statistical pattern recognition and building decision trees are two of many routes to actionable insight. “You need algorithm developers, computer programmers, data scientists, specialists in machine learning who understand how to aggregate the right sources of raw data,” agrees AKQA’s Jones. “The level of insight you can garner changes products and creates new product and experience flows.” An example of machine-to-machine interaction has been explored by Collective London, who turned startup founder Sarah Buggle’s idea for an audio flyer into reality. The BUGGLE app allows the user to listen to live music being played at clubs and bars nearby – in real time – to inform their choice about where to go.
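To show the shape of the decision-tree route mentioned above, here is a hedged sketch using the scikit-learn library: a tiny tree trained on invented wearable readings to label a wearer as rested or fatigued. The features, labels and data are illustrative assumptions, not anyone’s production model.

```python
# Hedged sketch: a decision tree on made-up wearable readings, one of the
# "routes to actionable insight" the article mentions. All data is invented.

from sklearn.tree import DecisionTreeClassifier

# Columns: heart_rate_bpm, steps_last_hour, hours_slept
X = [
    [62, 3500, 8.0],
    [95,  200, 5.5],
    [70, 1200, 7.0],
    [110,  50, 4.0],
]
y = ["rested", "fatigued", "rested", "fatigued"]  # invented labels

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

# Classify a new, unseen reading.
print(model.predict([[88, 400, 6.0]]))  # e.g. ['fatigued']
```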

According to Amyx, app designers can no longer assume that the de facto user experience starts with a user turning on an app and then proceeding through a pre-defined user interface workflow. “Rather than requiring input from users, wearable apps will deliver high-value information in snippets as you go about your day. It will always be on in the background, listening and making sense of your context and activities,” he says. “What excites me about this is real-time brand engagement.”
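A small sketch of that always-on pattern: the app watches a stream of context readings and only surfaces a snippet when a rule fires, rather than waiting to be opened. The context fields and the rules for when to speak up are invented for the example.

```python
# Sketch of a background, context-driven snippet delivery loop. Fields and
# rules are assumptions for illustration.

def snippet_for(context):
    """Return a short, high-value snippet only when the context warrants one."""
    if context["activity"] == "running" and context["hour"] < 9:
        return "Morning run detected: hydration reminder set for 20 minutes."
    if context["location"] == "supermarket" and context["list_items"] > 0:
        return "You are near the bakery aisle; items on your list are stocked here."
    return None  # stay silent most of the time

# Simulated stream of context readings a wearable might observe over a day.
observations = [
    {"location": "home", "activity": "idle", "hour": 6, "list_items": 0},
    {"location": "park", "activity": "running", "hour": 7, "list_items": 0},
    {"location": "supermarket", "activity": "walking", "hour": 18, "list_items": 3},
]

for context in observations:
    message = snippet_for(context)
    if message:
        print(message)
```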

“With geo and vertical location sensors we can pinpoint consumers in retail malls relative to different shops, and within shops to different display areas,” says Amyx. “There’s a tremendous amount of user information we can capture. We can quantify a user’s reaction to brands. Their respiratory rate, facial gestures or even gasps or sighs provide clues to their emotional state.” In return, the consumer receives in-store deals on their smartglasses, or instructional videos on how to use a product. If they express an interest in a product, they could get help in finding the right one via live chat with a virtual store guide. Based on eye-tracking movements and length of gaze at a particular fashion item, product highlights might appear in their field of vision. “At no point did the consumer have to stop to pull out a smartphone and launch an app,” notes Amyx. The information is granular and intimate, as if neurochemicals such as serotonin, adrenaline, dopamine and oxytocin had been measured to assess degrees of happiness. “Brands could know more about an individual than they do themselves,” says Amyx. “Not everybody is self-aware.”
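The eye-tracking scenario boils down to a dwell-time trigger. The sketch below accumulates gaze time per product and pushes a highlight once a threshold is crossed; the two-second threshold, product name and highlight text are assumptions for illustration.

```python
# Illustrative gaze-dwell trigger for in-store product highlights.

from collections import defaultdict

DWELL_THRESHOLD_S = 2.0  # assumed: seconds of gaze before a highlight is shown

class GazeTracker:
    def __init__(self):
        self.dwell = defaultdict(float)  # accumulated gaze seconds per product

    def update(self, product_id, gaze_seconds):
        """Accumulate gaze time; return a highlight the first time the threshold is hit."""
        already_triggered = self.dwell[product_id] >= DWELL_THRESHOLD_S
        self.dwell[product_id] += gaze_seconds
        if not already_triggered and self.dwell[product_id] >= DWELL_THRESHOLD_S:
            return f"Highlight for {product_id}: sizes in stock, nearest fitting room."
        return None

tracker = GazeTracker()
for frame_gaze in [0.8, 0.7, 0.9]:  # simulated per-frame gaze samples (seconds)
    hint = tracker.update("denim_jacket_ss14", frame_gaze)
    if hint:
        print(hint)
```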

Methods of analysing big data range from Monte Carlo simulation – a computerised statistical technique that lets you explore the range of possible outcomes of a decision and assess the impact of risk – to off-the-shelf tools such as Google BigQuery and Apache Hadoop. Consumer electronics vendors are developing their own tools. The Samsung Architecture Multimodal Interactions (SAMI) platform aggregates health data from Samsung (Simband) and non-Samsung sensors to give developers API access to granular sensor data to run analytics.
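As a concrete, if simplified, illustration of Monte Carlo simulation in this context, the sketch below runs thousands of trials of an uncertain campaign decision and reports the spread of outcomes. Every number in it is invented.

```python
# Minimal Monte Carlo sketch: simulate many possible outcomes of an uncertain
# decision and look at the spread of results. Figures are invented.

import random
import statistics

def simulate_campaign_profit():
    """One trial: uncertain reach, response rate and margin per response."""
    reach = random.gauss(100_000, 15_000)          # people reached
    response_rate = random.uniform(0.005, 0.02)    # fraction who respond
    margin_per_response = random.gauss(12.0, 3.0)  # profit per response
    cost = 8_000.0                                 # fixed campaign cost
    return reach * response_rate * margin_per_response - cost

trials = [simulate_campaign_profit() for _ in range(10_000)]
print("mean profit:", round(statistics.mean(trials), 2))
print("5th percentile:", round(sorted(trials)[len(trials) // 20], 2))
print("chance of a loss:", sum(t < 0 for t in trials) / len(trials))
```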

Meanwhile, Amyx+ is collaborating with “one of the world’s largest agencies” to develop a networked approach to employing real-time brand engagement. Obtaining the data presupposes either a mass invasion of privacy or mass consent. The media industry is not alone in grappling with this issue, for which there are few regulations. On the one hand there are fears, bordering on paranoia, about the slippery slope to Orwell’s 1984 (see NSA whistleblower Edward Snowden’s opinion on the right of states to own our private digital activity for supposed national security); on the other, the IoT could be embodied by the type of benevolent, omnipresent OS with which we may fall in love (see the Spike Jonze film Her). “I think this is just the start of something pretty amazing – and scary too,” says AKQA’s Jones. “It has the potential to be Big Brother, knowing my location, my movement, how caffeinated I am. The point is to combine this insight with services the customer finds useful. If, for example, I am too highly caffeinated, perhaps I could use advice on maintaining regular sleep patterns.” We are prepared to give up our data if we receive value in return. “There has to be an exchange of function and features,” says Chris Matlub, founder and director of digital design agency 5K, which devised the VELA app for sailing enthusiasts wearing sports-oriented smart headgear ReconJet. “Maybe I give someone an application for specific use of a smartwatch and in return I collect data about how they interact with the world. It is up to the agency to go beyond the gimmick and understand how they can make it useful to the consumer.”

“The watch-out for agencies and brands is to be respectful of data,” warns Collective’s Martin. “This is more important when we start reading people’s biometrics. One issue is security. Already, smart connected cars have been hacked, causing erratic braking and acceleration. Another issue is respecting unwritten agreements. It’s not unfeasible to think that [a company] might brand a product that monitors your health and, if your blood pressure is high, the brand responds with alerts/advice. That’s fine until your health insurance premium goes up. It’s not the contract you agreed to but it’s what we are sleepwalking into.” Stanford School of Engineering’s MobiSocial Lab is seeking to combat such misuse by giving individuals back control of their personal data. Its communications platform Omlet is an open-standard social network that lets users store and own all their data in a cloud.

Getting connected while still looking cool

Moving from data ethics to design, another facet of wearable tech is how it will develop physically. “The next step is about breaking through the screen,” says Matt Pollitt, co-founder of 5K. “At the moment we all interact with mobile devices via a piece of glass. It has created a digital barrier. Nobody looks up from their mobiles to go face-to-face. Exciting new experiences will happen when we can interact with the connected world unobtrusively by looking and behaving as normal.” Wearable tech is considered a clunky stopgap en route to an ecosystem that truly does hook us up to ‘the Matrix’. Today’s rudimentary plastic bands and self-conscious smartwear will become obsolete, replaced by nanomaterials and nanosensors that can tap our brain activity and are carried in clothing or in accessories, such as handbags. It will be a system that taps our neural network. “Until we can embed technology into our brains, our mobile device is the next best thing we have to sync our connected selves with our wearable technologies,” writes Matt Doherty, associate director of creative and global strategy at OgilvyOne Worldwide.

Brain-machine interfaces have been around for a while, but they are now at a workable level for brand experimentation. At tech expo SXSW this year, 360i devised a game, Think Flatizza, to launch a new flatbread/pizza product from food chain Subway. It used EEG-reading headsets to measure brainwave activity via electrodes on the temple. “Players were asked to focus on images of the Flatizza on a monitor that pitted one person’s mind against another in a virtual tug-of-war, in a bid to win a meal by thinking happy Flatizza thoughts,” explains 360i’s Avnet. “We had people staring at the new product for 10 minutes at a time.” Other examples include NuWave glasses, which help to amplify sound for the hearing impaired by transforming sound waves into vibrations; beauty tech designer Katia Vega has prototyped make-up products (false eyelashes, conductive eyeshadow) incorporating low-voltage circuitry to detect when someone winks and convert the action into a communication with other devices; and Shoreditch design studio This Place has programmed MindRDR, a Google Glass app that translates brain activity readings, collected by the NeuroSky headset, into commands to take photographs via Glass with just a few moments’ concentration, bypassing the need to vocalise a command. This could help ameliorate the slightly uncool element of some wearables. “Social norms drive our behaviour in public spaces, which is why Google Glass wearing jars with what we consider to be socially acceptable,” says Collective’s Martin. “The current generation of wearables is struggling to catch on partly because the products lack intuitive control. Telekinesis is most useful in controlling devices without us having to do anything abnormal.”
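The MindRDR-style interaction reduces to a sustained-concentration trigger. In the hedged sketch below, an attention score must stay above a threshold for several consecutive samples before a stand-in photo command fires; the scale, threshold and sample count are assumptions, not NeuroSky’s or This Place’s actual values.

```python
# Hedged sketch of a sustained-concentration trigger for a camera command.

from collections import deque

ATTENTION_THRESHOLD = 70  # assumed cut-off on a 0-100 attention scale
REQUIRED_SAMPLES = 5      # consecutive samples standing in for "a few moments"

class ConcentrationTrigger:
    def __init__(self):
        self.recent = deque(maxlen=REQUIRED_SAMPLES)

    def feed(self, attention_score):
        """Return True once concentration has been sustained long enough."""
        self.recent.append(attention_score)
        return (len(self.recent) == REQUIRED_SAMPLES
                and min(self.recent) >= ATTENTION_THRESHOLD)

trigger = ConcentrationTrigger()
for score in [40, 75, 78, 82, 90, 95]:  # simulated headset readings
    if trigger.feed(score):
        print("take_photo()")           # stand-in for the Glass camera call
```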

Is telly controlling your mind? Or vice versa

Google is already taking the considerable learning it has collected about user behaviour from Glass and porting it into interactive contact lenses, on which it has taken out a patent. “We can already use emotional analytics of a person’s verbal and non-verbal communication to understand their mood or attitude,” says Jones. “Telepathy is the next stage of that, where we can truly understand thought and instruct things to happen.”

At electronics fest CES this year, TV data analytics specialist Rovi demonstrated how viewers could tune into a channel using just their brain and a blink of the eye. Mind-control telly is but the first application. Just think what might happen if Netflix, say, knew what you were looking for and what reaction you gave to new content. Content could be commissioned or decommissioned on the basis of that accumulated knowledge. Movies might be re-edited according to real-time feedback from trailers. The idea is already being explored by Technicolor. “This is the future of recommendation,” explained Philippe Guillotel, co-leader of Technicolor’s Open Research Group. “We are detecting your emotions from biological signals; it’s the same principle as lie detectors. In ten years there could be sensors on your TV that will propose relevant content to you according to your emotional state.”

Amyx believes that over time we will experience augmented reality [via Glass, contact lenses or some yet-to-be-invented medium] on a daily basis. “As you go about your day, the information received from the environment around you will be overlaid with a digital virtual world. Rather than having to invest in a large billboard, brands can deliver the same set of messages in a more customised manner and within that person’s field of view.” This represents a huge brand opportunity and a power that must be used wisely.