Wearable Technology: How and Why It Works

We are witnessing the advent of entirely new categories of interface mechanisms that bring with them a fundamental paradigm shift in how we view and interact with technology. Recognizing, understanding, and effectively leveraging today’s growing landscape of wearables is becoming increasingly essential to the success of a wide array of businesses.

In recent years, we’ve seen new, disruptive innovations in the world of wearable technology; advances that will potentially transform life, business, and the global economy. Wearables like Google Glass, Apple Watch, Fitbit, Motiv Ring, and Oculus Rift promise to change the way we receive, use, and share data. The wearable revolution is also shifting long-established patterns of how we utilize data in our daily lives and social interactions. It’s a lot more personal.


Let’s look at the ways in which effective interface design will need to adapt (in some ways dramatically) in order to address the new psychology of wearable devices.

The Wearables Explosion

Wearable technology has taken off in a host of directions once considered impossible. The device landscape has come a long way from the earliest calculator watches and the first Bluetooth headsets.

Smart glasses deliver digital interactivity as close as the wearer’s nose. Solos cycling glasses give cyclists speed and fitness information in a simple heads-up display built into wrap-around shades. The highly anticipated Vaunt by Intel promises to blend into the profile of regular spectacles, responding to subtle head-tilt gestures and transmitting only the most essential information to the user, all without the need for a bulky screen.
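Intel hasn’t published how Vaunt’s gesture recognition actually works, but the basic idea of a head-tilt trigger can be sketched from nothing more than an inertial sensor’s pitch reading. The sketch below is purely illustrative: it assumes a CoreMotion-style motion API and made-up threshold and debounce values.

```swift
import CoreMotion
import Foundation

// A rough sketch of detecting a head-tilt gesture from an inertial sensor,
// assuming the glasses expose motion data through a CoreMotion-style API.
// The threshold and debounce values below are made up for illustration.
final class HeadTiltDetector {
    private let motion = CMMotionManager()
    private var lastTrigger = Date.distantPast

    func start(onTilt: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 30.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let attitude = data?.attitude else { return }
            // Pitch is rotation about the ear-to-ear axis: a nod or tilt of the head.
            let tiltedEnough = attitude.pitch > 0.35   // roughly 20 degrees, arbitrary
            let debounced = Date().timeIntervalSince(self.lastTrigger) > 1.0
            if tiltedEnough && debounced {
                self.lastTrigger = Date()
                onTilt()   // e.g., reveal the heads-up display
            }
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```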

There’s even a rise in smart jewelry that delivers high-tech features through the most discreet of accessories. The Motiv Ring tracks fitness activity, heart rate, and sleep patterns in a slim, minimalist ring. Ringly goes a step further and alerts wearers to important notifications such as meetings and phone calls through its flashy gemstone.

With all of these new devices worn on the body, designers will have to consider, from a new perspective, how users are going to interact with them.

A Neuroscientific Approach to Wearable Design

Cognitive neuroscience is a branch of both psychology and neuroscience, overlapping with disciplines such as physiological psychology, cognitive psychology, and neuropsychology. It relies upon theories in cognitive science, coupled with evidence from neuropsychology and computational modeling.

In the context of interface design, a neuroscientific approach is one which takes into account—or more precisely, is centered upon—the way in which users process information.

How people interact with new, never-seen-before technologies is more connected to their cognitive processes than it is to a designer’s ability to create a stunning UI. New, often unpredictable patterns emerge any time a person is presented with a tool, a piece of software, or an action they have never seen before. To orient themselves, people use their cognitive processes as a “fallback” mechanism whenever something novel and unusual presents itself.

When designing UX for wearables, designers need to focus on how the user can best achieve their main goal. Develop a journey for the user by creating mental models and evaluating how best to align the user’s intuitive perception of the product with his or her interaction with the underlying technology.

Consider, for example, Google Glass Mini Games. In these five simple games, made by Google to inspire designers and developers, you can see exactly how mental models play a major role in user engagement with the product. In particular, the anticipation of a future action comes to the user with no learning curve needed. When the active elements of the game pop into view, the user already knows how to react to them, and forms an active representation of the playing environment without needing to actually see one. Not only has the learning curve been reduced to a minimum, but the mental model puts the user in charge of the action immediately, anticipating what the user will do and simply letting them do it.

Bear in mind that it’s possible to identify three different types of images that form in the brain at the time of a user interaction, all of which need to be adequately considered and addressed to achieve an effective and sufficiently intuitive interface. These include:

  • Mental images that represent the present
  • Mental images that represent the past
  • Mental images related to a projected potential future

And don’t worry: it is not necessary to run a full MRI on users to test what is going on in their brains to arrive at these mental images. Rather, simply test the effectiveness and universality of the mental images that have been built.

Designing for Contextual Use with Wearables

When approaching a new technology, it’s vital to understand how users experience and relate to that technology in context. In particular, a reality check is often needed to recognize how people would actually use wearable technology, in spite of how they’re “supposed to” (or expected to) use it. Too many times we’ve seen great wearable products fail because businesses were expecting the users to interact with them in a way that was unnatural. Clearly, not enough user testing was done.

An expensive “UXfail,” the original Google Glass was envisioned as a consumer smart glasses product, but did not catch on as anticipated. The human-computer interaction models that work so well in the world of smartphones don’t apply as easily when the device is worn in front of the eyes, where it stands directly in the way of the most natural face-to-face interactions. Other people become wary of the wearer, isolating the device user socially until they remove it.

Google has since learned from this UX failure to take into account the social context of Google Glass, and has refocused the product on industrial applications. Now meant for specific uses in the workplace rather than the general social sphere, the wearable device has a more natural place.


The new version is sold as an enterprise product to meet specific industry use-cases.

The smartwatch, on the other hand, fits into the more natural behavior people are already carrying out. The wristwatch, which gained popularity in the late nineteenth century, was a wearable evolution of the pocket watch—much like the pocket-to-wrist evolution of the smartphone to smartwatch. New wearable technology builds on the regular habits of its users.

Designers shouldn’t jump on the latest, fanciest technology and build (or, worse, re-shape!) the product for that technology without knowing if it will actually be helpful to, and adopted by, users. This is an easy mistake, and it’s quite eye-opening to see the frequency with which it occurs.

Leveraging Multiple Senses When Designing for Wearables

Wearables bring the great advantage of being far more personal and connected to the user’s physical body than any smartphone or mobile device could ever be. Designers should understand this from the early stages of product development and stop focusing on hand interaction alone.


Vaunt smart glasses by Intel

Take the eyes, for example. Studies conducted with wearable devices in hands-free environments have shown that the paths users follow when their eyes are in charge are different from what designers expect. Rather than behaving in the way that seems logical, people act on instinct, gravitating toward the easier, faster paths to accomplish an action, and those paths are never straight lines.

Take the Vaunt by Intel smart glasses. There is an adage in the world of highly skilled UX designers: “Get out of the building.” If Intel had only tested the product in the lab and not with a variety of users in the real world, it would have become just another major “UXfail.” In point of fact, though, the jury is still out on Vaunt.

And what about our more subtle, cognitive senses? Wearables bring the human part of the equation more fully into account with a deeper emotional connection: Stress, fear, and happiness are all amplified in this environment. Fitness wearables, such as the Fitbit, use sensors to detect not only the wearer’s movements, but also heart rate and stress levels, helping the user understand the fluctuations throughout the day. Medically-minded devices like the Embrace by Empatica monitor sleep patterns, and can alert loved ones of seizure activity, promoting peace of mind for its wearers.
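For designers prototyping around this kind of biometric data, a read of recent heart-rate samples might look like the sketch below. It assumes the wearable syncs its readings into Apple’s HealthKit store; Fitbit’s own data actually sits behind Fitbit’s web API, and “stress level” has no standard HealthKit type.

```swift
import HealthKit

// A minimal sketch of reading recent heart-rate samples, assuming the wearable
// syncs its readings into Apple's HealthKit store. Illustrative only: a real
// app also needs the HealthKit entitlement and usage descriptions.
final class HeartRateReader {
    private let store = HKHealthStore()
    private let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

    /// Fetches the ten most recent heart-rate readings, in beats per minute.
    func latestSamples(completion: @escaping ([Double]) -> Void) {
        store.requestAuthorization(toShare: nil, read: [heartRateType]) { granted, _ in
            guard granted else { return completion([]) }
            let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
            let query = HKSampleQuery(sampleType: self.heartRateType,
                                      predicate: nil,
                                      limit: 10,
                                      sortDescriptors: [newestFirst]) { _, samples, _ in
                let bpmUnit = HKUnit.count().unitDivided(by: .minute())
                let readings = (samples as? [HKQuantitySample])?
                    .map { $0.quantity.doubleValue(for: bpmUnit) } ?? []
                completion(readings)   // most recent readings, newest first
            }
            self.store.execute(query)
        }
    }
}
```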

Designers take heed: Let the cognitive processes of the users lead, and not the other way around.

Voice User Interface (VUI) and Wearables

In the past, designing a Voice User Interface (VUI) was particularly difficult. In addition to the challenges inherent in voice recognition, VUIs are fraught with major interaction hurdles due to their transient and invisible nature.

Unlike visual interfaces, once verbal commands and actions have been communicated to the user, they are gone. One approach employed with moderate success is to give visual output in response to vocal input, such as on a smartwatch (Siri on an Apple Watch, for example). Still, designing the user experience for these types of devices presents the same limitations and challenges as in the past.

When speaking with machines, a human user may miss the natural feedback loop of human-to-human conversation used to establish shared understanding. They will try to give a command or ask for something, hope the machine will understand what they’re saying, and give back valuable information in return. Most current voice AI is not sophisticated enough to build on the communication to reach understanding and may not be able to distinguish a new command from a clarification on a previous one.

Moreover, speech today as a means of human-computer interaction is still exceptionally inefficient. For example, it would take too long to verbally present a menu of choices. Users cannot visually track the structure of the data, and they need to remember the path to their goal. When presenting choices in a visual UI, designers lean on Miller’s Law and usually present a maximum of seven options. That maximum drops significantly when a user is expected to remember a list of options delivered verbally.

For VUI, the challenges are real. A sensible approach is to incorporate support for voice interaction but to limit its use to those places where it is most effective, otherwise augmenting it with interface mechanisms that employ the other senses. VUI will continue to improve and is a great option for visually-impaired accessibility. Accepting verbal input and providing visual feedback are the two most effective ways to incorporate a VUI into the overall user experience.
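As a rough illustration of that last point, here is a minimal SwiftUI sketch of the pattern. The recognize function is a hypothetical stand-in for whatever speech-to-text path a real product would use (watchOS dictation, SFSpeechRecognizer, or a cloud service); what matters is that each spoken command is echoed back on screen, so the transient voice interaction leaves a visible, confirmable trace.

```swift
import SwiftUI

// A minimal sketch of the "accept verbal input, give visual feedback" pattern.
// `recognize` is a stand-in for whatever speech-to-text path a real product
// would use (watchOS dictation, SFSpeechRecognizer, a cloud service).
struct VoiceCommandView: View {
    @State private var heard = ""

    var body: some View {
        VStack(spacing: 8) {
            Text(heard.isEmpty ? "Tap the mic and speak" : "Heard: \(heard)")
                .font(.footnote)
            Button("Speak") {
                recognize { utterance in
                    heard = utterance   // visual confirmation of what the device understood
                }
            }
        }
    }

    // Stubbed speech-to-text hook; a real app would call an actual recognizer here.
    private func recognize(completion: @escaping (String) -> Void) {
        completion("Unlock room 1204")
    }
}
```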

Designing Microinteractions for Wearables

While designing for wearable tech, designers will find themselves in an unusual habitat of spaces and interactions that they’ve probably never been confronted with before—neither have most of their users.

Grids and interaction paths, for example, are awesome for websites and any other setting that requires huge amounts of content to be handled all at once. With wearable devices, though, there is limited space for complicated interactions. In order to deliver the best experience to users, designers should rely on simple gestures and taps, accompanied by immediate feedback.
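On a smartwatch, that can be as small as one tap, one visible state change, and one haptic confirmation. A hypothetical watchOS sketch:

```swift
import SwiftUI
import WatchKit

// A hypothetical watchOS microinteraction: one tap, one visible state change,
// and one haptic confirmation. The "task" being completed is made up.
struct MarkDoneButton: View {
    @State private var done = false

    var body: some View {
        Button {
            done = true
            WKInterfaceDevice.current().play(.success)   // immediate haptic feedback
        } label: {
            Label(done ? "Done" : "Mark done",
                  systemImage: done ? "checkmark.circle.fill" : "circle")
        }
    }
}
```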

The newest Apple Watch Series 3 leverages microinteractions.

A great example of effective microinteraction design can be found in the Starwood Hotels and Resorts app for the Apple Watch. The Starwood application perfectly fits the brand experience by letting guests unlock their hotel room door with a simple tap on the watch.

It isn’t necessary to see the whole process in order to appreciate what this kind of microinteraction can do. The guest is in the hotel and wants to enter their room without fumbling in a bag or pocket looking for the key. The app also demonstrates one of the best practices for wearables: selective, contextual, minimalist design. Realizing it would disrupt the user experience, the designers refrained from piling on more functions or information than necessary. The watch displays only the check-in date, the room number, and an “Unlock” button.
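A screen that restrained can be expressed in just a few lines. The sketch below is hypothetical, not Starwood’s actual implementation: three elements and nothing else.

```swift
import SwiftUI

// A sketch of the restrained, contextual screen described above: only the
// check-in date, the room number, and an unlock action. The names and the
// unlock hook are hypothetical, not Starwood's actual implementation.
struct RoomKeyView: View {
    let checkInDate: String   // e.g. "Checked in Jun 3"
    let roomNumber: String    // e.g. "Room 1204"
    let unlock: () -> Void    // assumed bridge to the hotel's door-lock service

    var body: some View {
        VStack(spacing: 6) {
            Text(checkInDate)
                .font(.footnote)
                .foregroundStyle(.secondary)
            Text(roomNumber)
                .font(.title3)
            Button("Unlock", action: unlock)
        }
    }
}
```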

Designing the Future of Wearable Technology

As the landscape of wearable technology expands and matures, designers will have new opportunities to influence how people interact with the digital world. New technology succeeds best when it fits into or enhances natural human behavior. This is true for every interface platform, not just wearables.

These devices are not meant to be interacted with in the same way as a laptop or a smartphone. Designers must consider how they are worn and how they can most discreetly and efficiently gather and deliver information for the wearer. Some wearables even influence how other people react to their wearers, for better or worse. In many cases, the best wearable devices disappear gracefully into the background.

Wearables move technology from the screen into real-world contexts, presenting new and unique questions for designers to consider as well as challenges to overcome. It’s exciting having an opportunity to help shape the future of this technological revolution!


UNDERSTANDING THE BASICS

What is wearable technology, and what are some examples?

Wearable technology is a field of portable smart devices that are worn on the body. Wearables include devices like smart glasses, such as Google Glass, and smart jewelry, like Ringly.






Original Post: Antonio Autiero, https://www.toptal.com/designers/ui/the-psychology-of-wearables

Author: James Pikover