Brain Food: Augmented Cognition and Adaptive Interfaces

How machines will learn with us and help us make buying decisions
Posted May 4, 2017

Most of us share the opinion that we don’t like being “sold to.” We don’t like the pressure of feeling that once we are in a store, we’ve got to buy something; we don’t like the hovering sales associate. It can feel like an invasion of our personal space. Equally, despite retail’s long history of the “pile ’em high and watch ’em fly” approach to merchandising, we may say we love having lots of choice, but research tells us that, on a brain level, we end up less satisfied with the things we buy when we choose from a large selection.

One of the key benefits of having our digital devices interact with us in the shopping aisle is that they will change the nature of the engagement from hardcore selling to “suggestive assistance.” As our digital devices continue to learn about our buying histories, and algorithms are implemented to help predict future buying behavior, we will increasingly see brand experience strategies created that are focused on a customer’s individual needs. Marketing and merchandising approaches will move to supporting individual shoppers by simplifying buying decisions in a world of infinite choice.

An important aspect in this migration to declutter the shopping aisle will be a search for simplicity. Engaging shoppers in the buying process, while keeping them from merchandise and cognitive overload, will increasingly rely on the support of computers to make choosing simple. Instead of providing more options, we will look to technologies that can cut through the clutter and strip away everything that is unnecessary or simply uninteresting. In the shopping aisle, the heavy lifting of the buying decision process will be provided by banks of computers that will mine past buying activity and follow the paths of our other tendrils entwined within the “digisphere.”

Augmented cognition is yet another step toward the blending of man and machine. AugCog, as researchers call it, sits at the intersection of human-computer interaction, psychology, neuroscience and ergonomics, and aims to create relational human-computer interaction. The basis of this more dynamic mode of interacting with computers is an interface that measures the real-time cognitive state of the user. Depending on the user’s cognitive load, emotional arousal and ability to sustain focused attention, the interface adjusts itself: it sets priorities and then curates the flow of information and energy conveyed to the individual using it.

In the retail world, we have for some time talked about “the endless aisle,” choice beyond the products on the shelf. I’m not sure I want more choice. I just want to be able to make better choices. And, in a world where more is the default, I want help in paring choice down so that making better choices is easier. Augmented cognition suggests that if the amount of information directed at users is putting them into a state of cognitive overload, the computer can reduce the flow of content so that individuals remain effective at their tasks (in other words, making better choices).
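The load-sensitive paring described above can be sketched in a few lines. This is a minimal illustration, not a real AugCog system: the cognitive-load score, the thresholds, and the function name are all hypothetical assumptions for the sake of the example.

```python
# Illustrative sketch of an augmented-cognition loop: the interface
# throttles how many suggestions it shows based on a hypothetical
# real-time cognitive-load reading between 0.0 (idle) and 1.0 (overloaded).

def suggestions_to_show(products, cognitive_load, max_items=12):
    """Pare the assortment down as the shopper's measured load rises."""
    if not 0.0 <= cognitive_load <= 1.0:
        raise ValueError("cognitive_load must be between 0.0 and 1.0")
    # Higher load -> fewer options; always show at least one.
    budget = max(1, round(max_items * (1.0 - cognitive_load)))
    # Assumes products are already ranked by predicted relevance.
    return products[:budget]

ranked = [f"product_{i}" for i in range(1, 21)]
print(len(suggestions_to_show(ranked, 0.25)))  # relaxed shopper: more choice
print(len(suggestions_to_show(ranked, 0.9)))   # overloaded shopper: less
```

In a real deployment the load score would come from biometric or behavioral sensing; here it is simply a number passed in, which is the point of the adaptive-interface idea — the selection logic stays the same while the input changes in real time with the user’s state.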

Big Data is about following the digital trails of shoppers’ online buying behaviors. It combines our digital back-story with other variables, such as demographics, to provide suggestions about yet another item to put into the basket. It is a digital archeological dig that unearths history with the intent of providing a suggestive up-sell and/or predicting future buying behaviors.

Augmented cognition, on the other hand, aims to create adaptive interfaces that change in real-time to meet the needs of users in the moment. It captures data in the now, and modulates the interface/environment to best serve individual users and enhance their effectiveness at performing a task. When we start to consider real-time augmentation of the shopping environment built on the digital life-stream of customers, and in situ data capture of emotional states, we extend the notion of adaptive interfaces to human-machine interaction at the shelf level, and perhaps, more comprehensively, to the entire store.

Remember, the brain both makes and is made by the experience of a place. What if our brain activity and the outward expression of inner neurophysiological states (heart rate, perspiration, breathing pattern and other biometric outputs, along with facial expression and body posture), and the room (as an adaptive interface), could hold an intimate dance, exchanging information and energy, so that the room and the user engaged in the collaborative making of experience? What if our inner world could be translated into data that modified the place around us?

As we become more immersed in a world of digital integration, the innovative technologies being developed will transform the human-computer interaction paradigm. Our computers will become sensitive to the capabilities and limitations of the human side of the man-machine relationship and be enlisted to make it easier to shop. Facial recognition systems and adaptive interfaces are already in use in cars. Dashboard-mounted cameras can sense emotional states and levels of fatigue or distraction, and can play a role in driver safety by triggering collision-avoidance responses.

When it comes to using tools such as facial recognition systems in the shopping aisle, there is the potential to determine the emotional state of the shopper and predict buying behavior. We’re already able to use decoding algorithms to do more than simply tag an image of a shopper’s face to a personal profile. Cameras can capture micro-facial movements and get a sense of how shoppers feel as they cruise through the front door and down the aisle. With this kind of emotional-state data, it is not unreasonable to assume that retailers will create environments that adapt to help in decision-making and present targeted messages that are more relevant as shoppers scan the product assortment.

Future shopping places may evolve to change more than signage or offer in-store real-time promotions to smartphones. As technology is increasingly embedded into our environments, decoding the emotions we express while we shop will become an important factor in the way we interact in our digitally driven culture. When technology interfaces are enabled with the ability to detect physiological and emotional states, the architecture of digitally enhanced environments may morph, changing lighting, colors, environmental graphics, or even the merchandise selections presented to customers. When this happens, we will be providing experiences that are truly unique to customers’ personal preferences, digital life-streams, and real-time neurophysiological reactions to their environment.
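The environment-morphing idea above amounts to a mapping from a detected state to a set of environment settings. The sketch below makes that mapping explicit; the state names, lighting and signage values, and the function itself are illustrative assumptions, not a real retail or affect-recognition API.

```python
# Hypothetical mapping from a detected emotional state to in-store
# environment settings. The states and preset values are illustrative
# assumptions only; a real system would drive actual lighting and
# signage controllers from affect-recognition output.

ENVIRONMENT_PRESETS = {
    "stressed": {"lighting": "warm_dim", "signage": "minimal"},
    "curious":  {"lighting": "bright_neutral", "signage": "detailed"},
    "hurried":  {"lighting": "bright_neutral", "signage": "wayfinding"},
}

def adapt_environment(detected_state):
    """Return the environment preset for a recognized state, or a neutral default."""
    default = {"lighting": "neutral", "signage": "standard"}
    return ENVIRONMENT_PRESETS.get(detected_state, default)

print(adapt_environment("stressed")["signage"])   # minimal
print(adapt_environment("unknown")["lighting"])   # neutral
```

Note the neutral fallback: an adaptive environment should degrade gracefully to a standard presentation whenever the detected state is ambiguous or unrecognized, rather than guess.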

David Kepron is Vice President - Global Design Strategies with Marriott International. His focus is on the creation of compelling customer experiences within a unique group of Marriott brands called the “Lifestyle Collection,” including Autograph, Renaissance and Moxy hotels. As a frequently requested speaker to retailers, hoteliers and design professionals nationally and internationally, David shares his expertise on subjects ranging from consumer behaviors and trends, brain science and buying behavior, store design and visual merchandising as well as creativity and innovation. David is also author of “Retail (r)Evolution: Why Creating Right-Brain Stores will Shape the Future of Shopping in a Digitally Driven World,” published by ST Media Group Intl. and available online from ST Books. @davidkepron; www.retail-r-evolution.com.