It seems like just about everyone on earth is obsessed with the promise of artificial intelligence. Read through any magazine or online publication and you'll find a plethora of fascinating experiments, from pitting AI against nine-year-olds to see who could best master a game of Super Mario Bros. (advantage: machine) to engaging in digital tasks that require curiosity and out-of-the-box thinking (advantage: flesh and blood).
But while these experiments are often illuminating and deeply valuable, most are predicated on the desire to create AI models that are better and more intuitive; few, if any, are interested in using AI to solve the biggest challenge of them all: helping us better explore and understand the ways we behave not in cyberspace but right here on earth, where all of us actually live.
So what can AI teach us about the human mind at work? To answer this profound question, just look inside one of the most unexpected places — the retail store. The insights we can glean there can teach us a lot about how we shop, not just for clothes and shoes but for just about anything we buy.
What do we know about the ways we behave when shopping? Easy! Ask anyone who has ever shuffled up and down the racks of their favorite shops and they'd likely deliver a checklist of truisms. Here's one: Stores place bits and bobs next to the cash registers (nail polish, lotion, snacks, even socks) because shoppers waiting impatiently in line are likely to buy these goods on a whim.
It’s a very convincing explanation. It helps make sense of the layout of just about every store you see. And it’s completely, absolutely and utterly wrong.
Our team at ShopperAI conducted a simple experiment, observing shoppers through the ordinary security cameras most stores already have installed. Our software guaranteed complete privacy (no facial recognition, nothing invasive) while still letting us watch the decisions shoppers were making IRL, as the kids say: in real life. And when we watched shoppers waiting in line to pay, we saw something that ought to have been perfectly obvious: A whopping 92% of shoppers paid no attention to the items placed by the registers, because nearly everyone standing in line pulled out a phone and passed the time surfing the web, leaving no attention for any other products on offer.
Huge conglomerates spend big bucks running very detailed focus groups, and e-commerce has built an empire offering all sorts of data-driven insights into what people want, do and buy. But that torrent of information, we now know, is not nearly enough to shed light on human behavior, a subject as blissfully complicated and sometimes counterintuitive as the complex and fascinating species itself.
Humans aren’t algorithms. We’re much more than mere probability machines. Our decisions are impacted by a whole host of considerations, and conducting field research by watching us in our natural environment, it turns out, is the only way to understand them in full.
This, for example, is why we know that the direction you walk around the store matters. See one product first, and then a second, similar one at a different price point, and it’s not necessarily the cheaper option you’re going to buy — first impressions matter! Or encounter a display of a specific product close to the entrance, and you’ll be much more likely to pick it up, if you’re even vaguely in the market for something like it, when you hit the relevant aisle. This is because priming, or the gentle but evocative invitation to pay close attention to something, actually works.
These may strike you as trifles; how we behave while we're shopping, after all, is hardly the most defining feature of us humans. But apply AI to real-life decision making and you will find some deep but subtle biases that deserve airing.
Recently, to give just one startling example, we had the opportunity to sit down with an executive responsible for marketing razors to women. Focus groups, he told us, indicated that women expected their products to feel feminine, which is why you see so many razors that come in bright pink. We didn't argue: We merely showed him the data we had gleaned from analyzing the behaviors of real women in real stores, most of whom overlooked the pastel-colored alternatives and went right for the more masculine black- and silver-handled razors. Apply AI to real life, and you understand that women want a product that feels like maximum strength, not something that comes across as softer.
It’s the sort of profound insight you get only if and when you use AI to study one of the final frontiers — our physical environments. Here’s hoping more and more businesses can jump on board and train artificial intelligence to help make the real world better for real people.
Sivan Friedman Joseph is the Co-founder and CPO of ShopperAI, a psychologist and an expert in decision-making. She started her career as a marketing strategist at Carasso, one of Israel's leading real estate companies. During her time there, she studied the gaps in consumer understanding left by static research such as surveys and market studies. That compelled her to launch her own marketing agency specializing in customer analysis and personalized marketing funnels. When Joseph met ShopperAI Co-founder Lanor Daniel, they channeled their shared enthusiasm for understanding consumers' buying decisions into establishing Drill, a neuroscience research agency. After their research showed a gap in data insights between online and offline retail analytics, they created ShopperAI with the goal of closing the data gap facing in-store retail.