Digital disruption is taking place in retail more than in any other sector. While customers gain easier access and smarter, friendlier online commerce, the real action is happening behind the scenes. Front-of-house retail starts deep in the backend, with CPG companies compiling large amounts of data. We use data to gain consumer insights, test market reception and decide when and where we’re going to advertise. However, our digital backend is inadequately connected to real-time information from both online and brick-and-mortar retail.
We’ve relied on legacy ERPs to track and manage goods. But these systems, built for long lead cycles and slow, steady feedback loops, simply weren’t designed to influence how items are priced, where they’re located, how they’re advertised and where they ship from at the pace at which consumers now make major (and minor) purchasing decisions with two swipes and a click from their phones.
This deeper and faster method of surfacing, predicting and acting on real-time consumer activity benefits CPG companies that are restructuring for omni commerce. Some quick tools deliver scraped data (i.e., ‘Was my product mentioned on Twitter yesterday?’ or ‘Did I receive a bad review on younametheretailer.com?’). Some consulting companies deliver loosely modeled online category/market share hypotheses based on unverified directional signals. Some DaaS groups help teams extract daily sales attribution through partnerships they’ve forged with their respective online retail partners.
However, these independent data streams are proving ineffective for today’s decision makers. Our consumers are enjoying a seamless shopping experience thanks to the great work of online retailers, social media and even marketplace sellers, but the brands and manufacturers are unable to follow due to the lack of ongoing investment in their base computational technology.
We’re not talking about a few mismatched SQL tables or an SAP system that seems to live outside the confines of item taxonomy. We’re talking a layer deeper. Like the internet connection within your home, if the bowels of the tech stack are not clean and organized, the whole system lags. And when the system lags, the users of the system log off (proverbially speaking).
The Missing Middle Layer
The “middle layer” is often spoken of as an integral part of any process — a good Monte Cristo has a three-part middle layer of ham, turkey and Swiss cheese. Without that unique combination, you’re left with two pieces of battered bread.
So are brands and manufacturers missing a key piece of technological aggregation, normalization and subsequent delivery between two slices of battered bread? It would seem so, as McKinsey reports the CPG space in particular is not measuring up relative to other industries like medical record keeping, finance, education and even (ack!) government systems.
Even the most technologically advanced CPG groups report their top priority heading into the 2023 budget season is to properly allocate time and resources to improve their current systems. Most companies typically rely on multi-million-dollar ERP systems to track and manage goods, but these current systems were not designed to juggle and re-sort/re-organize data points with the click of a button. Their flexibility is capped because it’s too expensive for them to rip out their old base technology (the technology builds that made them billions of dollars in the 1980s and 90s) to accommodate the lightning mechanics needed by today’s teams.
You may know, for example, how many units of a given SKU were sold at a given retailer last month, but what about yesterday? Analysts may also access the number of units sold at a different retailer last month, but that information is siloed by the current delivery protocol. A more efficient approach, which makes the most of the vast amounts of data being generated and shared by each retail partner, is to meticulously normalize those data points back to a single ‘SKU essence,’ however differently each retailer expresses it.
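To make the idea concrete, here is a minimal sketch of mapping retailer-specific item records back to one canonical ‘SKU essence.’ All retailer names, item identifiers and field names below are hypothetical, and a production system would derive the alias table from item taxonomy rather than hard-code it:

```python
# Each retailer expresses the same underlying SKU with its own identifier.
# (retailer, retailer_item_id) -> canonical SKU id; all values are illustrative.
RETAILER_ALIASES = {
    ("retailer_a", "A-99812"): "TOWEL-12PK",
    ("retailer_b", "B77-TWL12PK"): "TOWEL-12PK",
}

def normalize(retailer: str, record: dict) -> dict:
    """Attach the canonical SKU id to a raw retailer sales record."""
    sku = RETAILER_ALIASES.get((retailer, record["item_id"]))
    return {**record, "canonical_sku": sku, "retailer": retailer}

# Yesterday's feeds from two retailers, each in its own vocabulary.
daily_feed = [
    ("retailer_a", {"item_id": "A-99812", "units": 340, "date": "2023-04-01"}),
    ("retailer_b", {"item_id": "B77-TWL12PK", "units": 125, "date": "2023-04-01"}),
]

unified = [normalize(r, rec) for r, rec in daily_feed]

# One question, answered across channels: 340 + 125 = 465 units yesterday.
total = sum(row["units"] for row in unified if row["canonical_sku"] == "TOWEL-12PK")
```

Once every record carries the same canonical key, "units sold yesterday, everywhere" becomes a single lookup instead of a cross-silo reconciliation exercise.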
A completely unified view of one ‘SKU essence’ across all online and offline distribution channels delivered in real time has proven to be a game-changer for CPG legacy groups like Kimberly-Clark, General Mills and Sanofi. Similarly, up-and-coming challenger brands like SKIMS, Yeti and Essentia Water move even faster because they aren’t working off technology systems implemented and maintained over many years.
Understanding the Consumer Journey
That unified view allows the CPG manufacturer to better understand the customer journey. What a customer experiences, what they buy and how often and when, how much they pay, and what they may see featured in the aisles may differ between retailers. And more to the point, what the customer journey is at one retailer may better inform the CPG manufacturer’s decisions vis-à-vis alternative retail outlets.
The consumer journey for a given product at one retailer may be very different from the journey at another. Understanding the ‘collective consumer/shopper journey’ across any and all online and offline selling outlets, with verifiable and accurate information cross-streamed, surfaces alert-centric opportunities through simple predetermined business rules. These simple, quick moves unlock new opportunities, to the tune of outpacing category competition by up to 60% within a single quarter.
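A ‘predetermined business rule’ can be as plain as the sketch below: flag a SKU that is selling at one retailer while showing zero stock at another. The field names, retailer names and threshold are illustrative assumptions, not an actual ruleset:

```python
def out_of_stock_elsewhere(rows: list[dict]) -> list[str]:
    """Alert when a SKU sells at one retailer but shows zero stock at another."""
    alerts = []
    # Retailers where the SKU is actively selling today.
    selling = {r["retailer"] for r in rows if r["units_sold"] > 0}
    for r in rows:
        others = selling - {r["retailer"]}
        if r["in_stock"] == 0 and others:
            alerts.append(
                f"{r['sku']}: out of stock at {r['retailer']} "
                f"while selling at {', '.join(sorted(others))}"
            )
    return alerts

# A unified cross-retailer snapshot for one canonical SKU (hypothetical data).
snapshot = [
    {"sku": "SKU-1", "retailer": "retailer_a", "units_sold": 120, "in_stock": 800},
    {"sku": "SKU-1", "retailer": "retailer_b", "units_sold": 0, "in_stock": 0},
]

alerts = out_of_stock_elsewhere(snapshot)
```

The point is not the sophistication of the rule but its latency: run against a unified real-time view, even a trivial rule like this surfaces a replenishment opportunity the same day instead of next month.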
The consumer/shopper journey feels more erratic and unpredictable than ever. It feels this way because it is this way. In traditional brick-and-mortar assessment, a particular consumer may decide to buy an item from one retailer one week and a different retailer the next – possibly due to proximity to the store at the time, because one store was out of stock or another had a sale.
Data points accumulated and normalized over multiple retail outlets (including online purchasing and alternative delivery methods), followed by the cross-pollination of that data may feel daunting at first. How would a normalization of all these disparate pieces possibly unlock seemingly unrelated opportunities?
The truth is, most business analysts, replenishment managers, advertising coordinators, brand directors and logistics executives have always had an inherent hunch. However, these hunches have never been validated through quick and painless experimentation. By the time all the data points are in the right spot on the pivot table, the next 24 hours’ worth of information has rolled in.
Data that may at first seem unrelated is often quite interrelated, but trustworthy pattern recognition builds through repetition. As human analysts, we’ve not had enough exposure through verified repetition to actually make definitive bets pertaining to our consumers.
Is this even possible?
Many other industries already use simple data aggregation to bring together data from multiple unrelated sources. How do you order an Uber or place a trade on the stock market? How do online March Madness brackets update automatically? How do schools rezone on a regular basis in cities with high population growth? How do you get approved for a mortgage loan?
For manufacturers and brands, the data is already there. It’s been there for a long time. It just hasn’t been organized and feels like a tired tangled mess. Every SKU has a unique data mark that, when organized properly, can unlock a myriad of ‘hidden in plain sight’ opportunities. Moreover, the troves of ‘old’ data sitting in current systems can and should be quickly upcycled to establish early hypotheses, forming broad baseline assumptions off which new data inputs are able to arrange through computational organization.
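Upcycling ‘old’ data into a baseline hypothesis can start very simply. The sketch below, with illustrative numbers and a deliberately naive mean-and-spread baseline, shows historical daily sales forming a broad assumption that new data points are then arranged against:

```python
from statistics import mean, stdev

def baseline(history: list[float]) -> tuple[float, float]:
    """Form a broad baseline assumption (mean and spread) from legacy data."""
    return mean(history), stdev(history)

def deviates(value: float, mu: float, sigma: float, k: float = 2.0) -> bool:
    """Flag a new data point more than k standard deviations from baseline."""
    return abs(value - mu) > k * sigma

# Daily unit sales pulled from old ERP exports (hypothetical figures).
history = [100, 104, 98, 102, 96, 101, 99]
mu, sigma = baseline(history)

spike_flagged = deviates(180, mu, sigma)   # sudden spike: worth a hypothesis
normal_flagged = deviates(103, mu, sigma)  # ordinary day: no action needed
```

Real systems would use richer models, but even this establishes the loop the article describes: legacy data sets the expectation, and each new input either confirms it or surfaces an opportunity.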
Information, when amassed in quantity and parsed out for simple, quick user-generated decisions (think about how your Netflix queue gains relevancy over time), can quickly be used to get ahead of anticipated future constraints. The best defense is a good offense.
Legacy ERP technology uses only a fraction of available data and does not achieve the level of granularity required by today’s up-and-coming decision makers. They know the market is moving fast because they themselves are consuming what they want, the way they want it, every day.
CPG leaders are experiencing their own behavioral shift as they go about their personal lives. The blockers they have experienced, and continue to experience, in their work responsibilities, specifically data retrieval, predictive modeling and one-click action execution, feel so close, yet so far away. Plug in the middle layer, and two pieces of bread become a unique, cohesive and substantial meal, available for online delivery or in-store pickup all day, any day, every day.
Meagan Bowman is Founder and CEO at Stonehenge Technology Labs. Stonehenge’s STOPWATCH solution is a SaaS subscription platform serving brands, manufacturers and their elected agency partners of all shapes and sizes. The channel-agnostic decision science delivered through STOPWATCH’s proprietary middle layer directs online-assisted, augmented and autonomous actions executed by CPG teams to profitably grow their online sales distribution across all click-based retail transactions.