Retailers love new parents. Everyone knows that babies are expensive, but no one knows better than the stores parents buy all their stuff from. Gaining a new parent as a customer is a big windfall for retailers: milk, diapers, cribs, strollers, wet wipes, clothes, vitamins… for a big box store like Target, an expecting family is worth thousands of dollars.
Target had their analytics team sit down and figure out how to tell if a customer might be pregnant, even before any announcement was made. By analyzing purchasing trends, it turned out they could not only assign a level of likelihood that a customer was pregnant, but could even predict a probable due date.
Using that information, the company subtly began mixing prenatal items into the coupon offers sent to those customers, alongside the regular deals.
When the New York Times broke the story, Target took a beating from privacy advocates. But, tellingly, the company kept the program intact. Though numbers were never released, apparently the program just worked too well to abandon it!
Such profit potential is a key part of the reason master’s-educated data scientists are in demand in the retail sector. But there are plenty of applications for data analysis beyond isolating retail goldmines in the form of expecting families:
- Predicting trends and styles of popular products
- Forecasting demand for goods and services
- Identifying likely customers
- Optimizing pricing structures
Hot or Not: How Big Data is Used to Spot the Next Big Trend
Reducing cool to a set of numbers crunched by nerds seems counter-intuitive, but there’s good evidence that it can work. Does this mean Malcolm Gladwell’s tipping point – the critical mass that starts a trend – can be predicted?
Well… Maybe. A 2015 article in the Tech Times described an algorithm that was able to predict, with some 65 percent accuracy, whether or not a song would be a Top 10 hit.
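The article didn't publish the algorithm, but the general approach behind such predictors is a classification model that scores a song's features. Here is a minimal sketch in that spirit; the features, weights, and function name are all invented for illustration, not taken from the study.

```python
import math

# Hypothetical audio features for a track (values are illustrative):
# deviation of tempo from 120 BPM, a danceability score, and how much
# the chorus hook repeats.
def hit_probability(tempo_dev, danceability, hook_repetition,
                    weights=(-0.02, 2.5, 1.2), bias=-2.0):
    """Logistic-regression-style score: estimated chance a song charts Top 10."""
    z = (bias
         + weights[0] * tempo_dev
         + weights[1] * danceability
         + weights[2] * hook_repetition)
    return 1.0 / (1.0 + math.exp(-z))

# A catchy, danceable track scores higher than a meandering one.
catchy = hit_probability(tempo_dev=5, danceability=0.9, hook_repetition=0.8)
meandering = hit_probability(tempo_dev=40, danceability=0.2, hook_repetition=0.1)
```

A real model would learn those weights from thousands of labeled chart histories rather than hand-setting them, but the scoring step looks much like this.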
It’s easy to determine what was cool after the fact, but for retailers, the challenge is finding the hits before they get to market, and stocking up on and pushing those items.
Big players are hard at work on just that problem. Google, for instance, has an in-house “fashion data scientist.” Together with a product manager from Zappos and the founder of Shopintelligence, a retail analytics firm, she presented a panel at South by Southwest in 2016 under the moniker, “The Future of Cool.”
By looking at search trends and geographic data, Google had predicted spring fashion trends for 2015 – accurately.
Shopintelligence and Zappos both use similar approaches toward more practical ends. Shopintelligence claims to provide a double-digit lift in conversion rates and average order value to retail clients that have subscribed to their predictive analytics services.
Crunching Numbers to Get the Right Products in Front of the Right Customers
Creating a terrific product only works if you can identify the people who want to buy it and get it in front of them when they’re ready to make the purchase.
To that end, data scientists in retail are working to collate information about shoppers to identify individual purchasing preferences and make extrapolations to determine what else those customers might want to add to the cart.
For online retailers, this is relatively straightforward. Amazon knows everything a customer browses and buys and can instantly plug that data into an algorithm. Based on the selection patterns of many millions of other consumers and their experiences, Amazon’s proprietary algorithm then recommends other products users will probably enjoy. The algorithm is constantly tweaked and adjusted through the real-time feedback of how users rate the vendor and their experience with the product, as well as what they do or do not choose to buy.
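Amazon's actual algorithm is proprietary, but the core idea behind “customers who bought X also bought Y” recommendations can be sketched with simple co-purchase counting. The product names and order data below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy order history; in practice this would be millions of baskets.
orders = [
    {"diapers", "wipes", "formula"},
    {"diapers", "wipes", "crib"},
    {"wipes", "formula"},
    {"stroller", "crib"},
]

# Count how often each pair of products appears in the same basket.
co_counts = Counter()
for basket in orders:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1

def recommend(product, k=2):
    """Rank items most often co-purchased with the given product."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == product:
            scores[b] += n
        elif b == product:
            scores[a] += n
    return [item for item, _ in scores.most_common(k)]
```

Production systems layer much more on top – ratings feedback, browsing history, real-time adjustment – but item-to-item co-occurrence of this kind is a well-known starting point.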
The Challenge of In-Store Data Collection
Traditional brick and mortar retailers struggle to accumulate this sort of information about their customers. The many loyalty card programs they offer are an attempt to collect information with a similar level of detail.
Retailers take what they can get and make use of more generic information too. For years retailers have monitored how shoppers move through their stores, what paths they take and the sequence in which they typically select items. This more general information is then used to conduct detailed analytics of conversion based on factors like product placement and signage.
Collating that data across thousands of locations helps retailers optimize product placement and advertising to catch the eye of shoppers who fit the right profile. Some products are even removed entirely to reduce clutter and make it easier to find in-demand items.
It has long been common practice to place items with a higher price point in more visible locations, while comparable versions from lesser-known manufacturers at lower prices might take some searching to find. This rather weak attempt at price optimization – far easier for retailers to accomplish online – rests on the principle that shoppers reluctant to buy at a certain price point will take the time to look for an alternative, while those willing to pay the highest price for a top-quality product, perceived or otherwise, simply place the most recognized version in the cart without a second thought. None of these practices are new, and none of them happen by accident.
Data science simply takes an old concept to a whole new level.
With Location-based Advertising, Products Find the Customer
The next big thing in retail signage may come from another application of data science: location-based advertising. Stores are experimenting with NFC (Near Field Communication) transmitters and location-based app signaling to use phones to alert customers when they are close by a deal they might want to take advantage of.
Retailers now routinely identify customers near retail locations, offering them deals via e-coupon and the ability to pay through their phone. In some cases, this has had a major impact on the bottom line. In New York, ice cream chain Van Leeuwen pings customers when they are close to a store; the program has been successful enough to produce five percent of the chain’s revenues over the past few years. Apple’s iBeacon, and a similar product developed by Google called Eddystone, use Bluetooth technology in concert with hardware transmitters – “beacons” – installed in retail locations to push out notifications and special deals when a customer is in immediate proximity to a particular item.
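The trigger logic behind a beacon campaign is conceptually simple: fire an offer only when the shopper's device reports it is close enough to a given transmitter. This is a minimal sketch with invented beacon IDs, offers, and a made-up distance threshold; real iBeacon and Eddystone SDKs report proximity zones derived from Bluetooth signal strength rather than exact distances.

```python
# Hypothetical mapping from beacon locations to promotional offers.
OFFERS = {
    "ice-cream-aisle": "Free topping with any two scoops today!",
    "entrance": "Welcome back! 10% off your first purchase.",
}

def offer_for(beacon_id, distance_m, trigger_radius_m=2.0):
    """Push an offer only when the shopper is within immediate proximity."""
    if distance_m <= trigger_radius_m:
        return OFFERS.get(beacon_id)
    return None
```

The interesting data-science work happens upstream of this check: deciding which offer each profile should see, and at what moment a nudge converts rather than annoys.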
Timing is Everything … The Right Price at the Right Time
As the old saying has it, everything is for sale at the right price. The key for retailers has always been in figuring out what that price is. With big data analytics, that number can not only be determined for the market in general, but it can also be calculated with some level of precision for individual consumers.
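At market level, the arithmetic behind finding “the right price” is straightforward once demand has been estimated: pick the price that maximizes expected revenue. The demand figures below are invented for illustration; real estimates would come from purchase histories and A/B tests.

```python
# Hypothetical estimates of how many customers would buy at each
# candidate price point.
demand_estimates = {  # price -> expected number of buyers
    9.99: 1000,
    14.99: 700,
    19.99: 450,
    24.99: 200,
}

def best_price(demand):
    """Pick the price that maximizes expected revenue (price x buyers)."""
    return max(demand, key=lambda p: p * demand[p])
```

Note that the revenue-maximizing price here is neither the cheapest nor the most expensive option; the middle tier sells enough units to come out ahead. Individualized pricing applies the same logic per customer profile instead of per market.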
This level of price optimization is easy to accomplish on the data front, but much more difficult to square with customer satisfaction.
One of the earliest iterations of price optimization was discovered when customers comparing notes about DVDs they bought through Amazon found that each of them had paid a different price, apparently based on the web browser they were using. Amazon admitted to the price differentials, claiming it was part of a usability test, and offered to refund money to those who had paid more. But Amazon was careful not to raise too much curiosity about the algorithm behind the program.
Price optimization is as old as retailing—hot products command higher prices, while discounts may be offered on items that aren’t moving fast enough. And, of course, there’s nothing new about senior discounts. However, the idea that something more individual than market demand or when you were born is involved in deciding what to charge makes many people uncomfortable.
Opening the Valve on Data-Optimized Pricing
Amazon wasn’t alone in individualizing pricing. Other online retailers, including Staples, Home Depot, Discover, and Orbitz, also vary their pricing based on secret formulas. As revealed in a 2012 Wall Street Journal exposé, the biggest single factor used to simply be location. But with increasingly detailed profiles of individual users available from their browsing and purchase histories, many companies are beginning to tailor their offerings to what they think a particular customer will be willing to pay.
The practice is finding acceptance in some demographics, and data-optimized pricing is making big money for some retailers. Bellevue, Washington’s Valve Software might be the poster child for this approach.
Valve’s Steam platform is an online video game distribution network. Customers log on from anywhere in the world and purchase games to download digitally to their devices.
But the real genius to Steam pricing is that it adjusts regularly to hit customers at different price points. A rabid fan might pay nearly $70 for a big game title the day it comes out—or even pre-purchase it before release. But a more casual gamer might find the game only worth $10, and simply choose not to play if it cost more.
Valve started to exploit this by slowly reducing the cost over time, hoping to hit customers at every price point eventually. But they found something strange when looking at the data: by running a big sale early on, they could increase revenues by a factor of 40. Moreover, games sold in that pattern continued to enjoy higher sales even after the sale was over. The word-of-mouth bump from wider distribution helped to “time-shift” revenues forward from the casual gamers.
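The intuition behind hitting multiple price points can be made concrete with a toy model. The segment sizes and willingness-to-pay figures below are invented, not Valve's numbers; the sketch only shows why a descending price path can out-earn any single price.

```python
# Hypothetical customer segments with different willingness to pay.
segments = [
    {"name": "rabid fans", "size": 1_000, "max_price": 70},
    {"name": "mainstream", "size": 5_000, "max_price": 30},
    {"name": "casual", "size": 20_000, "max_price": 10},
]

def revenue_at(price):
    """Single fixed price: everyone who can afford it buys once."""
    return sum(s["size"] * price for s in segments if s["max_price"] >= price)

def revenue_tiered(prices):
    """Lower the price over time; each segment buys at the first price it can afford."""
    total, remaining = 0, list(segments)
    for price in sorted(prices, reverse=True):
        buyers = [s for s in remaining if s["max_price"] >= price]
        total += sum(s["size"] * price for s in buyers)
        remaining = [s for s in remaining if s not in buyers]
    return total
```

In this toy model, launching at $70 and stepping down through $30 to $10 captures every segment near its maximum willingness to pay, earning several times what any single price would, which is the logic behind Steam's launch-then-sale pricing pattern.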