Data Science and the Influence on Decision-Making – From What You Add to the Cart… To Who You Vote For

You’re shopping on Amazon and heading for checkout when you see the note in your shopping cart: “To qualify for FREE Shipping, add $10.91 of eligible items.” And who doesn’t want FREE shipping? You head back to the homepage to add that Naruto Season 1 box set you’ve been eyeing but didn’t really want to spend any money on. It wipes out the shipping costs and you think: That box set is pretty much free!

… you’ve just been nudged by a data science algorithm.

In 2015, Amazon CEO Jeff Bezos noted in a letter to shareholders that the company offered up some 70 million automated machine-learned nudges each week to sellers alone in its ecosystem, using constantly evolving data analysis to help them:

  • Keep items in stock
  • Rotate hot products to the top of their storefront
  • Adjust prices to be more competitive
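A seller-facing nudge of the kind the letter describes can be pictured as a simple rule over sales and inventory data. The sketch below is purely illustrative: the data fields, threshold, and SKUs are invented, not Amazon's actual system.

```python
# Illustrative restock nudge: flag listings projected to run out of
# stock within a "days of cover" window, based on recent sales pace.

def restock_nudges(listings, min_days_of_cover=7):
    """Return (sku, days_left) pairs for listings that need restocking."""
    nudges = []
    for item in listings:
        daily = item["units_sold_last_30d"] / 30
        days_left = item["stock"] / daily if daily else float("inf")
        if days_left < min_days_of_cover:
            nudges.append((item["sku"], round(days_left, 1)))
    return nudges

listings = [
    {"sku": "BOX-NARUTO-S1", "stock": 12, "units_sold_last_30d": 90},
    {"sku": "MUG-001", "stock": 200, "units_sold_last_30d": 30},
]
print(restock_nudges(listings))  # [('BOX-NARUTO-S1', 4.0)]
```

At 3 units a day, 12 units in stock is only 4 days of cover, so the first listing gets a nudge; real systems would of course use far richer demand models than a 30-day average.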

Amazon is not alone in this sort of activity. Big data is making big changes in human behavior in almost every corner of the world, in almost every field in which it can be applied.

Behavior Modification is Part of the Goal of Big Data Analysis

In some sense, the goal of most applied data science today is to change someone’s behavior in some way or another.

  • Retail data mining feeds marketing campaigns to convince consumers to buy new products.
  • Business intelligence reports sway executive decision-making in corporations.
  • Medical informatics alters the treatment plans developed by doctors and other medical professionals.
  • Traffic and GIS data change the preferred routes of drivers all over the world.

The outcome of all these subtle differences in action as influenced by the impact of data science is hard to measure. But there is little question that behavior modification is happening on a large scale as a result of the new availability of and reliance on data analytics.

In some cases, these changes in behavior are both predictable and intentional. When a data-driven marketing campaign results in more product sales, that is exactly the outcome that was hoped for and planned on.

Sometimes Data-Driven Nudging Creates Controversy

In other cases, however, behavior modification is inadvertent, and the consequences are less clear-cut. When Google’s Waze navigation application began deploying data-driven algorithms to present drivers with congestion-free commuting options that happened to run through residential neighborhoods, it started to turn some of those once-quiet streets into miniature freeways. Although drivers have always sought shortcuts, the data science solution to identifying and distributing them had dramatic and unintended effects on quality of life in the affected neighborhoods.

Even more interesting, as a result of the new driver behaviors, residents in those neighborhoods adopted their own behaviors in response: using the application to create fake reports of crashes, slowdowns, and other obstacles in an attempt to poison the data in the system and prevent it from routing through their streets.

In a more deliberate and benevolent, yet potentially troubling, use of analytics to modify behavior, some colleges are beginning to use the vast amount of data they collect on students to nudge them into making better decisions about their lives and academic pursuits.

At the University of Virginia, for example, students identified by algorithms as being at risk of dropping out may receive customized text messages from the school suggesting resources where they can receive tutoring or other assistance.
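In the abstract, such a system pairs a predicted risk score with a triggered message. The sketch below is a hypothetical illustration of that pattern only: the scores, threshold, and message text are invented, and nothing here reflects the University of Virginia's actual model.

```python
# Hypothetical at-risk nudge: students whose predicted dropout risk
# crosses a threshold receive a supportive, resource-pointing message.

def nudge_messages(students, risk_threshold=0.7):
    """Return (student_id, message) pairs for students above the threshold."""
    messages = []
    for student in students:
        if student["risk_score"] >= risk_threshold:
            messages.append((
                student["id"],
                "Free tutoring and advising are available at the "
                "learning center -- we're here to help.",
            ))
    return messages

students = [
    {"id": "s1", "risk_score": 0.85},
    {"id": "s2", "risk_score": 0.30},
]
print(nudge_messages(students))  # only s1 is nudged
```

Even this toy version surfaces the governance questions raised below: someone chose the threshold, the wording, and the definition of "at risk", and the student sees none of it.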

Significant questions hang over such uses of data science, however: who decides what the “better” decision is, and to what extent are these data-driven suggestions transparent to the people being nudged? Is a carefully constructed choice architecture the same as the freedom to make individual choices?

This aspect of big data-fueled nudging has taken on a more sinister cast recently as it has been applied to politics. The 2012 presidential election saw the Obama and Romney campaigns both apply sophisticated predictive analytics to nudge voters toward supporting their candidate. The techniques were little remarked upon at the time, but they demonstrated the capability of data science to sway voters.

Those capabilities were put to more nefarious use during the 2016 elections, as Russian agents used them to alter the course of the presidential election via social media manipulation. Unlike previous campaigns, there were no checks or balances on the use of false information in the course of that manipulation, and no one to hold accountable. It provided a stark demonstration of the effectiveness of data-driven nudging, even when built on falsified information.

Although Behavioral Modification is Nothing New, Big Data Has Raised the Stakes

Whether we’re talking marketing or political campaigns, nudging by way of suggestion and encouragement to influence the public is nothing new—billboards advocating against drunk driving and warning labels on cigarette packages can be seen as sorts of nudges. Yet today, data science is able to devise statistically more effective, and more individualized, types of nudges and to use them so subtly that people may not even realize they’ve been influenced.

These uses of behavior modification are beginning to encounter challenges from privacy advocates, who argue that such manipulation amounts to regulatory regimes put in place by either government or private entities with no formal debate or rule-making process to govern them.

In something of an ironic turn, however, data scientists have also been investigating the use of nudging to push people toward making smarter choices about privacy and securing their personal information.

Of course, self-nudging can also come from data analysis, and it may be the solution that both vindicates data scientists and protects personal privacy rights. Through fitness and activity trackers such as the Fitbit, or via apps like RescueTime, which monitors how long you spend on the computer and at what activities, individuals can collect and analyze their personal trends and use the information to make better decisions about how to live their own lives.
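The kind of trend analysis such trackers surface can be reduced to a simple baseline comparison. This is a minimal sketch under invented data, not how Fitbit or RescueTime actually computes anything: it flags days whose screen time runs well above your own average.

```python
# Self-nudge sketch: flag days whose logged minutes exceed a tolerance
# multiple of the personal baseline (the mean over the period).
from statistics import mean

def self_nudge(daily_minutes, tolerance=1.25):
    """Return the indices of days that exceed tolerance x personal average."""
    baseline = mean(daily_minutes)
    return [i for i, minutes in enumerate(daily_minutes)
            if minutes > tolerance * baseline]

week = [300, 310, 290, 600, 305, 295, 300]
print(self_nudge(week))  # [3] -- the 600-minute day stands out
```

The key difference from the other nudges in this article is who holds the lever: here the individual sets the tolerance and decides what to do with the flag.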

These may be the best uses of nudging, but data scientists will have to do a lot of work to avoid creating unintended consequences with such services. The revelation in early 2018 that the fitness tracking service Strava had published anonymized user activity on heat maps that inadvertently exposed the locations of secret U.S. military bases worldwide is only the latest in a series of such mishaps.

Data scientists will be called on the carpet for such issues, and pointing fingers at users will not be an acceptable response.
