By Jason Lim
In an article titled "What we buy can be used to predict our politics, race or education ― sometimes with more than 90 percent accuracy," the Washington Post describes a study by University of Chicago Booth School of Business economists Marianne Bertrand and Emir Kamenica showing that consumer behavior patterns can reliably predict whether someone in the U.S. is white.
Funnily enough, whether someone owned a pet was the most reliable predictor of race. It was closely followed by ownership of a flashlight.
Consumer patterns do not only predict race. Other statistical vignettes that were almost as interesting include, "If someone went to Arby's or Applebee's or used Jif peanut butter, you might guess they were conservative. If they didn't own fishing gear or use ranch dressing, but drank alcohol and bought novels? Probably a liberal."
As a statistical analysis, the predictive ability of this study breaks down when applied to individuals. For example, I do not own a pet, but I have three flashlights stashed around the house and love the curly fries from Arby's ― not sure what that makes me. But there are companies out there that do not need academic studies or large-scale polls to predict ― actually know ― how I like to spend money.
Companies like Amazon, Google, Facebook, and Netflix know a lot about us. Viktor Mayer-Schonberger, Professor of Internet Governance at the University of Oxford, warns that predictive analytics based on big data collected around a person's behavior could allow these companies to "influence us not just in our buying decisions but in our political decisions."
This is not new. I am less interested in what the concentration of data in a few select companies can do to influence our buying habits or political leanings. I am more interested in what those companies can do to prevent us from changing our minds and behavior.
We have seen multiple stories about how Facebook, based on a couple's pattern of activity on the site, can predict that the couple will break up even before they have explicitly decided to separate.
Amazon can predict that we will need certain products before we even know that we need them, based on our previous patterns of purchases.
Uber is also crunching its ridership data to predict demand and allocate resources proactively. An Uber executive actually got in trouble several years back for boasting that Uber's analytics can tell whether someone just had a one-night stand or not.
And not just services. Don't forget the Internet of Things (IoT) whereby everyday objects are all being networked to transfer data. Your cell phones, cars, shipping containers, watches, and "things" of all kinds are being connected faster than we can blink.
If you consolidate the data that is being streamed from all the "things" that you use and all the "services" in which you engage, then someone somewhere has a very full understanding of how you live.
This is not what actually concerns me. We constantly make trade-offs between privacy and convenience. In fact, much of our relationship with digital devices and services is based on such trade-offs. If my tire treads are wearing too thin for the downpour forecast for the day after tomorrow, then I welcome my car telling me to change my tires. If Facebook's newsfeed surfaces an article based on my past clicking habits, that is fine as well. If an Amazon drone drops off a new box of diapers before I run out, then I thank God for their predictive analytics.
However, what does intrigue me is the next evolution of predictive analytics: when it becomes preordained analytics. By that I mean that these companies ― by proactively offering services and conveniences based on your past behavior ― actually reinforce that same behavior.
For example, if Amazon drops off a carton of cigarettes every month, then what is the likelihood that this person will ever quit smoking? If Uber automatically sends a pickup service to your house every Saturday night at 10 p.m. to take you to a club, then who is driving whose behavior?
If these corporations ― now with a vested business interest in your continuing that behavior ― actively work to reinforce it, they are no longer predicting your behavior; they are preordaining it. The same goes for your cognitive biases. If Google presents you with search results that cater to your political leanings, then how are you ever going to change your mind about anything?
In other words, we are all "addicted" to certain patterns of behavior. If, for some reason, we want to throw off such an "addiction," we would want to create an environment where we do not have to face the triggering stimuli. This becomes impossible in a world of predictive analytics, in which companies seize every opportunity to expose us to those same triggers.
Jason Lim (jasonlim@msn.com) is a Washington, D.C.-based expert on innovation, leadership and organizational culture.