Behavioral science, which I just mentioned, is usually sold as a modern means of guiding us toward healthier decisions about food, finances, and other areas, nudging us to do right rather than forcing us to. It’s billed as avuncular rather than autocratic, paternalistic instead of despotic.
Even if that’s so, the field’s application is still often fairly creepy, marked by manipulation. Its real noble contribution would be to teach us about the biases we unwittingly possess and the flaws in our thought processes, so that we could analyze and, in time, overcome these failings through better critical thinking. Perhaps we’re only in the Proterozoic eon of the discipline, and that’s what the branch will actually contribute in the long run.
Until that more enlightened age, capitalism all but guarantees that the subject will be abused by enough players hoping to pad their bank accounts through “priming” and other predatory practices. Even if the efficacy of these methods is overstated, there’s still plenty of money to be made on the margins by prodding the more prone among us to purchase or politick in a particular way.
In a wonderfully thought-provoking New York Review of Books piece about Michael Lewis’s book The Undoing Project: A Friendship That Changed Our Minds, philosopher Tamsin Shaw argues convincingly that the “pressures to exploit irrationalities rather than eliminate them are great.” An excerpt:
In 2007, and again in 2008, Kahneman gave a masterclass in “Thinking About Thinking” to, among others, Jeff Bezos (the founder of Amazon), Larry Page (Google), Sergey Brin (Google), Nathan Myhrvold (Microsoft), Sean Parker (Facebook), Elon Musk (SpaceX, Tesla), Evan Williams (Twitter), and Jimmy Wales (Wikipedia). At the 2008 meeting, Richard Thaler also spoke about nudges, and in the clips we can view online he describes choice architectures that guide people toward specific behaviors but that can be reversed with one click if the subject doesn’t like the outcome. In Kahneman’s talk, however, he tells his assembled audience of Silicon Valley entrepreneurs that “priming”—picking a suitable atmosphere—is one of the most important areas of psychological research, a technique that involves offering people cues unconsciously (for instance flashing smiley faces on a screen at a speed that makes them undetectable) in order to influence their mood and behavior. He insists that there are predictable and coherent associations that can be exploited by this sort of priming. If subjects are unaware of this unconscious influence, the freedom to resist it begins to look more theoretical than real.
The Silicon Valley executives clearly saw the commercial potential in these behavioral techniques, since they have now become integral to that sector. When Thaler and Sunstein last updated their nudges.org website in 2011, it contained an interview with John Kenny, of the Institute of Decision Making, in which he says:
You can’t understand the success of digital platforms like Amazon, Facebook, Farmville, Nike Plus, and Groupon if you don’t understand behavioral economic principles…. Behavioral economics will increasingly be providing the behavioral insight that drives digital strategy.
And Jeff Bezos of Amazon, in a letter to shareholders in April 2015, declared that Amazon sellers have a significant business advantage because “through our Selling Coach program, we generate a steady stream of automated machine-learned ‘nudges’ (more than 70 million in a typical week).” It is hard to imagine that these 70 million nudges leave Amazon customers with the full freedom to reverse, after conscious reflection, the direction in which they are being nudged.
Facebook, too, has embraced the behavioral insights described by Kahneman and Thaler, having received wide and unwanted publicity for researching priming. In 2012 its Core Data Science Team, along with researchers at Cornell University and the University of California at San Francisco, experimented with emotional priming on Facebook, without the awareness of the approximately 700,000 users involved, to see whether manipulation of their news feeds would affect the positivity or negativity of their own posts. When this came to light in 2014 it was generally seen as an unacceptable form of psychological manipulation. But Facebook defended the research on the grounds that its users’ consent to their terms of service was sufficient to imply consent to such experiments.•