You know there’s a new nonfiction genre by the titles alone — Blink, Nudge, Predictably Irrational… and now Sway. This book is probably best compared with Dan Ariely’s Predictably Irrational, but to me the Brafman brothers’ book seemed easier to digest — partially because it’s shorter, but also because it doesn’t seem to discuss as many experiments in as excruciating detail as Ariely tended to do.
The thesis is largely the same — we humans think we act rationally in most situations, especially in business or areas of our life that would seem to call for rational thinking (e.g., work). What Ori and Rom Brafman (a businessman and a psychologist, respectively) show instead is what we all know from Ariely and others before him — humans are irrational and will act in irrational ways in many (most?) situations. They call this being “swayed,” hence the book’s title.
Sway opens with a convincing example — the 1977 KLM flight whose pilot made a seemingly irrational decision that cost 583 lives in what remains the deadliest accident in aviation history. The authors argue that because the pilot was so focused on reaching his final destination after being diverted, he was swayed into making a wholly irrational decision that ended in tragedy. How was he swayed, specifically? The book revisits the KLM disaster a few times to flesh out the underlying irrational decisions likely being made by the pilot (I say likely, because the pilot died in the crash as well).
The book is peppered with such examples, such as people who have bid as much as $200 for a $20 bill. Why? Why would anyone pay more than the face value of a $20 bill? Well, the authors have a few answers for you. (The answer to the $200 twenty-dollar bill question is risk aversion coupled with a feeling of commitment to the auction taking place. Since there’s still a small chance one could “win” the auction, some will continue to bid higher and higher, long past the point of rationality.)
Here are the major, salient “sways” mentioned in the book and their descriptions:
- We often overreact to potential losses, focusing more on the short-term consequences than on the longer-term effects.
- The more meaningful a loss is, the more loss averse we become — the harder it is to let go of the losing position (even when it’s economically, emotionally, or otherwise beneficial to do so).
- We hold on to the pervasive pull of commitment. When we are committed to a relationship, decision, or position in our lives, it can be very difficult for us to see the better, healthier alternatives available.
- Humans have a tendency to imbue someone or something with certain qualities based on its perceived value rather than objective data. This is called value attribution.
- If we see something labeled a certain way, we’ll take that label at face value.
- The authors have two amusing examples of value attribution at work — a world-famous violinist is mistaken for a street musician in the subway, and a SoBe energy drink is only as effective at improving your memory as you believe it to be.
- When things are discounted off of their regular price, people tend to give the product or service a reduced value attribution. In other words, when we get a discount on something, we tend to unconsciously value it less than if we had paid full price.
- Humans have a propensity to label people, ideas or things based on our initial opinions of them. The authors term this the “diagnosis bias,” and it includes our inability to reconsider those initial value judgments once we’ve made them.
- Again, the authors bring this sway to life with examples, such as how NBA players’ performance tracks directly with their draft pick number, amongst many others.
- A single word or label can color our entire perception of a person, closing off avenues of shared experience. Once someone is given a label (or, even more directly, a diagnosis), it’s hard to see them in a way that isn’t biased by that label.
- The authors also note that hiring interviews are largely unscientific and, at the end of the day, quite poor at helping managers choose a good employee.
- The “mirror, mirror” effect – we like, and look for, people who are like us.
- We constantly sway others and are constantly being swayed by our expectations and labels — what the authors call the “Chameleon effect.”
- People want and expect fairness in all of their dealings with other people, companies and organizations.
- It is vitally important for people to feel they have a voice. People want to be listened to and heard, even if nothing changes.
- Talking through our reasons for a price or our position in an argument or debate, explaining how we arrived at it, and communicating what we feel is the fair thing to do makes other people feel like we’ve treated them more fairly and reasonably.
- We can approach a task either altruistically or from a self-interested (or pleasure-seeking) perspective, but usually not both at the same time.
- When the two centers of the brain (altruism and pleasure) compete, pleasure usually wins.
- When the pleasure, self-interested perspective is operating, unexpected behavior or effects can occur.
- It’s not that rewards for specific tasks or behavior are bad, it’s the possibility of a reward dangled ahead of time that can potentially result in destructive, unintended effects.
- It’s okay to reward someone after the fact, but don’t always create the possibility of the reward ahead of time. And know that money defeats/negates altruism.
- Groups can have profound effects on our ability to reason rationally.
- Dissent is invaluable – you need a dissenter, even if you don’t agree with the specific dissent itself.
- Dissenters open up discussion and allow others to express their views.
The one place the authors don’t really sway me is their attempt to explain why bipolar disorder is diagnosed so much more often than it was a decade ago. Unmentioned by the authors is the fact that many other mental disorder diagnoses have also experienced a significant increase in their use from a decade ago.
They link the increase to two factors – the modern diagnostic system put into use in 1980 with the publication of the DSM-III, which “broadened” the bipolar diagnosis; and pharmaceutical advertising in the 1990s. Left out of this explanation are some of the reasons proffered by the actual researchers of the study (Moreno et al., 2007).
So what did the researchers who actually penned the bipolar “fortyfold increase” study say? They were far more cautious about suggesting possible causes for the increase in diagnoses. But they did note that many symptoms of bipolar disorder overlap with those of other mental disorders, which could also be, in part, a reason for the increase. For example, in a study conducted in 2001, nearly one-half of bipolar diagnoses made by community clinicians in adolescent inpatients were later reclassified as other mental disorders. Here is what one of the study’s researchers actually said:
“It is likely that this impressive increase reflects a recent tendency to overdiagnose bipolar disorder in young people, a correction of historical under recognition, or a combination of these trends. Clearly, we need to learn more about what criteria physicians in the community are actually using to diagnose bipolar disorder in children and adolescents and how physicians are arriving at decisions concerning clinical management,” said Dr. Olfson.
Sway’s authors’ suggestion that the increase in bipolar diagnoses is related to the modern diagnostic system seems a reach. If the DSM-III caused the fortyfold increase in bipolar diagnoses between 1994 and 2003, why did diagnosis rates stay at the lower 1994 levels for more than 14 years after its 1980 publication, long before the increase occurred?
The authors also link the diagnostic system back to its founder, Emil Kraepelin, and imply that the DSM-III (and its current version, the DSM-IV) have no links to “hard science” (whatever that is). Of course that’s not true – the DSM-IV is nowadays very much based upon empirical data, and Kraepelin’s original categories have largely been discarded in the modern version. Kraepelin’s early-20th-century concept of bipolar disorder encompassed both the modern diagnosis of “major depression” and what we now call “bipolar disorder.” He did not, however, describe bipolar disorder as we know it today, and the authors’ implication that this diagnostic category has remained largely unchanged for nearly a century is just ludicrous.
As for pharmaceutical advertising, that’s likely a stronger link to the increase in diagnoses. Advertising largely works, otherwise companies wouldn’t bother. This too wasn’t a hypothesis of the researchers.
But neither explanation really points to any irrational behavior on anyone’s part. Yes, once a patient is diagnosed by a mental health professional, diagnosis bias kicks in – we tend to view the person only through the filter of their diagnosis (and most other professionals will adhere to the original diagnosis, perpetuating the bias).
What the Brafmans do show is that diagnosis bias can lead to the patient themselves changing their behaviors to also fit the diagnosis. Once people are labeled, they tend to live up (or down) to those labels, or take on the characteristics of the diagnosis. The authors call this the “chameleon effect,” which is a person’s taking on positive or negative traits assigned to them by someone else.
Strategies for Change
The Epilogue of the book offers strategies for people to try and disarm the irrational sways discussed throughout the book. While the advice is well-intended, it is likely too simplistic to really help anyone combat these sways. For instance, “The best strategy for dealing with the distorted thinking that can result from value attribution is to try to observe things for what they are, not just for what they appear to be.” Great advice if you can follow it, but the problem with human behavior is that – in the moment – we often forget such words of wisdom. We are working against a lifetime of learned behavior, so it’s not so easily undone as simply “observing things for the way they really are.”
“Our natural tendency to avoid the pain of loss is most likely to distort our thinking when we place too much importance on short-term goals. When we adopt the long view, on the other hand, immediate potential losses don’t seem as menacing.”
“When things go wrong, we can either apply a short-term, Band-Aid solution or remember that in the grand scheme of things, it’s only a minor misstep. Having a long-term plan – and not casting it aside – is the key to dealing with our fear of loss.”
And I guess that’s the real problem at the end of the day. We can learn everything there is to know about human behavior – and certainly Sway provides a lot of learning – and still be left making irrational decisions in our lives. Having learned about a few more irrational “sways” in life, I do feel better prepared to recognize them when I encounter them in the future. Whether I’ll actually be able to do anything about it when the time comes, who knows?
But I’m hopeful, because I know humans can change their behaviors – even lifelong ones. It just takes practice, patience, and time. And trying not to be so easily swayed.
Moreno, C., Laje, G., Blanco, C., Jiang, H., Schmidt, A.B., & Olfson, M. (2007). National trends in the outpatient diagnosis and treatment of bipolar disorder in youth. Archives of General Psychiatry, 64(9), 1032-1039.