Ungluing from reality
I’m walking down the stairs to the train platform. Over 50 people are lined up behind a single turnstile, waiting to tap their commuter card on the reader. The other turnstiles seem to be broken. No one’s walking through them, and everyone is visibly annoyed. They just want to go home after a long day of work, but there’s a traffic jam because the stupid turnstiles aren’t working and where are all our tax dollars going and why can’t the MTA get its act together and this city is degenerating quickly and…
Suddenly, an older woman breaks away from the pack and taps her card on one of the broken turnstiles. The light turns green and she walks right through, delighted at her discovery. A huge crowd of people notice her unexpected success and peel off from the main queue to form a new line behind the not-even-broken machine. I watch the trailblazing woman make her way down to the train platform.
My mind jumps to a passage from a book I recently finished: Algorithms to Live By. Imagine there are 10 companies bidding on the rights for a tract of land. One of them has data to suggest that the tract is rich with oil. Another’s data is inconclusive. The other 8 companies think the tract is totally barren. Being competitors, these companies don’t share data with one another. They can only observe each other’s behavior.
When the auction begins, the company with the promising data makes a high initial bid. The second company is encouraged by this bid to take an optimistic view of their ambiguous data and bids even higher. The third company has a weak survey, but now doesn’t trust it in light of what they take to be two independent surveys that suggest it’s a gold mine. They make a new highest bid. The fourth company does the same, and so on.
The consensus “unglues from reality” and an “information cascade” forms. Here’s how it works:
You don’t actually get to know the other [people’s] beliefs—only their actions. And it is entirely possible that their behavior is based on your own, just as your behavior is being influenced by theirs. It’s easy to imagine a bunch of people all going over a cliff together because “everyone else” was acting as though it’d all be fine—when in reality each person had qualms, but suppressed them because of the apparent confidence of everyone else in the group. Just as with the tragedy of the commons, this failure is not necessarily the players’ fault. An enormously influential paper by the economists Sushil Bikhchandani, David Hirshleifer, and Ivo Welch has demonstrated that under the right circumstances, a group of agents who are all behaving perfectly rationally and perfectly appropriately can nonetheless fall prey to what is effectively infinite misinformation. This has come to be known as an “information cascade.”
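The mechanism can be made concrete with a toy simulation. What follows is a sketch of the sequential-choice model from the Bikhchandani, Hirshleifer, and Welch paper, simplified so that ties are broken by following one's own signal (which makes every pre-cascade action fully reveal the agent's private signal); the particular numbers, like a 70% signal accuracy, are arbitrary assumptions for illustration:

```python
import random

def simulate_cascade(n_agents=20, p_correct=0.7, true_state=1, seed=0):
    """Sequential-choice cascade, simplified from Bikhchandani,
    Hirshleifer, and Welch: ties are broken by following one's own
    signal, so pre-cascade actions fully reveal private signals.

    Each agent privately observes a noisy signal of the true state
    (+1 or -1, correct with probability p_correct) and sees all
    earlier agents' public actions. Once the net count of inferred
    signals reaches 2 in either direction, it outweighs any single
    private signal: agents herd, and their actions stop carrying
    information. That is the cascade.
    """
    rng = random.Random(seed)
    net = 0                # net signals inferable from past actions
    cascade_at = None      # index of the first herding agent
    actions = []
    for i in range(n_agents):
        signal = true_state if rng.random() < p_correct else -true_state
        if abs(net) >= 2:                    # public evidence dominates
            if cascade_at is None:
                cascade_at = i
            action = 1 if net > 0 else -1    # herd, ignoring the signal
        else:                                # own signal tips the decision
            action = signal
            net += action                    # observers can infer the signal
        actions.append(action)
    return actions, cascade_at
```

With perfectly accurate signals the cascade locks onto the truth after two agents; with noisy signals, a run that happens to open with two wrong signals herds onto the wrong answer and, because herding actions reveal nothing, never recovers, no matter how many later agents privately suspect otherwise.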
Our culture and processes are built on these sorts of cascades. See also: the housing bubble, cryptocurrency prices, Eichmann in Jerusalem. For the most part, we respect our elders and our institutions, and we’re biased towards the status quo. The cost of computing everything from scratch is too damn high.
When you decide to blindly follow others independent of your own information signal, your actions become uninformative to everyone who follows you. I should’ve known better, but I assumed the turnstiles were broken. If the cost of bucking a trend is low and the information you’d gain is (relatively) high, it’s probably worth breaking away from the pack. Most of the time, you end up looking like a fool. Sometimes, you end up making it to your train on time.