The rise of complexity and the fall of decision-making
We have seen a dramatic increase in the amount of complexity that exists in the world. Mickey McManus’s book Trillions noted that as early as 2010, the semiconductor industry had reached the point where it was making more transistors than grains of rice, and more cheaply. Connectivity has amplified the global amount of aggregate complexity by enabling it to break out of any given domain and spread across the world. The rise of the so-called “Internet of Things”—starting with mobile devices and now extending to connected products, vehicles, and platforms—is flooding every corner of our homes, factories, and communities. Everything becomes connected—to everything else and to us.
The global economy has also become inextricably interconnected; our society is more and more interdependent. Across multiple fields, our knowledge gets deeper and more detailed; we solve old problems and create new ones at accelerating speed. No matter our walk of life, today we are asked to grasp a widening range of increasingly complex issues: climate change, energy policy, advances in health care, the likely impact of robotics and Artificial Intelligence.
All these new sources of complexity are increasing the frequency and amplitude of positive and negative feedback loops, turning them into crashing waves and a torrential flood. There are no signs of this complexity leveling out—quite the opposite: the waves are growing larger and more erratic. We are standing on the shore of a trillion-node-network tsunami unlike anything seen before. Worse, this is not just a rise in passive information but a deluge of active machine agents. When trillions of things not only collect billions of bits of information but also demand our attention and change our environments dynamically, on the fly, our ability to think, make decisions, and take action may be on the verge of collapse.
The coming together of digital and physical technologies has turned business models upside down and made it even harder for economic analysis to keep up. The “prosumer” concept of the 1980s is back with a vengeance as new technologies allow households to produce electricity and sell it back into the grid, and give them access to manufacturing power with affordable 3D printers. Economists struggle to explain the collapse in productivity that accompanied the latest surge in innovations—and that shows a compelling inverse correlation with the rise of connected (and cognitive) devices like mobile phones. Their cacophony of explanations ranges from the charge that new digital innovations have no economic value to the claim that they create massive value delivered for free, and hence not recorded in the official statistics.
Stone age tools for cognitive age challenges?
But wait—this is not the first time we have faced a rise in complexity and had to contend with multiple disruptions. We have faced tough challenges before and built structures that allow us to manage and make decisions at vast scales. Corporations, cities, markets, and governments are all technologies we’ve devised to manage complexity and make rational and actionable decisions in a hostile world. Steven Johnson—in his new book Farsighted—points out that we’ve evolved decision and scenario sciences to cope with increasingly complex issues: from the era of Darwin, who used a simple “pro/con” list to decide whether he should get married (a non-trivial decision), to today’s advanced scenario-planning war games, science-fiction foresight tools, and other scalable management techniques.
This time, however, seems different—for a troublingly simple reason. This time we face the rise of powerful new forces that undermine our very ability to react to these challenges and disruptions: our cognition itself is under attack. These toxic new forces leverage digital technology to exploit our behavioral biases, pushed by powerful financial incentives.
The early warning signs
What if the structures we have built to protect us against irrational decisions turn out to be rickety breakwaters, laid down on the shore of a once-placid sea, that provide no protection from a 100-year flood? When the art and science of decision-making itself collapses, might we face a Great Cognitive Depression?
The early warning signs are troubling to say the least. Authoritarian governments and despots are enjoying a resurgence. In many democracies, voters faced with complex issues turn to simple answers and slogans, to the siren call of populism. They dismiss the experts (think of Brexit as a case in point), they look for scapegoats and easy fixes.
Could these be examples of human cognition reverting to evolutionary shortcuts to cope with complex threats? Authority bias is a quick way for us to decide things when we are faced with tough choices. If something is too ambiguous or non-deterministic, we follow the authority figure with the most compelling and simple story instead of doing the thinking ourselves.
Social scientists have documented more than 200 cognitive shortcuts and biases that evolved to help us cope with danger, make decisions quickly, and conserve our precious cognitive resources to fight another day. But sometimes those shortcuts live on far past their “sell by” date. Sometimes our brains lie to us. Buying behavior in our simian ancestors seems oddly similar to the way humans make choices in markets. We believe we are rational actors, but time and again we find out that it is very hard to think about our own thinking. And now it’s getting harder.
Here is where we find a dangerous market failure.
A powerful combination of new technologies and financial incentives is fast overwhelming our old protective barriers.
Digital innovations are creating value. But this value is not given away for free, as some economists contend. There is no free lunch.
We all know that digital platforms are after our data. Sometimes they use it to our advantage, with more personalized offerings; often they sell it to advertisers. For them we are a different kind of “prosumer”: not a producer-consumer but a product-consumer. We are more a commodity than a true customer. You might argue that, well, almost everyone realizes this, and we still enter these transactions of our own free will, so what’s the problem?
But digital platforms are not just after our data—they crave our unwavering attention. Higher ratings command higher advertising rates—and the ratings are determined by how much time we spend with our eyeballs glued to the screen, our attention absorbed by the apps.