X-Events, by John Casti

An “X-Event” is some sudden catastrophe, or the trigger for a catastrophe, that may not be predictable in its particulars but can be foreseen to at least some extent in general terms.

So if you go out drinking every night and drive home drunk, you know there’s a significant risk something terrible is going to happen, even though you can’t predict if it’ll happen on a Friday or on a Saturday, or on Nottingham Road or Vernor Avenue, or will involve running over a kid on a bike or crashing into an oncoming SUV when you’re in the wrong lane. You just know things are set up for disaster.

In X-Events, John Casti identifies numerous areas of modern life that have “uh oh” set-ups like that, where you can see we’re firmly ensconced in a handbasket, even if you can’t make out in detail the Hell we’re headed for.

He puts these cases in terms of “complexity,” a semi-technical term that I only partly grasped. It’s like when scientists talk about “information” and it turns out to be something that subatomic particles and the like can have. I’ve never had better than a vague understanding of that sense of the term, since to me information is in books and lectures and such and makes no sense outside of the context of entities that use and understand language.

But anyway, an X-Event becomes more likely, he says, the more complex a system becomes, or at least the wider the gap grows between the complexity of the system and the complexity of whatever is supposed to control it.

An example he gives is the financial meltdown of 2008, triggered by all those exotic packages of financial products that no one understood. What the banks and investment houses and such were doing in manipulating all these weird transactions involving weird entities was beyond the ability of government regulators to keep up with and police—their activities were more “complex” than those of the regulators. As such, the system was bound sooner or later to spin out of control and collapse.

Aside from whether they’re best described in terms of complexity, certainly there are many ways that we live “closer to the edge” because such ways are more efficient or convenient for us (or, more likely, increase corporate profits) 99% of the time. Unfortunately they also greatly raise the stakes the 1% of the time things don’t go well.

One example—this is not from the book—is the fact that stores typically keep far less in stock than in the past, as it has been found more efficient to keep only what you expect to sell in the short term. That frees up plenty of space, since the less inventory you keep on hand, the less room you need to store it. Now instead of being in your store’s back room, those items you expect to sell next week or next month are on a truck or train en route to you, or they’re still at the factory, or they’re not even manufactured yet.

That’s great if you’re Toys ‘R’ Us, but if you sell food or something essential to life, then if there’s a major disruption to the transportation system—a hurricane, terrorist attack, etc.—your shelves get emptied out mighty quick and then there’s no food available for who knows how long.

Or think about how much our lives are intertwined with computers (and phones and all the rest). That’s enormously beneficial on both an individual and societal level in many respects, but it also makes us far more vulnerable when things go wrong. With everything from your financial records to your photos to your porn digitized and housed in your computer, you’re always vulnerable to losing everything.

True, in the past your house could have burned down and all your paper records and photo albums and such could have been destroyed, but what happens more often: your house burns down or your hard drive crashes? And even if you remember to back things up to an external hard drive or a “cloud” somewhere, those things also fail more often than your house burns down. Not to mention that the more places you disperse your information to, the more likely it is to be stolen.

On the level of a national economy, or the world economy, again having everything computerized like that is almost always great and increases wealth. Except when there’s some glitch that sends the stock market into an inexplicable tailspin.

It’s like we’ve discovered this whole wonderful world above us where people walk around on tightropes, and the weather’s nicer and there’s no pollution, and everyone’s more attractive, and money grows on trees and all that, but, uh, everyone’s walking around on tightropes.

Casti includes chapters on eleven systems that could give rise to X-Events, eleven ways we’re potentially headed for disaster.

He leaves out ones we could do little or nothing about, like some giant asteroid striking the Earth.

That’s fine, but he also leaves out climate change, on the grounds that it’s something that’s already gotten enough publicity and that everyone already knows about. I find that a more dubious decision. I’d think a book like this would be more useful if it identified the most likely or most urgent potential disasters rather than the least known potential disasters.

Few if any of the X-Events he envisions are human-extinction-level catastrophes. They fall somewhere below that and somewhere above Hurricane Katrina-level devastation.

Some of the scenarios he sketches out he admits are quite unlikely, but he regards them as sufficiently possible to be concerned about and take countermeasures against. Others he regards as probable if not certain.

The latter are a little scarier, even when they wouldn’t cause as much havoc as some of the more fanciful ones. He regards it as inevitable, for instance, that the Internet will collapse. Not collapse permanently as in there won’t be an Internet anymore, but he thinks the system is so complex and so overloaded that it’ll break down soon enough and have to be rebuilt and thoroughly reorganized.

I don’t have the background knowledge to assess the specifics of this book, to agree or disagree with the likelihood Casti attributes to certain scenarios. What I can say in more general terms though is that I’m inclined to be at least as pessimistic as the author.

It makes sense to me that as things become more complex and we and our systems become more interdependent, our vulnerability increases (just as the benefits do, as I noted earlier).

Maybe some of these risks are overblown. Maybe it’ll take longer than he thinks for some of these systems to reach the point of collapse. My pessimism isn’t so much about the magnitude or the nearness of the problems as about our solving them.

Probably some or all of the things he writes about—as well as countless other dreary scenarios—are avoidable. Not just avoidable if we happen to get lucky, but avoidable in the sense that we are capable of identifying what needs to be done to avoid them and we are capable of then doing that.

But unfortunately we are not a rational species in that sense. I don’t mean that people are stupid. Or I don’t mean just that; actually an alarming number of people are indeed quite stupid. The problem is we’re collectively more stupid—and more evil—than we are individually.

Consider—to appeal to a much-used example—the insanity of military spending, or of nuclear weapons specifically. If an individual were making a decision for the whole human race, seeking to ascertain whether our present expenditures on nuclear weapons are the best possible allocation of those resources, I dare say fewer than 1% of folks would be stupid or crazy enough to say that they are.

But collectively look where we’ve ended up. When everyone is a cog in a giant system, they make the decisions that seem best to them—whether through a rational assessment of the situation, or more likely because it’s their job or will make them the most money or whatever—and the cumulative, unintended effect is often something outrageous.

We tend to blame our leaders for that, but I don’t know how fair that really is. There are limits to how much anyone, leaders included, can simply will the world to change.

Let’s say the President is a wonderfully intelligent and well-meaning fellow, and he comes to understand from consulting the top experts in the field that the Internet will collapse soon if we don’t take certain drastic preventive actions now (or the power grid will fail, or there will be a worldwide pandemic caused by some new killer virus). So what?

The President, or any individual, might well have sufficient foresight, ability to delay gratification, ability to match means to ends, etc. to take a step backwards now in order to take two steps forward later, or to pay a large cost now in order to avoid having to pay a larger cost later, but does society?

I think the answer for the most part is no. We don’t prevent problems; we (try to) solve them. We’re always responding to the last crisis, not anticipating the next one. Closing the barn door after the horse has bolted is a perpetual human phenomenon.

We’ll make certain (probably inadequate) changes to our health care system after a pandemic exposes its vulnerabilities, not before.

Look how excruciatingly difficult it is to get any meaningful action taken concerning climate change. No matter how much an individual rationally assessing the evidence might come to believe drastic action is justified, as a species we’re not going to do much of anything collectively about it until something dramatic happens that is undeniably tied to climate change, like if rising sea levels inundate a good portion of Manhattan.

It doesn’t help that a sizable proportion of the population is intentionally and maliciously acting so as to make these disasters more likely. It’s not all honest mistakes and stupidity. Corporations and those beholden to them are knowingly lying about climate change because it’s good for their short-term bottom line, just as they knowingly lied about smoking. The Internet isn’t being overloaded just by an increase in legitimate activity as the population grows and economies expand; it’s in the state it’s in due in part to the nefarious behavior of spammers, hackers, etc., and the countermeasures that have to be taken against them.

Bad as we’d be anyway at anticipating and preventing disasters—X-Events—it’s a lot worse when we’ve got corporate propaganda and such pulling us in precisely the wrong direction.

In conclusion, again, I don’t feel qualified to assess the bulk of X-Events. But I suspect it’s close enough to correct to warrant concern about many of the scenarios it contains, and I further suspect that the possible disasters will have to become a lot more obvious to more than just a few scientists or geniuses or whatever before we do much about them, which means we’re in for a lot of avoidable damage.
