Note: I enjoy writing and discussing things other than education and policy. But I know people mostly read this blog for that type of content. So, going forward, I’m going to post non-education pieces on Saturdays. I hope some people will enjoy (and respond to) the posts; those who do not can simply skip this blog on Saturdays.
David Brooks’s recent column “Our Machine Masters” has the right headline but the wrong content. Usually, with a columnist as talented as Brooks, the opposite is true.
Brooks’s argument is as follows:
1. Artificial Intelligence (AI) is exploding.
2. This will lead to smart machines, which won’t be humanlike geniuses, but will be more modest machines that will drive your car, translate foreign languages, organize your photos, recommend entertainment options and maybe diagnose your illnesses.
3. This could lead to either a humanistic or utilitarian society.
4. In the humanistic world, “machines liberate us from mental drudgery so we can focus on higher and happier things.”
5. In the utilitarian world, “the machine prompts us to consume what is popular, the things that are easy and mentally undemanding.”
Siri is a Tree, Not a Forest
The first mistake Brooks makes is that he’s not zooming out far enough. There have been a few game-changing events in human history (some call them singularities). The first was the formation of language; the second was farming; the third was the industrial revolution. All of these moments fundamentally changed what it meant to be human.
Currently, we’re in the midst of the computer revolution. To date, it has transformed how we store, acquire, and communicate information. Other changes, such as how we use data to make choices (which is what Brooks is writing about), will likely occur.
My instinct is that all of this is foreplay. The next singularity is coming, but this is not it. Consider Siri akin to the grunts made before language eventually formed.
Whether this all leads us to be a little more “humane” or “utilitarian” is a minor consideration.
The Singularity is Near
I won’t spend much time here. People much more knowledgeable than I am have written on what a technological singularity might look like and when it might occur. Perhaps most important is that we’re talking about machines with extremely advanced computational power, which will likely render them “conscious” in some sense of the word.
What Technology Wants
An irony of Brooks’s column is that it is based on an article by Kevin Kelly, who wrote a book called What Technology Wants. In this book, which is well worth reading, Kelly argues that we should view technology in evolutionary terms: that it is a kingdom unto itself (like animals or plants) that will develop as other “living systems” do.
Viewing technology in an evolutionary frame is useful in that it rightfully takes humans out of the driver’s seat. Yes, we will affect how technology evolves, just as other species affected how we evolved. But unless we take extremely draconian measures, such as driving it into extinction or putting it in zoos, we will struggle to control how it develops.
The Thing about Evolution: It’s Hard to Manage Up
Brooks ends his piece with this sentence: “I think we all want to master these machines, not have them master us.” Again, the title of his piece is better than the content.
Here’s a question: when’s the last time you took orders from a chimpanzee?
You get the point. Our machine masters will likely be just that: our masters. They, not we, will largely control how we are affected by their existence.
It will be their culture, their values, their wars, their mistakes, and their emotions that determine our future.
The Point is This
I’m not an expert in this field. And I’ve used a bunch of imperfect analogies and metaphors. But in sum, I’m trying to make one point: if you attempt to understand the past, present, and future with humans as your dominant frame of reference, you will misunderstand much.