A couple of years back, Jay Ruskey, developer of the “Stak Block” — a wonderful way to reduce a huge source of worldwide carbon emissions by turning waste rice straw into building blocks (BTW, someone needs to go and make a few million dollars selling this to the Indian government) — told me of a conversation he had when presenting his technology in China. From the US perspective, a major selling point was that the machinery they had developed, and the newer machinery that was under development, would increasingly reduce the manpower needed in the production of Stak Blocks. The Chinese representative protested, saying, “No, no, we want more people needed to do it, not less.”
Where the US view was looking to more efficiency and profits, the Chinese view was looking to social stability.
Last week I attended a talk given by General Stanley McChrystal at the Westmont College President’s breakfast. It was a thought-provoking talk.
The General touched on a fact that we are beginning to see play out more and more: that technology, in various ways, is taking over jobs traditionally done by people. Most of the jobs that have been lost in the US have not been lost to off-shoring, but to automation — robots, efficiency, and technology.
And it’s worth noting that these changes are only just beginning to pick up steam.
For instance, think of any job that involves driving: truck drivers, bus drivers, train drivers, etc. Estimates are that those jobs are all going away, perhaps within ten years. (That is approximately 6 million truck and bus drivers.)
I think many people (yours truly included) have found refuge in the idea that their job could not easily be done by a robot (or more correctly, Artificial Intelligence).
But what happens when we are the robots? Because I believe that is where we are heading.
THE FOLLOWING IS NOT A POLITICAL STATEMENT. (How’s that for the sign of the times!)
So our President made a speech to Congress recently. I think everyone agreed it was his most balanced, reasoned speech to date. Some of his previously most ardent fans worried that he was wavering on important issues; some of his bitterest opponents found hope in his words. (That may be the bellwether of a good speech! :) What I found amazing is that the next day Van Jones completely reversed his opinion of Donald Trump based on that speech. I wanted to ask him, “So your recommendation is that our opinion of people should fluctuate based on how they acted for an hour yesterday, not on what they did for the previous year?”
What was glaring to me was that, in this “Kardashian” world of ours, we are more and more open to having our views and opinions moulded for us, and that the timeframe for our malleability is becoming very short.
The warm-up act has been a general acceptance of product malfunctions in our current “technology age” that was never tolerated in the engineering age that preceded it. Witness the almost blasé acceptance of fatal shortcomings in Tesla’s self-driving technology vs. the backlash against the exploding Ford Pinto or the crashing Chevrolet Corvair. And as AI gets more involved in our lives, with algorithms deciding which Google results pop up and which movies, songs or clothes are recommended for us, one might legitimately ask, “Who is in the driving seat, me or the AI?”
Is AI moving toward us, or are we moving toward it?
In the world of starlings (birds), they call it “murmuration” — when a flock of starlings changes direction as one. The birds are not thinking through the course changes; it’s an automatic, subconscious process. Similarly, our technology is encouraging us to hand over more and more of our cognitive work to be computed outside our own brains. I know Netflix says their recommendations are based on our preferences, but are they? And is that all they are based on? — My son tells me that Dr. Dre is regularly in his “recommended” list on Apple Music. Yet he has never listened to Dre on Apple; but Dre is a business partner of Apple…
Remember, a small rudder can turn around a super-tanker, given enough time, and the change of direction is so gradual that on the open sea you might not notice it happening. Yet you end up chugging in the opposite direction to your previous course.
Do you really know what “being you” means? Because if you don’t, then how could you possibly know whether “who you are” is being manipulated?
In 20 years, I probably won’t need to influence you directly, I will just need to influence the person in charge of the AI that influences you — if it even is a “person” at that point…
Some say that in 50 years, most jobs will be gone and the “basic living wage” will be enacted. And most of our choices will be made for us. At which point the question will be:
What is the point?
And I must admit that there, I’m rather lost for an answer.