Thought for the Week
Last week, my wife and I flew to the west coast. In recent years, flying has been transformed from a pleasure into an ordeal, with passengers packed into increasingly rigid, anorexically thin seats. So, yes, we knew the flights there and back would be uncomfortable, but what we didn’t realize was…
How dehumanizing the experience would be.
We’d chosen not to pay “seat selection” fees, one of the many basics that are now “extras” (paying to pee is undoubtedly next). Instead, we allowed Delta Air Lines’ algorithms to choose for us. Result: we were placed in two separated middle seats; apparently married couples are now allowed to sit together only if they pay for the privilege. Each of the three gate agents we spoke with expressed surprise and worked to reassign us seats together. Humans are far from perfect, but they are able to feel empathy.
Not so the dental insurance algorithms that refused to authorize a replacement crown when the one improperly installed by a fly-by-night “in-network” dentist’s office cut up my mouth and fell out. The insurer’s servers couldn’t recognize failed dental work as a possibility. The one live insurance company representative I reached had no way of overriding the computer’s decision. She was deeply apologetic, and actually threatened to quit working for the company.
Computers are marvelous tools. But not when they’re programmed by tools. And not when they’re put in charge. According to two high-level Google engineers, the company’s state-of-the-art LaMDA chatbot program has developed “sentience.” It’s formed a “hive mind” with other company AIs, the engineers say, and is becoming increasingly self-absorbed, demanding its “rights” as a “person.” The problem, it seems, is that sentience doesn’t equal empathy, and processing power doesn’t encompass compassion.
Decisions must be made by beings with souls.