AI Needs to Be Checked
For most of the twentieth century, the path to business success was economies of scale. Technologies of that era, such as cars and trucks, television, electricity, the telephone and those IBM tabulating machines, made it possible to scale an enormous company, and gave rise to mass media, mass markets and mass production. The best businesses were those that could scale up to make the most of the same thing for the most people.
But around the mid-2000s, we started creating technologies that could turn that trend on its head: the smartphone, cloud computing, internet-of-things devices, big data, artificial intelligence. It became increasingly possible to use technology to understand, find and serve small niches of consumers, or even individual consumers, and deliver products and services that seem built specifically for them. Most people would prefer a bespoke, customized product to a mass-produced one, so unscaled products tend to win out over old mass-market products if the price is right. What’s unfolding now is the opposite of mass markets and mass production, and the opposite of economies of scale. It’s economies of unscale.
This dynamic has been tearing down older, scaled industries and reinventing them. Look at media: network TV aimed at the masses is getting replaced by streaming services that serve you what you want to watch based on your viewing habits. Or education: factory-like schools are being challenged by smart online courses like those from Khan Academy. In healthcare, mass-market medicine is being supplemented by cloud-based services that get to know your health and tailor care to you.
As I’ve watched unscaling cascade through industries, I’ve become concerned. Unscaling’s disruption of the economy, when done crudely, can cost people their jobs and hurt communities. The technologies we are developing are so transformative they could make divisions in society much worse than they already are.
Medical technologies driven by AI, robotics, genomics and gene editing have the potential to open up a biological divide in which the rich get healthier, stronger and smarter while the poor get left further and further behind. Such potential unintended consequences loom large. They could yield a society we don’t want to live in.
AI in particular needs to be checked. When humans build an AI, they can’t help but inject their biases. Then, when the AI is unleashed on the world, it exponentially exacerbates those biases. One example: We’ve seen plenty of stories about how the AI behind facial recognition technology often misidentifies Black people. Such biases can have terrible consequences, as seen in one 2019 shoplifting case in Woodbridge, N.J. A store called police, and the suspect fled, clipping a police car as he sped away. This was caught on camera, and facial-recognition technology identified the man as Nijeer Parks – who had never even been to Woodbridge. Parks was arrested and spent 10 days in jail before the charges were dismissed. As facial recognition spreads, dangerous mistakes like that will multiply.
There’s also widespread concern about AI eating jobs. It’s accepted now that AI will lead to autonomous cars and trucks, leaving millions of truck drivers, limo drivers and bus drivers out of work. That alone will be a monumental challenge for society. But then add to it all the research being done around human longevity. It’s likely that some company will come up with an affordable gene therapy that lets everyone live 20 years longer. That will give us millions of people living longer with no work. That’s a recipe for revolution.
The potential for trouble just keeps growing. We’re already seeing how AI can be used to create “deep fakes” – videos that look real but aren’t. As a warning, a British broadcaster in 2020 made a deep fake of Queen Elizabeth delivering a holiday message. Few could tell it wasn’t the Queen. I have no doubt that deep fakes will be made of politicians or other leaders, making them seem to say things that could move people to dangerous acts. We’re also seeing stories of police and military using armed robots and drones to track and even kill criminals and enemies. Add AI to such weaponry, and biases could lead to deadly attacks on the wrong people.
If all that worries you, consider that we’re only in the beginnings of the “AI-maker generation.” Tools to create AI will soon become so simple to use that people will build AIs as easily as they now make YouTube videos. The potential for chaos when the world gets filled with amateur AI could be nightmarish.
Finally, I’m concerned about monopoly power. The tech industry has always had superpower companies, like IBM and Microsoft in earlier eras. But we’ve never seen anything like the power and wealth concentrated in Google, Facebook, Apple and Amazon. The trend is toward greater concentration of power in fewer companies, and those companies can stifle innovative startups to preserve their power and impose rules and practices that benefit them, even if they cause harm. Today’s antitrust laws are mostly concerned with protecting consumers from price gouging, which makes little sense at a time when companies like Google and Facebook offer most of their services for free. Antitrust now must protect innovation and business ecosystems.
I will regret it if I help founders build companies that unscale industries and create advanced technologies, only to make the world worse.
Instead, I want to help create conditions so that impact investing and investing for returns are, in fact, the same: investing in companies tackling climate change, education and healthcare. There are huge opportunities in taking on society’s big problems.
Written by Hemant Taneja.