Screwed by an Algorithm: How Does One Hold An Algorithm Accountable?
I’ve become aware of an increase in the use of the word “algorithm” – particularly in the context of computers – over the last decade. Although I’m not a digital native, I do live in Silicon Valley’s front yard so, despite my lack of tech-savviness, I can’t help mixing it up with techies wherever I go.
However, I haven’t had a personal relationship with any algorithm – one of these human constructs that seem to be infiltrating our lives and our work in more and more meaningful ways. As I understand it, these formulaic sets of instructions are making more and more of our decisions for us – decisions we are largely unaware are being made at all.
Algorithms show us the movies we may want to see, the books we might want to read, the food we might want to eat, etc. They narrow the field of choices for us by learning what we have warmed up to in the past and giving us choices that conform to our tastes.
For instance, if we are progressive or liberal, they feed us political and philosophical content that leans the same way; likewise if we are more conservative. Some say these algorithms may be driving the widening political divide in the world, since each side sees only what it already has a liking for.
I had felt pretty unaffected by these ubiquitous mathematical instructions, despite warnings about how they might be taking over the world – until recently, when I had a puzzling experience.
My IT guy was changing my email to Gmail. Within a day of having a new provider, one of my YouTube channels disappeared, taking my two most recent videos with it. One of these videos was my most popular – a four-minute segment from an interview I had done a couple of years ago.
Since Google owns YouTube, it was difficult to think of this as pure coincidence. My IT guy asked Google why it had happened, and they could not provide any explanation. They suggested we contact the YouTube support folks, which we did, with the same lack of any meaningful explanation. We were left with no answer as to why or how this had happened.
I contacted the videographer who created the videos on the channel YouTube took down and asked her to add them to one of my other channels, which she did after having to repeat some post-production work that had been lost.
So while I felt some comfort in knowing the videos that YouTube took down were not lost, I was left with this niggling feeling of having been wronged. Refusing to go into victim mode over this wrong-doing, I talked with an attorney friend who referred me to his college roommate.
By coincidence, it turned out this attorney had represented a client whose channel had also been taken down, and he had taken YouTube to court on her behalf. The case went all the way to the California Court of Appeal, where he lost. Apparently the user agreement allows YouTube to take down any channel for any reason.
I suppose this makes a certain amount of sense, given how people can abuse the system to promote terrorism and other socially unacceptable content, and we are still in the midst of sorting out how social media and technology square with freedom of expression. But my two little videos had nothing to do with terrorism, exploitation or any other controversial subject.
While this didn’t sit right with me, the attorney convinced me that I had no chance of getting my channel back, or of any compensation for the loss.
Another piece of the puzzle was that, as near as I can tell, the decision to take down my channel wasn’t made by any person at Google or YouTube. It was made by an algorithm!
It took some time to reconcile this, since for most of my life I have connected wrong-doing with a wrong-doer. But I finally realized that no human being had intended to cause me harm, even though I had indeed been harmed, both psychologically and financially.
So the questions become: How does one hold an algorithm accountable? What is the legal status of an algorithm? Since the U.S. Supreme Court has decided that corporations are people, why not algorithms?
I suppose there are new questions to be answered as humanity evolves from the Age of Enlightenment into the “Age of Entanglement,” where even what it means to be intelligent and conscious are being redefined.
Written by John Renesch.