What about ‘my’ responsibilities?

September 24, 2018

The tech industry is growing at an exponential rate, influencing society to the point that we are seeing perhaps the biggest shift mankind has ever experienced. Some tech services have billions of users. You read that right: not thousands, not millions, but BILLIONS of human beings using them regularly. It would be arrogant to deny that these services are forming our society and shaping our norms, while their only objective is to keep the growth curve… growing.

This has resulted in an increased public focus on the ethics and responsibilities these companies have towards their users and society. There’s demand for governments to regulate, growing concern about user privacy, and unease within the companies themselves about the ethical choices they make on a daily basis. It seems there’s one thing we can all agree on: there is reason for concern.

But what about me?

One of the things I’m currently working on is taking personal responsibility for my actions and the choices I make. It’s something I haven’t always been great at, and I still - especially in private situations - struggle to take responsibility and stand up for my own opinion and voice. Too often I fall back on phrases like “he/she made me do it” or even “I didn’t know/I don’t know” as a defense.

This got me thinking about our relationship with technology. Sure, tech companies have things to sort out, but what about us - the users? Don’t we have a responsibility to use tech in a more mindful way? Just because you can, does that mean you should?

(Please excuse the extreme examples I use, but I think they are needed to make my point.)

When there’s a drunk-driving accident, nobody blames the liquor company. Nobody blames the bar. It’s the individual who should have acted more responsibly. The blame for a school shooting doesn’t lie with the gun manufacturer; it belongs to the shooter. In the US, states are even held responsible for not having stricter laws, rather than the manufacturers. In Europe, where guns are strictly regulated, the story is the same: it’s the individual who is responsible. It’s the user’s fault.

Technology isn’t bad. If you know what you want in life, technology can help you get it. But if you don’t know what you want in life, it will be all too easy for technology to shape your aims for you and take control of your life. Especially as technology gets better at understanding humans, you might increasingly find yourself serving it, instead of it serving you.
YUVAL NOAH HARARI ON WHAT THE YEAR 2050 HAS IN STORE FOR HUMANKIND

I think this is key because, frankly speaking, most of us have no idea what we want.

Most people hardly know themselves, and when they try to “listen to themselves” they easily become prey to external manipulations. The voice we hear inside our heads was never trustworthy, because it always reflected state propaganda, ideological brainwashing and commercial advertisement, not to mention biochemical bugs.

As biotechnology and machine learning improve, it will become easier to manipulate people’s deepest emotions and desires, and it will become more dangerous than ever to just follow your heart.
YUVAL NOAH HARARI ON WHAT THE YEAR 2050 HAS IN STORE FOR HUMANKIND

As AI evolves and corporations gather even more information about their users (i.e. YOU), the possibilities for influencing the choices you make will only grow. It’s no longer just your friends, family, and peers who influence you - it’s also Amazon, Apple, Google, and Baidu.

Personal responsibility

So what to do? Should I just wait for Google to tap into my brain and start making all my choices for me? Nope. I make that choice. You see, just like my aim for greater personal responsibility, you have a choice when it comes to the influence tech has on you.

That choice only works if you are aware that every photo you upload to Facebook gives them more leverage over you. Every website you visit that tracks you (including this one - yeah, sorry ’bout that) gives them a better idea of who you are. Every Google Maps search you make gives them input on where you are. Every interaction has a consequence.

I think regulation is needed to rebalance the provider/user relationship. I fully support the employees who are raising their voices on behalf of users - that’s a great first step. But I also think that we, the creators of these tools, need to take responsibility for what we put into the (cyber) world. Because the online world isn’t just online anymore - it’s increasingly forming, shaping, and influencing our offline world too. When was the last time you were completely disconnected?

Of course, you might be perfectly happy ceding all authority to the algorithms and trusting them to decide things for you and for the rest of the world. If so, just relax and enjoy the ride. You don’t need to do anything about it. The algorithms will take care of everything. If, however, you want to retain some control of your personal existence and of the future of life, you have to run faster than the algorithms, faster than Amazon and the government, and get to know yourself before they do. To run fast, don’t take much luggage with you. Leave all your illusions behind. They are very heavy.

Why Can Everyone Spot Fake News But YouTube, Facebook And Google?

The companies ask that we take them at their word: We’re trying, but this is hard — we can’t fix this overnight. OK, we get it. But if the tech giants aren’t finding the same misinformation that observers armed with nothing more sophisticated than access to a search bar are in the aftermath of these events, there’s really only one explanation for it: If they can’t see it, they aren’t truly looking.

How hard would it be, for example, to have a team in place reserved exclusively for large-scale breaking news events to do what outside observers have been doing: scan and monitor for clearly misleading conspiratorial content inside its top searches and trending modules?

It’s not a foolproof solution. But it’s something.
WHY CAN EVERYONE SPOT FAKE NEWS BUT YOUTUBE, FACEBOOK AND GOOGLE?

I should reiterate that tech companies need to address this problem more aggressively than they are. A business model that is increasingly dependent on fake news and fake accounts isn’t sustainable, so there’s no excuse for doing nothing. I do, however, find it odd that we continuously blame tech companies for fake news and trolls when we’re the ones clicking those links. Every time you click a “…you won’t believe what happened next” link, you need to own what happens next. That’s not the tech company’s problem, it’s yours.

Here’s what I think you can do to help create a better Internet

  • Install an ad blocker. Personally, I prefer Ghostery.
  • Choose iOS over Android. Because of their different business models, Google tracks more of your usage habits and your location than Apple does.
  • Use social media, but think about what you are uploading - and if there are other people in the picture, do you have their consent?
  • Think about the sites you visit and the links you click - what is their business model? People love to complain about how the fashion industry creates a world where 45 kg (100 pound) girls are the norm, but it’s the readers who feed that business. What about the world that today’s tech industry creates? How are you feeding that business?