Humanity Should Never Become a Slave to Big Data


Today, a staggering proportion of the world's decision-making is done with the help of Big Data. When the world went from the "Website Age" to the "App Age", one big change happened that most people never noticed amidst the other changes in their lives: the rise of data mining. There has never been a period in history when humanity has known so much about itself. Large multinational tech companies hold petabytes upon petabytes of data on how we look, what we eat, and what our opinions and habits are, and they use that data to predict the next thing we are going to buy. Of course, they then sell this intelligence to others. Everything from jobs to social media to military decisions is now decided by algorithms that run on Big Data. But amidst all the craze, we may have neglected the dark side of letting Big Data steer the world.


Private companies hold enormous power over our lives

Since most decisions today are made with the help of Big Data, the companies that design the algorithms hold an enormous amount of power. Most of these algorithms are proprietary. In several cases, governments have used Big Data algorithms to make decisions that were later questioned, and the ensuing investigations found that no one in the entire government had access to the algorithm used to make the decision. Judicial systems in many countries have started to use data about people's lives to judge whether they deserve bail or not. Here too, there is no way to dig into the algorithm that has such a strong influence on people's lives.

Another aspect of this is the data used to train these models. In many cases there are biases in how the data is collected, and we don't know whether the data was normalised to weed out these biases or not.

Now you might say that normalising bias is Machine Learning 101: "Of course they normalise their training data, otherwise their algorithm wouldn't produce objective results." But that assumes the company's intention is to build an algorithm that produces objective results, which isn't always the case. Why would someone build an algorithm that produces the wrong results, you might ask? Because the core intent of the companies making these algorithms is not to make the right decision, but to maximise profit for the company. And as you can imagine, there are conflicts of interest in all kinds of places.
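To make "normalising the training data" concrete, here is a minimal sketch of one common technique: reweighting samples so an over-represented group doesn't dominate training. The data and group labels here are invented for illustration; real pipelines would use a library's class-weighting utilities, but the formula is the standard "balanced" inverse-frequency weight.

```python
from collections import Counter

def balance_weights(groups):
    """Give each sample a weight inversely proportional to its group's
    frequency, so every group contributes equally to training overall."""
    counts = Counter(groups)
    n_groups = len(counts)
    total = len(groups)
    # Classic balanced weight: total / (n_groups * count_of_group)
    return [total / (n_groups * counts[g]) for g in groups]

# A skewed sample: four records from group "a", one from group "b".
groups = ["a", "a", "a", "a", "b"]
weights = balance_weights(groups)
print(weights)  # [0.625, 0.625, 0.625, 0.625, 2.5]

# Each group now carries the same total weight (2.5 each),
# so the minority group is no longer drowned out.
```

The point of the example is the first half of the sentence above: this step is easy and well known, so when it is skipped, the likeliest explanation is a choice of priorities, not ignorance.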

Let's say a company builds an algorithm for hiring people based on their educational qualifications, and the client using the algorithm pays a fee for every person hired at the end of the process. You can imagine what happens: the algorithm reviews people very laxly and hires nearly everyone in order to maximise fees. Years later, the client discovers why it is filled with non-performing people. It tries to resolve the issue, but because the whole process around the algorithm is opaque and many of the important details are proprietary, the issue never gets resolved.
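The conflict of interest in that scenario can be shown in a few lines. Everything here is hypothetical (the candidate scores, the threshold, the fee): it's a toy model of a vendor paid per hire, where lowering the bar strictly increases the vendor's revenue regardless of candidate quality.

```python
def hires(scores, threshold):
    """Accept every candidate whose score clears the threshold."""
    return [s for s in scores if s >= threshold]

def vendor_revenue(scores, threshold, fee_per_hire=100):
    """The vendor is paid per hire, not per *good* hire."""
    return fee_per_hire * len(hires(scores, threshold))

scores = [0.9, 0.7, 0.4, 0.2]  # hypothetical candidate quality scores

# A strict threshold hires only strong candidates...
print(vendor_revenue(scores, 0.6))  # 200
# ...but hiring everyone doubles the vendor's revenue.
print(vendor_revenue(scores, 0.0))  # 400
```

Because the vendor's pay-off depends only on the count of hires, the profit-maximising threshold is zero. The client pays the cost of the bad hires years later, exactly as in the story above.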

Now of course, this was a simple example with a simple solution. In the real world, things might not be so simple, and with so much secrecy and opacity around how an algorithm actually works, you might not know for years that you are making the wrong decisions.

There is no mechanism for appeal

Perhaps the one thing that makes these processes seem truly ridiculous is that when you get rejected or convicted, you often have no way to appeal the decision. There is an aura of invincibility surrounding these algorithms, which are frequently anything but objective. Think about it: there could be millions of bad decisions made in a single day, many of which could have been rectified if there were a functioning appeals process.

So we have a decision maker cloaked in secrecy, often with self-serving intentions, whose decisions are final with no way to appeal them. Is that medieval enough for you, or should we also restart the witch hunts?

Big Data is not a way to magically make objective decisions. It just automates the status quo. And if there is no oversight over these algorithms, we will become slaves to Big Data without even knowing it.


Thank you for reading this post. If you liked it and want more, follow me :-)


Sources

  • Header image from Pixabay.

  • Awesome Footer animation by @malos10 :-)
