by James Corbett
March 23, 2019
Do humans have free will?
That question has puzzled philosophers for millennia and has generated fierce debates. Perhaps the only thing more contentious than the question is the implications of its answer. If man has no free will, then how can there be moral responsibility?
But don't worry, dear frazzled philosophers! You can now rest your weary heads because the question is about to be answered once and for all by the wizards of Silicon Valley.
What on earth am I talking about? To answer that question, let's turn to Declare Your Independence, the radio show of Ernie Hancock of FreedomsPhoenix.com. Recently, Ernie had a fascinating conversation with Paul Rosenberg of FreemansPerspective.com. What's remarkable about their discussion is how quickly it turns from a dialogue about online censorship in the wake of the Christchurch shooting to Rosenberg ranting about the most important issue of our time:
"You need to protect your data to protect your own free will. I'm sorry if that sounds dramatic. I'm sorry if that sounds like I'm trying to be scary and all that, but that's just the truth and somebody should say it!"
So how do we get from online censorship to the end of free will (assuming we ever really had it)? By way of Google's selfish ledger, of course.
For those who may have missed it, "The Selfish Ledger" is an internal Google video that was leaked to The Verge last year, and it's just about the creepiest thing imaginable. If you haven't seen it yet, take a moment to watch it now.
In his article on the video, Rosenberg summarizes it thusly:
- Google sees you as a “transient carrier.” That is, the data you produce is the essential being, and you’re a mere “container.”
- You shouldn’t really own your ledger (your most essential self), and they should insert information into your life.
- Google will choose what you should want and will modify your behavior accordingly. How? By offering you new options or even designing custom devices that you won’t be able to resist. They will make sure “your behavior” is “modified.”
- If this seems creepy to you, don’t worry; you’ll warm up to it over time.
- Google will guide you to what’s best for you. You can trust them; they love us and know what’s best for us all.
Whether or not you believe yourself to be a mere "container" acting as a "transient carrier" of data or, you know, a pesky ontological object like a free human being with a soul, the video provides some serious food for thought. It is obviously the case that we are shaped by our experiences, and our past experiences (the data that Google tracks) help to determine our responses to future challenges.
Over time, and given enough analytical horsepower, someone tracking the data trail you leave behind (the places you go, the people you meet, the things you buy, the conversations you have, the questions you ask, the choices you make) can not only predict how you will act in the future; they can determine why you will act that way. And if they have that insight—the knowledge of what makes you tick, as it were—then it's not difficult for them to start using that knowledge to prompt you in one direction or another. And if you can be prompted to make choices along a certain path, you can end up at a predetermined destination, one that you yourself never set out to reach.
In short, if we are nothing but biological robots reacting to stimuli along predictable paths, then we can have our software re-programmed by an outside agent that is custom-tailoring those stimuli for us. And, as this video announces, Google would like to be that agent.
Now this "Selfish Ledger" is Google's video, so the idea seems like it's Google's alone, but of course this is not the case. All of the major tech companies are operating from similar principles.
This is why Facebook conducted its infamous "mood experiment" to see how tinkering with a person's news feed could alter their feelings. It's why Instagram is blocking "anti-vaccine" hashtags. It's why Twitter and Facebook (and everyone else) are collecting data on everyone, even non-users, at all times. Heck, it's why I now listen to Radiohead.
We are being shaped, our experiences directed, our choices made for us each and every day. And whether or not you ever believed in free will, there can be no doubt that as the cell doors close on our technological prison we have less and less say over our own decisions.
There are things we can do to help mitigate this, of course.
We can take online privacy seriously and really commit to not giving these companies any identifiable data that can be associated with us individually. But after seeing what it takes to really accomplish this, most will decide that it's not worth the hassle.
We can try disconnecting from technology. Leaving our smartphones at home instead of taking them everywhere we go. Deleting our social media accounts. Going back to phone calls and in-person chats over texting and email. But more and more, our jobs (not to mention our social lives) depend on being online and accessible through the very types of social media services we are seeking to avoid.
We can do the little things to stop being controlled by our gadgets. Make a conscious effort to not click on the next "recommended video" or "you might also like" article. Turn off notifications. Disable location services. Stop Googling every question we have and stop plugging our ears with earphones at all times. But do these little actions make a difference in the long run?
And in the end, perhaps it's not even our choice anymore (if it ever was). As a recent MIT study demonstrated, you only need the DNA of a small percentage of the public in order to trace the relationships among the entire population. Similarly, you don't have to be on Facebook yourself in order for Zuckerberg and his minions to know all about you; as long as all your friends (or some percentage of them) are on there, chances are you're in the Facebook database as a "shadow profile."
So, unless you're the Unabomber living on bugs and rainwater in some cabin in the woods, you're in the matrix one way or another. And unless we as a society start asking and answering the hard questions about autonomy and free will in the age of total surveillance, there's a good chance our children (or their children) will be nothing more than "transient carriers" for data that is being fed to them by the Big Tech monopolies.
After all, what choice do we have?