This subject came to my attention recently following a few tweets from Dierk König around it — see this tweet for instance. To summarize, there’s an ongoing discussion on social media channels (and not only there!) around the fear that the technology we (and by “we” I mean us, techies) create lands in the lap of institutions and governments, which then turn it against us. One such technology is the whole “big data” ecosystem which, as we know by now, is used by governmental agencies, amongst many other things, to mine data about individuals.
As such, one of the ideas being floated around is that it is us, techies, who enabled this — and, I guess, following that train of thought, we really should not associate ourselves with such projects. In other words, let’s stop contributing to the likes of Hadoop, Cassandra, Scalding… and the list can go on forever… since by doing so we’re only allowing the above-mentioned agencies to, well, spy on us even more!
I understand the concern, and I am infuriated too by the liberties these third parties take against individuals’ freedom — however, I don’t think this is the way to stop or prevent it. Unfortunate as it may be, every bit of groundbreaking technology we have come up with has been used for military or otherwise not-so-orthodox purposes. Looking back at ancient history: once we discovered metal casting we went right ahead and made better weapons, even though things like bronze casting were first and foremost about creating better tools and progressing humanity to the next step. (And we did progress to the next step, but along the way we made some sharper weapons and killed some less fortunate mothertruckers in the process 🙂 )
Alfred Nobel is another good example of this — as much as he caused sorrow in his own family and blew up his house a few times, his intentions were not evil, and to my understanding he was avidly seeking the “tool” that would advance humankind to the next level. And he did — but once again, that came with a price! Had he taken the stance of not continuing his research because it would be turned against humanity, arguably we’d still be digging by hand everywhere and constructing anything would take forever. Though just as arguably, we wouldn’t have had all the wars and deaths caused by his dynamite either!
Whether that was a fair price to pay for the discovery is a different question — and I feel this is where the discussion should move on this subject: we shouldn’t feel we need to abandon our “digging”, research and projects for fear of them being turned into evil weapons — instead, perhaps we should draw attention right away to what the implications of releasing our projects “into the wild” will be. Raising awareness — as Dierk himself suggested in the discussion I had with him on Twitter — is most likely the way to go: one can argue that having everyone (or a large majority) aware of the implications of implementing things like “big data” at a governmental level would trigger a vote against such measures, or at least a very careful and well-monitored implementation of them, such that individual freedom and privacy are not at risk.
Very few are aware, for instance, that implementing “big data” over a wide segment of the population allows us to transform stereotypes into, well, “results of an algorithm”. Such a system could tell us, say, that most Eastern Europeans are hackers, based on the fact that aggregating large quantities of data from various electronic attacks reveals that the source IP addresses come from that part of the world. Armed now with this “fact” (communicated to us by an algorithm which has no feelings, no bias, no knowledge of any pre-existing stereotypes, and which crunched huge amounts of data to come to this conclusion) we now have absolutely no issue in rejecting right away any resume submitted to our company by an individual from that part of the world: who would want a hacker in their company? And it’s not something we made up — it’s something an algorithm told us, therefore it is correct and it is true! I’ve chosen Eastern Europe and hacking because I’m from that part of the world and aware of some of the stereotypes cast upon us 🙂 but you can envisage any other stereotype emerging as a similar result of an algorithm running against “big data”!
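To make the pitfall concrete, here’s a minimal sketch in Python of the kind of aggregation I’m describing. Everything in it is made up for illustration — the IPs, the events, and the `ip_to_region` lookup are toy stand-ins (a real pipeline would use a geolocation service and Hadoop-scale logs):

```python
from collections import Counter

# Hypothetical lookup table: a real system would query a geolocation
# service; these are fictional, documentation-range IPs and made-up regions.
ip_to_region = {
    "203.0.113.5": "Region A",
    "203.0.113.9": "Region A",
    "192.0.2.44": "Region A",
    "198.51.100.7": "Region B",
}

# Toy "attack log": source IPs observed in (fictional) attack events.
attack_events = [
    "203.0.113.5", "203.0.113.9", "203.0.113.5",
    "192.0.2.44", "198.51.100.7",
]

# The aggregation itself is perfectly neutral: count attack events per region.
attacks_by_region = Counter(ip_to_region[ip] for ip in attack_events)
top_region, count = attacks_by_region.most_common(1)[0]

print(f"Most attacks originate from {top_region} ({count} events)")
# -> "Most attacks originate from Region A (4 events)"
#
# The fallacy is the next, human, step: reading this aggregate about
# attack *traffic* as a verdict on every *individual* from Region A,
# e.g. when screening job applicants.
```

Note that nothing in the counting is wrong — the code faithfully summarizes the events it was given. The damage happens entirely in that last step, when a statistic about traffic is turned into a judgment about people.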
Does the above mean that the algorithm is evil? Quite likely not — however, the way in which it was used can definitely be categorized as such! Is the fault with the developers? Definitely not, I’d argue — the same algorithm which came up with that conclusion could be run against seismographic data and quite likely predict the next tsunami in time to allow people to retreat to safety and avoid the ordeal. The people who ran the algorithm, however, are at fault: whether it was their (evil) intention to run it in such a way, or it happened through negligence and/or ignorance, they processed the data in a wrong way, and while the conclusion they arrived at might be valid, it was reached by illegal (or at least immoral) means! And again, raising awareness could have prevented that — no one would allow such algorithms to run against their own personal data, knowing ahead of time what the implications are, right?
So I don’t think that stopping the development of technologies is the way to go — raising awareness, and ultimately slowly but surely changing our ways as a (self-destructive) species on this planet, is. Keep up the good work on “big data”, guys, I say, but please think of the side effects and communicate them loudly. To everyone.