The Big Data Scare

I came across this article on TechCrunch today (AI accountability needs action now, say UK MPs) and wanted to share some thoughts on it.

It sounds to me like once again Europe, and the UK in this case, is getting cold feet about AI and big data and the interesting results they sometimes produce. Because guess what: occasionally, big data reveals things we do not like and that sit outside our comfort zone. And I personally think that in such cases it’s wrong for us to pretend we’re blind and simply ask for such “biases” (as the UK MPs refer to them) to be removed from the system.

I remember, for instance, that at some point there was an auto insurance company in the UK called Sheila’s Wheels (I don’t know if they still exist, but a little research will confirm they did operate in the UK for a few years). Their initial go-to-market strategy was to target female drivers. Why? Quite simply because their research, having analyzed a lot of UK driving data (read “big data”), showed that women were better drivers! (If I remember correctly, their data showed that women drivers in the UK had significantly fewer accidents than their male counterparts. And any insurance company will tell you that if you find a segment where the data consistently shows a lower chance of accidents, you lower your premiums and go after that segment of the population, since that means more money for your insurance business.)
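To make that pricing logic concrete, here is a toy sketch in Python with made-up accident rates and claim costs. This is not Sheila’s Wheels’ actual model, just the standard expected-cost arithmetic any insurer runs per segment:

```python
# Illustrative sketch only -- invented numbers and a simplified
# pure-premium model, not any real insurer's pricing.

def annual_premium(accident_rate, avg_claim_cost, loading=1.25):
    """Expected yearly claims cost per driver, marked up by a loading
    factor to cover expenses and profit."""
    return accident_rate * avg_claim_cost * loading

# Hypothetical segment statistics mined from historical claims data.
segments = {
    "female": {"accident_rate": 0.06, "avg_claim_cost": 4000},
    "male":   {"accident_rate": 0.09, "avg_claim_cost": 4000},
}

for name, s in segments.items():
    premium = annual_premium(s["accident_rate"], s["avg_claim_cost"])
    print(f"{name}: £{premium:.0f}/year")
# female: £300/year -- the lower-risk segment gets the cheaper quote
# male:   £450/year
```

With numbers like these, targeting the cheaper-to-insure segment is simply the rational move the data points you to.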

However, the European Union and the UK in their wisdom decided that this business model was discriminatory (against men) and biased, and so the company was required to extend its services to men. In its initial negotiations with the government, the insurer agreed to do this but pointed out that men would still get a higher rate, because, again, the data indicated they were worse drivers than women and therefore a higher risk to the insurer. Nope, said the UK MPs, that would still be biased and discriminatory, and they requested that the insurer offer the same price to male and female drivers. That meant the insurer had to raise the premiums for female drivers until they matched those for male drivers, which I believe made them less attractive to the public and less competitive at the time, since they now had a package pretty much like everyone else’s.
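Continuing the made-up numbers from the sketch above, forcing a single unisex price means blending the two risk pools, which is exactly what pushes the quote for women up:

```python
# Continuing the toy numbers above: once the insurer must quote one
# unisex price, that price has to cover the blended risk of whoever
# actually buys the policy.

female_premium, male_premium = 300.0, 450.0

# Suppose the book ends up as a 50/50 mix of female and male drivers.
mix = {"female": 0.5, "male": 0.5}
unisex_premium = mix["female"] * female_premium + mix["male"] * male_premium
print(f"unisex: £{unisex_premium:.0f}/year")  # £375 -- up £75 for women

# Worse, a price that is cheap for men attracts more of the higher-risk
# drivers, pushing the break-even unisex premium up further, and with it
# goes the original go-to-market edge.
```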

And all of this because we got scared that the (unbiased!) data told us that, you know what, men are worse drivers than women! What’s wrong with that? It’s data analyzed by an algorithm; the algorithm isn’t biased, it’s not a living creature with feelings that end up factoring into the result! If anything, the government’s reply was more biased than the original findings: it tried to force an un-bias simply because, well, everything should always be equal and the same for everyone, everywhere, every time. If that is really the case, then let’s go after the Olympics too: why on earth do they hold separate 100m races for women and for men? Why do we accept the fact that the data in that case shows us that men and women are different?

Getting back to the original article, it annoys me to see the UK MPs now getting involved in the big data and AI systems that companies worldwide employ, and I can see the “blanket of equalization” being thrown over tons of systems, which, just as in the previous case, would most likely render them mediocre.

I have worked on digital marketing and online advertising systems, which invariably require analyzing large amounts of data. A common scenario is that your system learns constantly from its own performance and adapts its targeting so that you don’t waste impressions (and money!) on users who are not interested in the advertising message you are displaying. As a result, the system often learns that certain products do not appeal to men, or to women, and ends up automatically targeting each product at the corresponding category.
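To illustrate the kind of feedback loop I mean, here is a toy sketch with invented segment names and click-through rates. A real system is far more elaborate, but an epsilon-greedy bandit like this captures the principle of shifting impressions toward the segments that actually respond:

```python
import random

# Toy epsilon-greedy targeting loop -- invented numbers, and only one of
# many ways such a system can be built.

segments = ["men", "women"]
true_ctr = {"men": 0.002, "women": 0.012}   # hidden ground truth for the demo

shown = {s: 0 for s in segments}
clicks = {s: 0 for s in segments}

def choose_segment(epsilon=0.1):
    """Mostly exploit the best-performing segment, occasionally explore."""
    if random.random() < epsilon or not all(shown.values()):
        return random.choice(segments)
    return max(segments, key=lambda s: clicks[s] / shown[s])

for _ in range(100_000):
    seg = choose_segment()
    shown[seg] += 1
    if random.random() < true_ctr[seg]:     # simulate the user's response
        clicks[seg] += 1

for s in segments:
    print(f"{s}: {shown[s]} impressions, observed CTR {clicks[s]/shown[s]:.4f}")
# The vast majority of impressions drift toward the segment that clicks --
# the "bias" is just the data talking.
```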

At this point, I’m guessing our darling UK MPs would call my system biased and insist it be changed, so that I end up wasting advertisers’ money where it does no good? Why are we getting in the way of these systems just because they sometimes reveal uncomfortable aspects of the way we are?