Month: March 2015

Advertising, the new Tobacco, joins the Denial Industry

Before you read this post, we want you to understand that 0PII’s ambition is to be (one day) an Advertising provider.

So all that we write here will, one day, backfire on us (in the unlikely scenario that we are successful).

Hence, if we, a wanna-be Advertising company, pinpoint our very own problem, then maybe, just maybe, there is some truth in it.

We bring up these (sad) facts not to criticize the current Advertising Industry, but because we believe that Sincerely Admitting Having a Problem is Half the Solution.

 

Today, it is a pretty much accepted fact that smoking is bad for you. But that was not the case many, many years ago. The evidence was gathered over a long period of time, and was actively obstructed by the Tobacco industry.

Today, our generation is facing another menace: the Advertising Industry.

Before we dig into the issues, let’s “define: discrimination”

discrimination

1. the unjust or prejudicial treatment of different categories of people or things, especially on the grounds of race, age, or sex.
e.g. “victims of racial discrimination”

2. recognition and understanding of the difference between one thing and another.
e.g. “discrimination between right and wrong”

 

Discrimination, as you can see, can be the simple recognition of difference (2), which can be abused, hence the negative connotation (1).

In this article we will use the second meaning of the word; our use of “discrimination” here is not meant in a malicious way.

 

If it were not for the ability of Advertising Engines to discriminate, they would not be able to deliver the right Ad to the right user.

Example: if I am interested in Guitars, if I have visited websites about Guitars, or if I have bought Guitars online, then the Advertising Engine can discriminate between myself, a user interested in Guitars (or more generally in music), and those users who could not care less about the latest Android Guitar Tuner. If there is an Ad to show and a pool of potential users who could receive it, the Ads Engine will discriminate between users based on what is known about them.
There is absolutely no intention of malice in this kind of discrimination. It can help users and businesses find each other, each getting a (beneficial) service.
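To make the benign case concrete, here is a toy sketch (our own illustration, with made-up profiles and keywords, not any real engine’s code) of interest-based targeting, where the only “discrimination” is matching the interests a user has disclosed against the keywords an advertiser chose:

```python
# Toy sketch of benign, interest-based Ad targeting (illustrative only).

ad_keywords = {"guitar", "music", "tuner"}        # hypothetical Ad: an Android guitar tuner

user_profiles = {                                 # hypothetical disclosed interests
    "alice": {"guitar", "concerts", "vinyl"},
    "bob":   {"football", "cars"},
    "carol": {"music", "guitar", "recording"},
}

def score(interests, keywords):
    """How many of the Ad's keywords overlap with what the user disclosed."""
    return len(interests & keywords)

# Show the Ad only to users with at least one matching interest.
audience = [user for user, interests in user_profiles.items()
            if score(interests, ad_keywords) > 0]
print(audience)                                   # -> ['alice', 'carol']
```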

But things are not always that simple, and some companies have vast access to our data, to the point that the Advertising Industry not only knows that I am interested in guitars, but holds hundreds or thousands of interesting data points about me.
Those data points can be correlated to reveal further data points, many of them not present in my explicit user profile. Even if I never disclose my religion, race, age, or sexual orientation on my Facebook account, they can be determined with high accuracy from Likes and other activity. Regular users have little understanding of just how much can be inferred about them.
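As a rough illustration of how such inference might work (this is our own hypothetical sketch on synthetic data, not a description of Facebook’s or anyone else’s actual system), an off-the-shelf classifier can learn an undisclosed attribute from a matrix of Likes:

```python
# Hypothetical sketch: learning an undisclosed attribute from "Likes".
# Rows are users, columns are pages, 1 = the user liked that page.
# The data is synthetic; the point is the workflow, not the numbers.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
n_users, n_pages = 2000, 50
likes = rng.randint(0, 2, size=(n_users, n_pages))

# Pretend the hidden attribute "leaks" into the first handful of pages.
signal = likes[:, :5].sum(axis=1) + rng.normal(0, 1, n_users)
attribute = (signal > signal.mean()).astype(int)   # never disclosed by the user

X_train, X_test, y_train, y_test = train_test_split(likes, attribute, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy on users who never disclosed it: %.2f" % model.score(X_test, y_test))
```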

The data points that are supposedly protected against discrimination can be provided willingly by the user, or inferred. The ability to bias against them is dangerous territory. It is pretty much clear-cut that in some industries you are simply not allowed to make decisions about hiring, firing, giving loans, … based on these protected user features.

But, even more dangerously, your cloud service providers, from your ISP to Facebook, could determine the very things you are actively hiding about yourself in public.

Two years ago, Latanya Sweeney, a professor of government at Harvard University, found that Google Ads was biasing Ads based on race. Big surprise? Not for us. We must fully admit that such biasing is a natural by-product of Ad targeting.

But here are two important quotes from Google regarding the “supposed” racial bias:

“AdWords does not conduct any racial profiling,” said Google, adding that the company’s policies prohibit advertisements “that advocate against an organization, person or group of people. It is up to individual advertisers to decide which keywords they want to choose to trigger their ads.” – And we agree: Google would have a hard time finding a smartass willing to actively conduct racial profiling; not even Google is stupid enough to put a human in charge of such a monstrous liability.

“Since we don’t know the reason for it,” she said, “it’s hard to say what you need to do.”

The second quote especially makes us cringe. We are very troubled by what we read; this is not what we want to hear from Google. If Google does not know why their Engines might act as if they were conducting racial profiling, then they should FIND OUT! Google has the best people and the best Ads/Search Engines in the industry! No excuses, please!

On top of that, even two years after this event, Google’s own employees, who would love to see more racial-bias investigations inside other institutions, still think that allegations of racial bias in Google products are just a matter of point of view (POV).

In this simple public statement, Google is admitting a simple fact: Google does not want to know. If they wanted to know, they would find out why there is evidence of racial bias in their products.

 

But they would rather keep their eyes closed and not see the obvious.

 

Let’s come back and talk about discrimination. Discrimination is the basis of Ads Targeting. If it were not for the Ads Engine’s ability to discriminate between users, it would not be able to target an Ad to a specific audience.

 

What the Ads industry wants (ourselves included) is for this discrimination to be based only on the user’s interests and on the user’s own disclosed “features” that s/he is comfortable with.

 

But hope is not a strategy, and there are shadows of Demons lurking around.

 

See, this benign targeting is based on user features that, when correlated, reveal other features. This can be actively mined, just like Netflix does when it recommends you a movie: it discovers that you might like a movie by correlating it with other movies that you like or dislike. The technology has come a long way in the last decade; just look at the Netflix one-million-dollar prize.

In the same way, an Ads Engine can infer user features that are not present in the user profile from features that are.

Let us give a very, very simple example. If a computer is used to visit a porn website, there is about an 80% chance that the user is male. If in addition we know that the user visited a men’s clothing site, then we have a second (independent) data point, which on its own suggests about a 60% chance that the user is male. Combining these two points (treating them as independent and starting from a 50/50 prior), we can establish with roughly 86% probability that the user is male, higher than either the 80% or the 60% taken alone.

If one has access to many independent data points, which can individually be very weak, BTW, in the 50–55% range, then they can be combined to determine with 90%+ probability your race, ethnicity, religion, gender, sexual orientation and more.
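Here is a minimal sketch of that arithmetic (our own illustration, assuming the signals are independent and starting from a 50/50 prior; the percentages are the made-up ones from the examples above, not measured values):

```python
# Combining independent, individually weak signals under a naive-Bayes assumption.
# Each signal contributes its log-likelihood ratio to a running log-odds total.

import math

def combine(signal_probs, prior=0.5):
    """Fold P(attribute | signal_i) values, assumed independent, into one posterior."""
    log_odds = math.log(prior / (1 - prior))
    for p in signal_probs:
        log_odds += math.log(p / (1 - p))
    return 1 / (1 + math.exp(-log_odds))

print(round(combine([0.80, 0.60]), 3))   # the two signals above -> 0.857 (~86%)
print(round(combine([0.55] * 11), 3))    # eleven weak 55% signals -> 0.901 (past 90%)
```

The exact numbers hinge entirely on the independence assumption, but the direction is the point: many weak signals compound quickly.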

 

And this work can be done actively by a Data Scientist, or it can be mined by Artificial Intelligence without our active involvement.

 

We have laws against banks issuing preferential loans based on the racial or religious “features” of a person, and against hiring people based on their race, but all these attributes can be actively mined by unscrupulous banks and employers. And catching them will be incredibly hard. The bank will argue that these are simply the people who responded to the Ad. And the employer will argue, similarly, that these are the people who applied for the job.

 

If we close our eyes to the problem, we might enable unscrupulous businesses to discriminate based on protected features.

 

But even more dangerously, the AI itself, without our knowledge, can discriminate based on protected features, for the simple reason that doing so maximizes profit. Good luck looking inside the AI’s brain for the real reasons.

 

Regardless of the reasons for which low-quality, rip-off Ads are assigned to minorities, as the FTC has found evidence of, we have to get our hands on large enough data sets in order to understand the problem.

 

We should be good stewards and protect users. That is why we have to open our eyes. We have to sincerely admit we have a problem, and deal with it.

 

But the Advertising Industry, just like the Tobacco industry in its prime, has decided to join the Denial Industry, and refuses to look into the problem.