> On September 5th 2018, Google announced that it would no longer share encrypted cookie IDs in bid requests with buyers in its Authorized Buyers marketplace, “as part of our ongoing commitment to user privacy”. Mr Ryan’s analysis also found that Google continued to share these with ad firms.
But now Google shares not cookies, but unique URLs. Look, they kept their promise.
I too have learned not to believe in government solutions to problems.
As a result I stopped searching for... Well, unfortunately I'm one of those "nothing to hide" people.
Life as usual.
If I needed to use Google without anyone knowing, I'd either use a public computer or a stack of layers to strip my IP, browser fingerprint, cookies, etc. Unique URLs would not trace far.
Just for the record, google is not in compliance. The law is not a switch statement where you just need to find that magic missing “break” and then the system is pwned. Justice is not a mathematical equation.
I don't understand why there is so much push back against nebulous threats like targeted advertising. Advertising is unquestionably here to stay, and it's hard to believe that people would rather have irrelevant ads than relevant ads. I'd rather Google figure out what type of TV shows/movies I might care about and only advertise those to me.
And if they want to aggregate large amounts of data about internet browsing habits to get a better idea of the types of customers who want certain things, I see that as only a positive, since it helps steer businesses in the service of their customers. Data mining large sets of browsing history does this in an incredibly non-intrusive way.
I'm all behind auditing the data being used and sold for identifying features, but the cases where this has been a real and serious problem are sparse at best. The most serious threats have been successful hacking attempts releasing data which isn't controversial for them to collect.
That is, people are worried about Google telling Capital One your IP address is in the market for a credit card while hackers are walking out the back door with hundreds of thousands of social security numbers, names, addresses, and credit scores.
"Advertising is unquestionably here to stay"
It's often interesting hearing people use "unquestionably" to mean that no one believes differently, rather than a more apt word meaning that you personally feel strongly about a certain viewpoint, don't know other people's points of view, and aren't concerned with them. You also mentioned you find it hard to believe people would prefer irrelevant ads over relevant ads, but of course many people do (if ads are irrelevant, people are less likely to engage with them, making them less profitable for the advertiser and, in the long run, making that form of advertising less viable). So it sounds like you may want to ask more questions to learn other people's views, so that you can either update your own or make stronger arguments for them. IMHO.
Google owns hoards of personal data, including yours and mine. It's nearly impossible to avoid, short of living life offline. Now that Google has shown they are willing to share your personal information with advertisers without your permission, what do you think will happen when Google becomes the arbiter of medical (and personal) information for health insurance companies? Google will win. Health insurance companies will win. Individuals and families will suffer.
I think it's important to be aware of conflicts of interest like this, but that's not what's being discussed, and the solutions offered aren't attacking this type of problem.
No, no I do not think so. I am sure many people thought that way about smoking. Yes, I can still opt into smoking if I choose to, but I am no longer chronically exposed to second-hand smoke, and everyone knows it is harmful and pushed by greedy sociopaths taking advantage of our biological quirks.
> it's hard to believe that people would rather have irrelevant ads to relevant ads
I usually prefer irrelevant ads.
It's possible for ads to fulfill their idealized role, where they inform me about an opportunity for a mutually-beneficial purchase. But in practice, I view most advertising as basically adversarial; most ads want every viewer to buy, whether or not they will actually benefit, and so they're incentivized to manipulate me. Fundamentally, everything beyond "this product with these traits exists at this price" tends to be manipulation. Grabbing attention, building brand associations, creating demand, spreading FUD - these are all attempts to extract money from some marginal buyer who won't benefit, but can be tricked into thinking otherwise.
Targeted advertising doesn't change this. It does raise the percentage of mutually beneficial ads, and allows ads to get attention with relevance instead of distraction. I'm grateful for those things. But it also helps advertisers build models with which to do a better job of manipulation. My targeting preferences are basically polar; "random billboard" and "recommendation from a close friend" are both preferable to "some targeting".
(This is not actually the core objection, though, which is that I don't trust the firms behind the targeted advertising. Even when targeted ads are better than untargeted, I expect to gain much less than I lose from the advertiser using/abusing/losing my data.)
> if they want to aggregate large amounts of data about internet browsing habits to get a better idea of the types of customers which want certain things, I see that as only a positive since it helps steer businesses in the service of their customers
Again, this seems like an incredibly optimistic view of what businesses do with consumer data.
If Ford wants to aggregate Car & Driver browsing data to decide which features to put in their new model, I agree that this probably benefits consumers. But if United sees that people who search for similar flights multiple times are more likely to book, and then charges me higher ticket prices when I do so, the targeting has been purely harmful to me.
We don't have to resort to hypotheticals like being denied health insurance based on search data. It's already the case that browsing data is not just used to make aggregate decisions, but combined with tracking in ways that consistently harm consumers.
> That is, people are worried about Google telling Capital One your IP address is in the market for a credit card while hackers are walking out the back door with hundreds of thousands of social security numbers, names, addresses, and credit scores.
My fear is that Google will tell Capital One my IP is in the market for a credit card, they'll talk to Experian to see if I'm a good bet, and then hackers will have that much more information when Experian loses all of my personal data. Which they did. Or that when Capital One gets hacked, they'll have not just my SSN but a history of IP addresses linked to me, which can be geolocated to beat the "which of these addresses have you lived at" challenges which my state foolishly uses during identity verification.
This is my fundamental problem with targeted advertising. It's not about what ads I see, or even about what Google knows. It's that every time personal information spreads, it worsens the risks of all the other times it's spread. Huge amounts of data are already gone, and we can't get them back, and that's a major problem. But it's already standard practice for people selling stolen info to download new breaches and correlate that information to create a more valuable product. Tossing personal info into the seedy world of ad networks makes all the other problems a little bit worse.
This is the fault of the advertising industry and RTB. RTB relies on sending personal data with some fields missing (name, phone, email, etc.) out for real-time bidding on ads by many players. The remaining data is mapped by these players to infer the missing fields, guess the purchase habits of the person browsing the site, and bid on the ad space.
The data sent to these RTB exchanges includes unique fields like the Android Ad ID that can easily identify a person.
The ad industry relies on obfuscated identifiers like the Android Ad ID to track us individually while giving the impression that personal data is not being sent.
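To make that concrete, here is a hypothetical, heavily simplified bid request, loosely modeled on the OpenRTB shape (all field values are invented). The "direct" identifiers are absent, which is all the "no personal data" claim really means; the payload still carries stable identifiers that let bidders re-link it to a profile:

```python
# Hypothetical, simplified bid request (invented values, OpenRTB-ish shape).
# Name/phone/email are absent, but the payload still carries stable
# identifiers (device ad ID, IP, user agent) that re-link it to a person.
bid_request = {
    "id": "auction-7f3a",
    "site": {"page": "https://example-news.site/article"},
    "device": {
        "ip": "203.0.113.42",          # coarse location via geolocation DBs
        "ua": "Mozilla/5.0 (Linux; Android 11)",
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # Android Ad ID
    },
    "user": {"id": "pseudo-id-91c4"},  # exchange-assigned pseudonymous ID
}

def is_pseudonymous_only(req: dict) -> bool:
    """True if no *direct* identifiers (name, email, phone) are present."""
    direct = {"name", "email", "phone"}
    return not (direct & set(req.get("user", {})))

print(is_pseudonymous_only(bid_request))  # True, yet 'ifa' is a unique ID
```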
There are two levels of things here: Google/Doubleclick and the GDPR, and then this particular "push page" issue.
- GDPR instructs that an organization maintain control of people's data. "Control" in the legal sense of it where you know who is getting the data, what their data handling practices are, how it's going to be used, etc.
- Real Time Ad bidding is the process where you land on a site and that site has a JS snippet which passes your information (it can be more than this, but let's just say your IP address) to an ad auction where advertisers bid on showing you an ad. They do geolocation, company lookups, check retargeting cookies, etc., and then whoever offers the most money to show you an ad gets their banner on the page you are looking at.
- Google's Doubleclick is one of the largest Real Time Bidding setups.
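The auction step above can be sketched as a minimal second-price auction, a common RTB design (this is a toy model; real exchanges handle timeouts, price floors, and thousands of bidders, and some have since moved to first-price):

```python
# Toy sketch of a second-price real-time ad auction: highest bid wins
# but pays just above the second-highest price. Bidder names are made up.
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str
    cpm: float      # price offered per thousand impressions
    creative: str   # banner shown if this bid wins

def run_auction(bids):
    """Return (winning bid, clearing price), or None if no one bid."""
    if not bids:
        return None
    ranked = sorted(bids, key=lambda b: b.cpm, reverse=True)
    winner = ranked[0]
    price = ranked[1].cpm + 0.01 if len(ranked) > 1 else winner.cpm
    return winner, price

bids = [
    Bid("retargeter",  cpm=4.20, creative="shoes_banner"),
    Bid("geo_bidder",  cpm=2.75, creative="local_pizza"),
    Bid("brand_buyer", cpm=3.10, creative="car_ad"),
]
winner, price = run_auction(bids)
print(winner.bidder, round(price, 2))  # retargeter 3.11
```

Note that the retargeter, armed with a cookie history, can afford to outbid everyone else; that price gap is why the identifying data in the bid request is so valuable.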
Side Note: a frequent comment I see on HN is that "IPs are not personally identifiable" which is true in the abstract, but they are identifiable enough that advertisers are willing to spend significant amounts of money on ad bids.
You may immediately see the problem here: it's impossible for you (the person browsing) to give meaningful consent to share your information with all these companies participating in the ad auction because the site you just browsed to has zero flipping idea who they are and in any case there is a constantly churning audience of literally tens of thousands of companies participating in these auctions.
All of that is the context then for this newest development: push pages.
Google/Doubleclick is attempting a bunch of different approaches to deal with the fact that the GDPR shatters the current privacy-destroying setup, in which any time you land on a site with their JS include, your information is sent to several thousand companies unknown to you.
They're trying to implement pseudo-anonymous identifiers [1], they're trying to pull back on cross-site matching, etc., all of which hurts their bottom line. So what these "push pages" look like is an attempt at a technical workaround to some of the legislative hurdles raised by the GDPR. By moving the JS+cookie setting to the Google domain, they're able to say (in some contexts) that it's a 1st-party cookie and not a tracker, and able to do more sophisticated matching.
Side Note: another frequent comment I see on HN about the GDPR is "It's too vague, why doesn't it just say what I can and can't technically implement?" and this is exactly why: if the law is laid out in technical terms instead of intentions and actions, it's easy to find loopholes (e.g. moving some aspect of tracking from a site to Google's page).
Your ISP is the only one able to reliably identify you by your IP. And even then they would not be allowed to use that info in other contexts (e.g. advertising) according to GDPR IIRC.
The GDPR makes it clear that UIDs or pseudo-IDs that can be mapped to an individual are PII. This includes mapping two or more seemingly independent fields to de-anonymize someone as well.
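That kind of mapping is easy to illustrate with a toy example (all names and values invented): neither dataset alone names the user, but joining two "pseudonymous" datasets on a shared quasi-identifier recovers a named profile.

```python
# Toy de-anonymization by joining "independent" fields. All data invented.
ad_logs = [  # what an ad exchange partner might hold
    {"ad_id": "38400000-8cf0", "ip": "203.0.113.42",
     "interest": "credit cards"},
]
breach_dump = [  # what a breached data broker might hold
    {"ad_id": "38400000-8cf0", "name": "A. Person",
     "email": "a@example.com"},
]

def deanonymize(logs, dump, key="ad_id"):
    """Join two pseudonymous datasets on a shared quasi-identifier."""
    index = {row[key]: row for row in dump}
    return [{**log, **index[log[key]]} for log in logs if log[key] in index]

merged = deanonymize(ad_logs, breach_dump)
print(merged[0]["name"], merged[0]["interest"])  # A. Person credit cards
```

The same join works on weaker keys like (IP, user agent) pairs, which is why the GDPR treats such fields as personal data even though no single one of them is a name.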
[1] https://brave.com/google-gdpr-workaround/