Hacker News

I'd argue there is no way to properly communicate to the average Facebook user how their data is being collected and used in a way that is transparent but not confusing.

For example, explain to someone who is illiterate in technology how the act of "tagging" your friend in a photo offloads image-labeling work that trains a deep neural network to recognize your friend's face.
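To see why a tag is effectively free labeled training data, here is a toy sketch. This is not Facebook's actual pipeline: it assumes faces have already been reduced to embedding vectors (in reality a deep network does that step), and the names, cluster centers, and noise level are all made up.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical: each person's face occupies a cluster in embedding space.
centers = {name: rng.normal(size=128) for name in ("alice", "bob")}

def tagged_faces(name, n):
    # Each "tag" pairs a face embedding (the photo) with an identity (the label),
    # which is exactly the (X, y) pair supervised learning needs.
    return centers[name] + 0.1 * rng.normal(size=(n, 128)), [name] * n

Xa, ya = tagged_faces("alice", 20)
Xb, yb = tagged_faces("bob", 20)
clf = KNeighborsClassifier(n_neighbors=3).fit(np.vstack([Xa, Xb]), ya + yb)

# A new, untagged photo of alice is now recognizable:
new_face = centers["alice"] + 0.1 * rng.normal(size=(1, 128))
print(clf.predict(new_face)[0])  # prints "alice"
```

The point is that the user never consented to "label training data"; they just tagged a friend, and the label came along for free.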

If you radically simplify the issue in line with GDPR by saying something like:

"Whenever you tag a friend in a photo you to help teach our computers to recognize what your friends face looks like"

It makes it seem way more terminator/ominous than it is to the average person.

OK, now do the same thing with all of the NLP, voice, etc. data points.

I just don't see how Facebook is going to deploy a worldwide education effort on big data effectively.



> It makes it seem way more terminator/ominous than it is to the average person.

Personally, I think it's actually pretty ominous. Prompting people to consider what the consequences of their FB actions are can't be bad - I bet a lot of people wouldn't tag their friends if they knew what it was doing and really thought about the implications.


I think you need to step back and ask why that sounds more "terminator/ominous" to people. And from there, maybe re-examine why you're doing it.

Just because someone thinks that it might sound bad is not a reason to not disclose it. If people think that FB image tagging training their systems to recognize people's faces is bad, that's probably a signal that shouldn't be ignored.


> "Whenever you tag a friend in a photo you to help teach our computers to recognize what your friends face looks like"

That's a good simple explanation IMO. If it sounds scary, maybe it is.


It's scary to some people but not others. So is the purpose of GDPR to spur a discussion on the scope of technology and privacy tradeoffs, or to actively slow the pace of personal data collection?

I think there has been a lack of reasonable and measured discussion about this issue; it's very polarized, as with most things.


Just looking at ambiguous and deceptive labeling in US grocery stores, I am seeing what seems like a loosening of ethical norms. I can't quantify it, but I feel like the ideology that regulations are always bad and the market can be trusted to maintain good-quality products is giving people license to try anything that is technically legal to make incrementally more money. Despite the vast number of regulations that exist, I think people are identifying loopholes in both the law and human psychology at an ever-increasing rate, and the regulations that exist are inadequate.

This doesn't mean, of course, that more regulations can fix things, but I think the world is changing, possibly for the worse, while some people say we should remain calm and do nothing, because nothing unusual is happening.

Edit: I am not suggesting people are becoming less ethical than in the past "just because" - I'm suggesting information technology is letting smart people increasingly subvert norms about transparency, because once you can quantify the effect of your customers' cognitive biases, competition makes it imperative to exploit them. Even if you don't realize what you're doing, do enough A/B testing and it happens automatically, I should think.
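The "automatic" part can be made concrete with a toy epsilon-greedy A/B test: traffic drifts toward whichever variant converts best, with no one ever deciding to exploit a bias. The variant names and conversion rates below are entirely made up for illustration.

```python
import random

random.seed(1)
# Hypothetical: the misleading label happens to convert slightly better.
true_rates = {"honest_label": 0.05, "misleading_label": 0.08}
shown = {v: 0 for v in true_rates}
wins = {v: 0 for v in true_rates}

def observed_rate(v):
    return wins[v] / shown[v] if shown[v] else 0.0

for _ in range(20000):
    if random.random() < 0.1:                  # explore 10% of the time
        v = random.choice(list(true_rates))
    else:                                      # otherwise exploit the current leader
        v = max(true_rates, key=observed_rate)
    shown[v] += 1
    wins[v] += random.random() < true_rates[v]

# The optimizer ends up showing the misleading variant far more often,
# purely because the numbers say so.
print(shown)
```

No one in this loop chose to deceive anyone; the metric did the choosing.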


I agree that it is ominous, and you're describing a benefit of the GDPR.



