
Maybe we need to poison the well.

Create a botnet that downloads CSAM to millions of devices, generating millions of reports against innocent people, to make it obvious that none of this works as intended.



That would work (as long as people can't actually see the CSAM; perhaps use a hash collision instead), but I wouldn't want to be the one who ends up in solitary for life for taking one for the team.


Don't download actual CSAM; that's too questionable a proposition. Instead, figure out how to manipulate innocuous images so that they trigger Apple's perceptual hashing algorithm, then flood the web with those. If the false positives massively outweigh the true positives, people will eventually stop paying attention.
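For context on why such collisions are plausible at all: perceptual hashes, unlike cryptographic hashes, deliberately map visually similar images to nearby (often identical) hash values, and that tolerance is exactly what makes them susceptible to engineered matches. A minimal sketch using the open-source imagehash library (a generic pHash, not Apple's NeuralHash; the filename is hypothetical):

  from PIL import Image, ImageFilter
  import imagehash

  # Load an image and make a slightly modified copy.
  original = Image.open("photo.jpg")
  blurred = original.filter(ImageFilter.GaussianBlur(radius=1))

  # Compute perceptual hashes; visually similar images produce
  # identical or nearly identical hashes.
  h1 = imagehash.phash(original)
  h2 = imagehash.phash(blurred)

  # Subtracting two ImageHash objects gives the Hamming distance:
  # a small number means the hash considers them "the same" image.
  print(h1, h2, h1 - h2)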



