
I think there are a few theories here:

1. CSAM production may be incentivized by consumers (either because they pay for it or because they serve to encourage the producers).

2. CSAM may (as, famously, with Backpage) be an advertisement for real-world meet ups and further abuse.

3. Having pictures of one’s own abuse further disseminated is itself abusive and morally wrong.

Obviously producers should also be prosecuted, but while I do think the American approach is in many ways informed by puritanical beliefs, there are perfectly rational reasons to pursue the spread of such material.



> as, famously, with Backpage

This characterization of Backpage diverges significantly from the realities of the site uncovered by authorities [1]. Choice excerpt:

> "Information provided to us by [FBI Agent Steve] Vienneau and other members of the Innocence Lost Task Force confirm that, unlike virtually every other website that is used for prostitution and sex trafficking, Backpage is remarkably responsive to law enforcement requests and often takes proactive steps to assist in investigations," wrote Catherine Crisham and Aravind Swaminathan, both assistant U.S. attorneys for the Western District of Washington, in the April 3 memo to Jenny Durkan, now mayor of Seattle and then head federal prosecutor for the district. Vienneau told prosecutors that "on many occasions," Backpage staff proactively sent him "advertisements that appear to contain pictures of juveniles" and that the company was "very cooperative at removing these advertisements at law enforcement's request."

> "Even without a subpoena, in exigent circumstances such as a child rescue situation, Backpage will provide the maximum information and assistance permitted under the law," wrote Crisham and Swaminathan.

1. https://reason.com/2019/08/26/secret-memos-show-the-governme...


Backpage had a staff of ~75 employees dedicated to manually reviewing all adult ads, and they were the #1 source of trafficking tips until they were shut down and the owner jailed.

They truly did good work and were an unfortunate casualty of a political pissing contest.


I don't think the theory that consumption drives production holds true when talking about child pornography, because the material can be copied infinitely; it's not a finite resource.


> I don't think the theory that consumption drives production holds true when talking about child pornography, because the material can be copied infinitely; it's not a finite resource.

Legal porn seems like a strong counterexample.


Exposure to more people might lure more people into becoming buyers; that's what CSAM distribution laws are trying to prevent: showing it to more people.


The link between exposure to visual material and illegal buying is IMO extremely sketchy. I mean, we can’t even get enough people to pay for legal, mainstream, consensual and officially produced porn.

Getting random people to pay for content is a pretty hard thing to do, and I’d assume illegal content wouldn’t just be taking credit cards either, so the barrier is way higher. The conversion rate looks to me to be abysmally low, and we’re trying to police the whole internet for that.


Surely (some) people do pay for legal porn. Otherwise who's paying the costs of production?


Does reason (1) mean that we should disincentivize CSAM production by funding a large, free, publicly accessible and searchable database of CSAM images, to make current CSAM producers unable to compete with free? If we believe the RIAA and MPAA, anyway.

(2) and (3) can be handled by enforcing a delay of, say, 15 years between CSAM being produced and it entering the database (and censoring any identifying information, of course), and by giving people the choice to opt out before their pictures are added.


> (2) and (3) can be handled by enforcing a delay of, say, 15 years between CSAM being produced and it entering the database (and censoring any identifying information, of course), and by giving people the choice to opt out before their pictures are added.

The problem isn't CSAM being in the database, but being disseminated in public. The database is designed to reduce public dissemination.


> there are perfectly rational reasons to pursue the spread of such material.

Sure, but people should also keep in mind that pursuit could massively backfire, like in the case of Backpage, where the shutdown of the site actually made it HARDER to find and rescue children being abused, because now the traffickers are using more secure, more underground platforms...


For sure. I have problems with some policy approaches taken here. My point is only that a glib “consumption has nothing to do with production and is thus fine” argument is pretty absurd.



