
So this should be it for trying to regulate stable diffusion type tech, right? If these models and their inference infra can be shrunk down to be runnable on a PS2, it doesn't seem like it's possible to stop this tech without a totalitarian surveillance state (and barely even then!).


The war on general-purpose computing has been ongoing, but it hasn't made enough inroads to stop people from owning general computing devices (yet).


Indeed, the death knell could be tolling not for regulation of AI but for general-purpose computers. In AI we have four horsemen: copyright infringement, illegal pornography, fake news generation, and democratization of capabilities that large companies would rather monetize.


Given the proliferation of illegal downloads (I can get a bad cam rip of the Barbie movie on release weekend just fine, and a VPN would protect me from DMCA takedowns), illegal pornography (just ask a torrent tracker for the Fappening), and fake news (especially on, e.g., Facebook) without any of it needing to be generated by an ML model, and given that it's companies and OSS projects in the space (e.g. stability.ai) doing the democratizing and releasing complete model weights, not just lone individuals working in isolation, are they really four horsemen, or four kids on miniature ponies?


I try to bring up as often as possible in conversation that nearly all the progress we're seeing in terms of usability and performance is precisely because of the open source support for these models.

Especially because these tools are so popular outside of the developer community, I think it's worth really beating into people's minds that without open source, AI would be in a much worse place overall.


This is more than a little melodramatic.

The https://frame.work/ Framework laptop and the https://mntre.com/ MNT Reform exist.


If my country decided to ban the ownership of general-purpose computers by private individuals, it would order the customs service to stop the import of any computer hardware that enables general-purpose computing. I would then not be able to have any computer shipped to me from outside my country, so I could no longer buy from either of the vendors you linked.

Furthermore, it also would mean that I would not be able to bring any personal computers with me when I travel to other countries. I like to travel, and I like to bring my computers when I do.

Next, it would also be dangerous to try to buy computers locally within the borders of the country. The seller might be an informant of the police, or even a LEO doing a sting operation.

And then you have to worry about the computers you already have. If you decide to keep the computers you owned before it was made illegal to own them, you will have problems even if you keep them hidden and only use them at home. Other people know about your computers. Some of those people will definitely tip off the authorities about the fact that you are known to have computers.

Let’s hope it never goes that far :(


People would take the CPUs out of other devices and use them. A consumer grade router has most of the hardware you need to make a general purpose computer.


This is a slippery-slope argument taken to the extreme.

What country outside of North Korea has banned the ownership of general purpose computers, or even considered/tried to?


Banning the import of personal computers would be absolutely disastrous for any possible economy anywhere.


That is virtually impossible because Turing-complete systems are everywhere


Just like how making weed illegal is virtually impossible because anybody can grow marijuana in their backyard.

How many regular people would risk owning Turing-complete devices that can run unauthorized software if getting caught would net them jail time? Lots of countries are already itching to ban VPNs, corporate needs be damned.

Especially now that the iPhone has shown having a device that can only run approved legal software covers a lot of people's everyday needs.


I'm more referring to the fact that stuff like PowerPoint and Minecraft and who knows what are Turing-complete, albeit with awful performance.

Theoretically, you can have a totally owned device managed by Big Brother, yet generate AI smut with a general purpose CPU built in PowerPoint.
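
For a sense of how little machinery Turing completeness actually takes, here is a toy sketch in Python (purely illustrative; it has nothing to do with PowerPoint or Minecraft specifically). Rule 110 is a one-dimensional cellular automaton that has been proven universal, and any substrate that can repeatedly apply a fixed lookup rule to a row of cells is, in principle, enough to host it.

    # Illustrative sketch only: Rule 110, one of the simplest systems known
    # to be Turing-complete. The "machine" is nothing but a lookup table
    # applied to each cell and its two neighbours, over and over.
    RULE = 110  # the update rule, encoded as an 8-bit lookup table

    def step(cells):
        """Apply one Rule 110 update to a row of 0/1 cells (wrapping at the edges)."""
        n = len(cells)
        return [
            (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Start from a single live cell and print a few generations.
    row = [0] * 31 + [1] + [0] * 31
    for _ in range(16):
        print("".join(".#"[c] for c in row))
        row = step(row)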

How do you possibly regulate that?


> How do you possibly regulate that?

The government could send an order to the software developer to patch out that Turing completeness, and ban the software if the order isn't complied with.

I get what you mean: it's never possible to limit things 100%. But if you limit things 98%, so that the general public does not have access, that's more than enough for authoritarian purposes.


I wonder if there's an analogy to be made here to DRM. In theory, yes, DRM shouldn't be possible, but in practice, manufacturers have been able to hobble hardware acceleration behind a trusted computing model. Often they do a poor job and it gets cracked (as with HDCP [1] and UWP [2]).

The question in my head is whether the failures in their approaches are due to a flaw in the implementation (in which case it's practically possible to do what they're trying to do although they haven't figured out a way to do it), or whether it's fundamentally impossible. With DRM and content, there's always the analog hole, and if you have physical control over the device, there's always a way to crack the software and the hardware if need be. My questions are whether:

a) this is a workable analogy (I think it's imperfect because Gen AI and DRM are kinda different beasts)

b) even if it were, is there a real way to limit Gen AI at a hardware level (I think that's also hard, because as long as you can do hardware-accelerated matmul, it's basically opening up the equivalent of the analog hole towards semi-Turing completeness, which is also hardware accelerated; see the toy sketch after the footnotes)

I imagine someone has thought through this more deeply than me and would be curious what they think.

[1] https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content...

[2] https://techaeris.com/2018/02/18/microsoft-uwp-protection-cr...
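
To make (b) concrete, here is a toy sketch in Python/NumPy (the layer shapes and the "denoiser" are made up for illustration; this is not Stable Diffusion's actual architecture). The point is that a generative model's inner loop is essentially nothing but matrix multiplication, so any hardware that exposes fast general matmul can in principle run some model, which is what makes a hardware-level restriction so hard to pin down.

    # Toy sketch only: a made-up two-layer "denoiser" whose sampling loop is
    # nothing but matrix multiplies plus a nonlinearity. Any chip that can do
    # fast general matmul can run something like this.
    import numpy as np

    rng = np.random.default_rng(0)

    # Made-up toy weights (not taken from any real model).
    W1 = rng.standard_normal((256, 512)) * 0.05
    W2 = rng.standard_normal((512, 256)) * 0.05

    def denoise_step(x):
        """One toy 'denoising' update: matmul, ReLU, matmul, residual update."""
        h = np.maximum(x @ W1, 0.0)   # matmul + ReLU
        return x - 0.1 * (h @ W2)     # matmul + residual step

    x = rng.standard_normal((1, 256))  # start from pure noise
    for _ in range(50):                # iterate toward a "sample"
        x = denoise_step(x)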


Yeah, I think it's fair to assume DRM will be a never-ending game of cat and mouse between developers and end users.

Netflix for example can implement any DRM tech they want -- ultimately they're putting a picture on my screen, and it's impossible to stop me from extracting it.


Can you explain the Turing-complete part of that a little bit?


You can’t regulate the ownership of computing devices.

It’s too generic. There are too many of them.


They could ban and phase out systems with insecure bootloaders. That would go a long way. Many vendors have already locked down their boot process.


So this should be it for trying to regulate theft, right? If you can open a window with no tool other than your own body, it doesn't seem like it's possible to stop theft without a totalitarian surveillance state (and barely even then!).

Or the same can be said about media "piracy". Or ransomware.

States have forever regulated things that are not possible to enforce purely technically.


But theft is quite a different thing, is it not? It's a physical act that someone can be caught engaging in - be it by another person, a guard or a security camera. Sure, the "barrier for entry" to commit it is low, but retailers et al. are doing as much as they can to raise it.

Piracy most often isn't treated as a criminal matter, but a civil one - few countries punish piracy severely, but companies are allowed to sue the pirate.

I agree with OP in principle - regulating generative AI use would be way harder than piracy or whatever, especially since all of it can be done purely locally and millions of people already have the software downloaded. And that's not getting into the reasoning behind a ban - piracy and similar "digital crimes" are banned because they directly harm someone, while someone launching Stable Diffusion on their PC doesn't do much of anything.


> few countries punish piracy severely, but companies are allowed to sue the pirate.

UNCLOS, Part VII, Section 1, Article 100 https://www.un.org/depts/los/convention_agreements/texts/unc...

>> Duty to cooperate in the repression of piracy

>> All States shall cooperate to the fullest possible extent in the repression of piracy on the high seas or in any other place outside the jurisdiction of any State.

We could have just added "private computer" to the definition of piracy, and it largely would have applied.

>> Definition of piracy

>> Piracy consists of any of the following acts:

>> (a) any illegal acts of violence or detention, or any act of depredation, committed for private ends by the crew or the passengers of a private ship or a private aircraft, and directed [...] on the high seas, against another ship or aircraft, or against persons or property on board such ship or aircraft;


..What? Digital piracy has absolutely no logical or legal connections to naval piracy, except for sharing the same name.

No sane person could ever implement anything like this. This is like saying that we could "just" add the word "digital" to the laws prohibiting murder to make playing GTA illegal.


An extra-territorial crime

Mostly committed by private citizens in pursuit of profit

That all nations of the world have an interest in suppressing to encourage free trade that economically benefits them

But which some countries at various times have a geopolitical interest in supporting

... you're right, they have no logical or legal connections at all.


You could tie essentially any two crimes together by assigning broad enough descriptors to them that boil down to "this is what countries want to discourage". Not to mention, half of this is just wrong - digital piracy most often isn't extraterritorial (it very much falls under the jurisdiction of where the piracy took place), and most individuals pirate for personal needs, not profit.

The point stands - no jurisdiction that I know of treats digital piracy similarly to naval piracy, and there is no strong argument in favor of doing so.


> digital piracy most often isn't extraterritorial (it very much falls under the jurisdiction of where the piracy took place)

The canonical eBay/PayPal fraud from Eastern Europe example?

> most individuals pirate for personal needs, not profit.

But most piracy is done by individuals in pursuit of profit, not for personal need.


no, this is a lousy analogy because there is a clear harm to others in the case of theft. we've tried regulating other difficult to regulate things where the harm is unclear or indirect (drugs being a good example) to no avail.

your piracy example is better. consider that it's the rise of more convenient options (netflix and spotify) not some effective policy that curtailed the prevalence of piracy.


> consider that it's the rise of more convenient options (netflix and spotify) not some effective policy that curtailed the prevalence of piracy.

The turning point was earlier than Netflix or Spotify – it was the iTunes Store. It was such a dramatic shift, people labelled Steve Jobs as “the man who persuaded the world to pay for content”.

https://www.theguardian.com/media/organgrinder/2011/aug/28/s...


Theft has a clearance rate of only 15%. Sounds like we already stopped trying to regulate most theft, in practice.


“Trying to regulate” and “succeeding in enforcing regulations” aren't the same thing.

In fact, a low clearance rate can be evidence of trying to regulate far beyond one's capacity to consistently enforce; if you weren't trying to regulate very hard, it would be much easier to have a high clearance rate for violations of what regulations you do have.


Yes, it is impossible to stop theft.


> If these models and their inference infra can be shrunk down to be runnable on a PS2, it doesn't seem like it's possible to stop this tech without a totalitarian surveillance state (and barely even then!).

The original requirement for these is 16GB of RAM, which can be had for less than $20. They run much faster on a GPU, which can be had for less than $200. Millions of ordinary people already have both of these things.


The PS2 only had 32 MB of RAM. Even the PS3 only had 256 MB.

I know it was a bit of a funny hyperbolic example, but you'd need to shrink this down way further to run it on a PS2.


I thought most of the regulatory efforts were focused on training runs getting bigger and bigger rather than generation with existing models. Is there regulation you’re aware of around use of models?


Copyright infringement is quite cheap as well. Ease and illegality are tangential. You'd still stop commercial acts even if it's impossible to fully stop something.

That said, I don't think blanket regulation is all that likely anyhow.


What sort of regulations on the tech are you talking about? Whether you can regulate it or not really depends on what you are trying to do.


Not a surveillance state, but a halt on producing new, high-performance chips should be enough.



