
Whenever I hear about weird shit from the realm of quantum physics (this theory, double-slit, etc.) I can't help but think:

Why would I hardcode values for imperceptible objects? That would take an enormous amount of RAM and CPU time to constantly update values on the off chance they're needed.

Much more efficient to optimize for what the _player_ can see from their perspective. Oh, and I should probably code in some error handling for the fluke event that one of these particles is detected: I'll just calculate its position retroactively. The user will never be able to tell, and we can host way more players due to the reduced memory.
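
In code, that lazy scheme might look something like this (a toy Python sketch; the world seed and the `particle_state` function are, obviously, made up for illustration):

    import hashlib

    WORLD_SEED = b"universe-42"  # hypothetical global seed

    def particle_state(particle_id: int, t: float) -> float:
        # Derive the particle's position deterministically from the seed,
        # its id, and the query time: nothing is stored between frames,
        # and any observation at time t can be answered retroactively.
        digest = hashlib.sha256(WORLD_SEED + f"{particle_id}:{t}".encode()).digest()
        return int.from_bytes(digest[:8], "big") / 2**64  # position in [0, 1)

    # nothing is computed until a detector asks:
    print(particle_state(particle_id=7, t=13.37))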

It would seem to me that god is a junior dev and no one reviewed his pull requests.



That model of efficiency (RAM and CPU time) is based on classical computation.

When your underlying model of computation is full quantum computation, it's simpler to just run everything at once. It takes no energy if you run everything without picking out a scenario (though in a kind of "tree falling in a forest" way), and more energy if you select out specific scenarios to look at what happened (I/O is expensive). Counter-intuitively, the computation part of quantum systems is free in ways that classical computers are not: it's reversible and doesn't consume any energy.
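
A toy illustration of that reversibility (a numpy sketch; the choice of gate is arbitrary):

    import numpy as np

    # A Hadamard gate: unitary, hence perfectly reversible.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    state = np.array([1, 0], dtype=complex)  # |0>
    evolved = H @ state                      # superposition of |0> and |1>
    undone = H.conj().T @ evolved            # the inverse gate undoes it exactly

    print(np.allclose(undone, state))        # True: no information was lost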

That may seem like it's avoiding the point; after all, what does it take to run the "underlying model of computation"?

But what I'm trying to say is that "quantum all the way down" (see also turtles) is as valid a model as "mechanical computation all the way down", which your picture relies on. Neither of them is more fundamental.

It may seem like quantum-all-the-way-down is a bit artificial, because we can in principle run quantum simulations on classical computers, which seem simpler. But it turns out we can't: the cost of a classical simulation grows exponentially with the size of the quantum system, so there is a fundamental intractability barrier beyond even modest sizes. We can only simulate interesting quantum systems using other quantum systems. It really is quantum-all-the-way-down.
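
The barrier shows up in back-of-the-envelope numbers: classically, you must store 2^n complex amplitudes to hold the state of n qubits (a Python sketch, assuming 16 bytes per amplitude):

    # Holding the full state vector of n qubits classically takes
    # 2**n complex amplitudes at 16 bytes each.
    for n in (10, 30, 50, 100):
        print(f"{n:3d} qubits -> {2**n * 16:.3e} bytes")

    # 10 qubits -> ~16 KB, 30 -> ~17 GB, 50 -> ~18 PB,
    # and 100 qubits -> ~2e31 bytes, far beyond any conceivable hardware.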

If god came up with the quantum-all-the-way-down version, I'd say that's pretty clever, because it's way more efficient than anything you would implement with your old-school classical RAM and classical CPU.


One problem with this view is that “unobserved” particles still take up a large (possibly larger?) amount of computation. Rather than sitting at one position, an unobserved particle acts like a wave, existing a little bit in every allowable position. These probability waves also interact with each other (which is what the double-slit experiment demonstrates).
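
A toy one-dimensional sketch of that wave bookkeeping (Python/numpy; the slit geometry and wavenumber are made up, not a real diffraction calculation):

    import numpy as np

    x = np.linspace(-5, 5, 1000)           # positions on the screen
    k = 10.0                               # wavenumber, arbitrary units
    d1, d2 = np.abs(x - 1), np.abs(x + 1)  # distances from the two slits

    psi1 = np.exp(1j * k * d1)             # amplitude via slit 1
    psi2 = np.exp(1j * k * d2)             # amplitude via slit 2

    p_wave = np.abs(psi1 + psi2) ** 2      # amplitudes add, THEN square
    p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # no interference term

    # p_wave oscillates between 0 and 4 (fringes); p_classical is flat at 2.
    print(p_wave.max(), p_classical.max())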

That being said, if it were demonstrated that uncollapsed wave functions are somehow more efficient to calculate, that would definitely give credence to the simulation hypothesis.


> That being said, if it were demonstrated that uncollapsed wave functions are somehow more efficient to calculate, that would definitely give credence to the simulation hypothesis.

Well said. I can certainly imagine a few functions that may prove more efficient at generating waves than storing a fixed, known position + velocity for every subatomic particle in the universe.

Here's the part that really takes us off the rails: if we assume for a minute that we are in a simulation, that the parent world has godlike resources compared to our own, and that they have similar hardware concepts (RAM, CPUs, GPUs, maybe even ASICs), then which functions are more efficient would depend on which hardware they have less of.

If RAM is plentiful, why not store fixed, known values for every particle? Keep it all in memory and let the GPU detect collisions.

If GPUs are plentiful (my guess) and we're bound by RAM limitations, it's best to store only the positions of things that are visible and clear the rest out of RAM for more important calculations.
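
Those two strategies are the classic space-time tradeoff (a toy Python sketch; the trajectory function is hypothetical):

    import functools, math

    def position(particle_id: int, t: float) -> float:
        # hypothetical closed-form trajectory: cheap compute, zero storage
        return math.sin(particle_id * t)

    # RAM-rich strategy: memoize every result ever computed.
    position_cached = functools.lru_cache(maxsize=None)(position)

    # RAM-poor strategy: just call position() again whenever it's needed.
    print(position_cached(7, 13.37), position(7, 13.37))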

Imagine how inefficient it would be to render and simulate black holes colliding on the other side of the universe if the players will never even notice. Just queue up the function and run it during off-peak hours. Save the extra server resources for other simulations running in parallel.

If I'm right (a big if), we're likely an anomaly or early prototype among the simulations, one in which the dev team never imagined a race would evolve and progress enough to measure the bounds of its container. If I'm right again on this last point, we're likely being monitored closely while they decide whether subsequent patches should spend the resources to simulate everything completely, so players never realize they're in a simulation, or whether to accept that realization as a remote possibility and move on.


If we're in a simulation, I somewhat doubt whoever made it is really worried about "resources" and "efficiency". Whatever real universe exists may not even follow our laws of physics. Maybe energy is actually unlimited and free in the "real world". Maybe matter can be made from nothing effortlessly.


Why would someone build a simulation of a world utterly unlike their own?

When we build simulations for ourselves, they're always attempting to approximate reality as closely as possible. The goal is to learn useful things about our own world or society and to try out many forking paths, in a simplified representation of reality.

If we're in a simulation, it stands to reason that whoever is running it is somewhat human-like and exists in a world with basically the same physical laws ... or at least, similar enough that sociological and technological development would be the same. For instance, the speed-of-light barrier is pretty damn inconvenient for us but would be great at blocking an arbitrarily large population and state-space explosion. And why are these magic physical constants so arbitrary anyway?

If we are in a simulation, and our simulators did want to limit their resource consumption, adding in a few physical laws that are never really a problem in daily life and which block us from colonising the galaxy would be a nice way to do it.


Conversely, I would wonder why anyone would simulate something so similar to their own reality. We've already seen with humans that history repeats itself endlessly. Humans haven't really fundamentally changed in thousands of years. And humans do studies of the sociology of other species all the time. I think it'd be much more entertaining to simulate a world whose laws are nearly the opposite of our own. I would want to see what a species is capable of when the restraints are lifted. Limiting them to a single planet seems boring.

As an aside, I think one of my personal arguments that we're in a simulation is that we live in such an interesting time. We're past the era when 95% of people were farmers. Technology is advancing faster than ever before. It's such a critical time in human history, and the 20th century is personally where I'd choose to start a human simulation. It's convenient that this is our shared spot in time.


disagree on the junior dev part and on the god part.

what you’re describing is called “simulation theory” and it has been proposed and discussed at length

i think lazy evaluation makes sense in that context. i also think that having a few basic rules and then applying them consistently across your simulation space makes sense. if your basic unit of space is way smaller than the sims in it can perceive and measure, they're gonna start making stuff up
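
something like this toy grid, in python (the scales are made up):

    PLANCK = 1e-12  # grid spacing of the simulation (made-up scale)
    SENSOR = 1e-6   # finest resolution the sims can measure (also made up)

    def stored(x: float) -> float:
        # the simulation only ever keeps positions snapped to its grid
        return round(x / PLANCK) * PLANCK

    x = 0.123456789
    print(abs(stored(x) - x) < SENSOR)  # True: the grid is invisible to the sims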


> disagree on the junior dev part and on the god part.

Good lord, some people here are definitely on the spectrum. FYI that was an attempt at humor. In case you want to mimic human social behavior in the future: you don't _disagree_ with a joke, you either find it funny or you don't.

> what you’re describing is called “simulation theory” and it has been proposed and discussed at length

I'm aware of the theory, just like everyone else who's seen The Matrix or read Elon's Twitter feed. For the record, I was merely illustrating a point: both our universe and the code we write share certain optimizations that seem too coincidental to be random.


> Good lord, some people here are definitely on the spectrum. FYI that was an attempt at humor. In case you want to mimic human social behavior in the future: you don't _disagree_ with a joke, you either find it funny or you don't.

Maybe he was disagreeing with your entire assessment, and saying that your joke was neither funny nor unfunny, but incorrect.


> god is a junior dev and no one reviewed his pull requests.

> some people here are definitely on the spectrum. FYI that was an attempt at humor. In case you want to mimic human social behavior in the future ..

_Your_ failed attempt at humour doesn't justify labeling and gaslighting the person who didn't "get it", if anyone can ever consider that to be a joke.

> For the record, I was merely illustrating a point

Illustrating a point or making a joke? In either case, it doesn't justify your "human social behavior" of attacking the personality traits of the person disagreeing with you. My suggestion to you: in the future, accept that someone can have a different opinion, accept that you could be wrong, and try to counter that without resorting to labeling and gaslighting.


Being patronizing immediately after suggesting people here are on the autism spectrum is kind of insensitive to people who are autistic.

"God is a junior dev", is a mildly funny trope that has existed as long as I've been on the internet. I believe there's an xkcd about it where God says "we hacked most of it together with Perl".

Anyway, it's hard to tell what is and isn't a joke on the internet due to lack of vocal inflection. Maybe consider going easier on people when this happens in the future.


yes. people here are definitely on the spectrum. the spectrum of awesomeness, that is.

i did find the idea of god being a grumpy senior dev/architect type funnier than your overused, tired meme


He did it all in 6 days though. That's less than a full sprint.


If you had infinite memory and energy, such optimizations wouldn't be needed to begin with.


It could also be that our universe takes up very few resources on "god's" computer ;)


I've always wondered about the "computability of the universe" and its compressibility.

If the universe is actually compressible, then object permanence may actually be a trick, like it is in video games. Objects are generated on demand and deleted to save space and processing.

If it's not compressible, which is where I lean, then the full universe must be fully computed every time. No savings can be made, and any delay in computing an object might propagate and cause recalculation cascades for other objects. I lean this way because, from my layman's understanding, quantum mechanics acts like a random seed, which greatly increases entropy.

If it's not compressible, then the smallest computer able to simulate the universe is at least as big as the universe itself.
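
A quick way to see the difference (a Python sketch, with zlib standing in for whatever compression the simulators might use):

    import os, zlib

    structured = bytes(range(256)) * 4096     # a highly regular "universe", 1 MiB
    random_ish = os.urandom(len(structured))  # high entropy, like a random seed

    print(len(zlib.compress(structured)))     # a few KB: very compressible
    print(len(zlib.compress(random_ish)))     # ~1 MiB: no savings possible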



