If we all exist in a simulation, what will happen once we start running out of RAM?

https://lemm.ee/post/24239819


Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

We go to sleep and it clears
Great, thanks for the dose of existential dread.
The OOM killer goes on the prowl.
An automatic purge process will start to prevent this. It happened several times in the past, most recently between 2019 and 2022, when it removed circa 7 million processes. Regular purges like this make sure the resources aren't maxed out before the admins can add more capacity.
We can see that already when something approaches the speed of light: time slows down for it.
This simplification horribly misunderstands what time-dilation is, and I love it.
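For the record, the actual relationship is the Lorentz factor, γ = 1/√(1 − v²/c²): a clock moving at a fraction v of the speed of light ticks γ times slower. A quick Python sketch (purely illustrative, the function name is made up):

```python
import math

def lorentz_factor(v_fraction_of_c: float) -> float:
    """gamma = 1 / sqrt(1 - v^2/c^2): how much slower a moving clock ticks."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

for v in (0.5, 0.9, 0.99, 0.999):
    print(f"at {v}c, time runs {lorentz_factor(v):.2f}x slower")
```

Note how the factor blows up as v approaches c, which is why "the lag spikes near the speed limit" is such an appealing (if horribly wrong) mental model.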
My VM is running out of RAM.
I have a running theory that that’s also what’s going on with quantum physics, because I understand it so poorly that it just seems like nonsense to me. So in my head, I see it as us getting into some sort of source code we’re not supposed to see, and on the other side some programmers are going “fuck I don’t know, just make it be both things at once!” and making it up on the fly.
Limitations of hardware resources show up as “Natural Limits”, like the speed of light, in the simulation. The amount of RAM consumed translates to the Hubble Bubble, or the greatest distance light could have traveled since the beginning of our universe, and more so to the amount of matter and energy contained within it, which is a constant. Energy and matter cannot be created or destroyed, only change forms, so there has been a set amount from the beginning.
You won’t notice anything. As things are deleted to save on memory all references to them are removed as well.
The universe starts swapping
You should check out the short story “Sleepover” by Alastair Reynolds.
Who knows… maybe we’ll experience pointless wars and massive inequality… selfish douchebags who only care about bolstering their ego might gain power… heck, maybe even the climate will slowly start changing for the worse.

So… the ultra-rich are just poorly programmed processes with memory leaks. And there’s no runaway-process killer to protect the system.

God is just a hack scripter; it makes sense.

I don’t necessarily believe this, but I’ll play along.

To make it appear natural so we don’t notice, death is the first thing that comes to mind. So pandemics, disasters and wars that kill off beings on a large scale to free up memory. A globe with limited surface area seems ideal to stick us on to begin with, with anything outside of that sphere virtually impossible to access. The size of Earth could have been chosen because it fits comfortably within the RAM limits. If Earth is pushing the RAM limits, each planet could be hosted on its own server. So if we someday colonized Mars or the moon, the trip between would be like a server transfer making the RAM issues for interplanetary colonization inconsequential.

If you want to really explore the fringes of this concept, maybe those in the simulation would see glitches that shouldn’t happen if it starts running out of RAM. UFOs, shadows, or synchronicities could become commonplace. People could randomly go catatonic or experience amnesia if they’re personally impacted. If it got out of control across the entire simulation, perhaps a hard reset would become necessary. It may even be a planned cycle of hard resets based on the anticipated maximum lifespan of the simulation before things start to get fucky due to memory errors. So power on = big bang, and hard reset something like big crunch or heat death of the universe.

Wars would use too much RAM, so a pandemic would make more sense.
There would either be some kind of mass extinction event or something that would be considered “supernatural” would occur to maintain the status quo
I did not expect the responses to this question to be as interesting to read as they are 😃
The server admins run a kill -9 on a few processes. Inside the sim, this looks a lot like the Chicxulub impact.

We would probably see more caching of the parts of the universe that we don’t typically observe. Given that our current observations can’t detect this in real time, we wouldn’t immediately notice.

The interesting bit would be to figure out what parts get cached, since we may not be the only sentient life.

Maybe the system would be configured with some odd laws that constantly shrink the size of the observable universe?
Couldn’t they just suspend the simulation until they got more resources? We wouldn’t notice a thing.
I believe you are thinking in terms of a Turing-machine-like computer. I don’t think it’s possible today to “suspend” the bits in a quantum computer. I don’t think it’s possible to know if it could be paused (or even “added to” without losing its initial state).
Render distance would be reduced, requiring us to come up with plausible theories to account for the fact that there is a limit to the size of the so-called ‘observable universe’.
I’m more concerned with what happens when the hardware invariably fails…
The universe ends when little Timmy gets sent to bed for the night.
adjacent answer, but resource requirements are lower than might be expected since the simulation only needs to capture elements observed by a conscious entity. the vast majority of the known universe has not been observed in any detail that requires significant memory or processing resources. this same technique is employed by computer game designers so that only scenery and elements within view of a player are fully rendered.
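The observation-dependent rendering described above is essentially lazy, on-demand generation with a cache, the same trick games use for chunk loading. A rough Python sketch (the class, region coordinates, and contents are invented purely for illustration):

```python
import random

class LazyUniverse:
    """Hypothetical sketch: regions are generated only when first observed,
    then cached so repeat observations stay consistent."""

    def __init__(self, seed: int):
        self.seed = seed
        self.rendered = {}  # cache of regions someone has actually looked at

    def observe(self, region: tuple) -> str:
        if region not in self.rendered:
            # deterministic per-region RNG, so the "universe" is reproducible
            rng = random.Random(hash((self.seed, region)))
            self.rendered[region] = rng.choice(["void", "stars", "nebula"])
        return self.rendered[region]

universe = LazyUniverse(seed=42)
universe.observe((0, 0))       # generated on first look
universe.observe((0, 0))       # served from cache, identical result
print(len(universe.rendered))  # only the observed region was ever materialized
```

The memory cost then scales with what has been observed, not with the size of the universe, which is the whole appeal of the argument.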
Human music. Huh. I like it!
you know, they’re made out of meat?
Why would we run out of RAM? Is there new matter being created? It’s not like we’re storing anything. We will keep using the same resources.
The nature of quantum interactions being probabilistic could be some resource saving mechanism in a higher order simulation.

New human instances are being created, and as our society’s general education keeps going up, they demand more processing power.

As our tech goes up, this has to be simulated as well. Not only things like telescopes and the LHC, but your computer that’s running a game world doesn’t actually exist, and it’s the supercomputer that’s running it.

Obviously, this is just a drop in the bucket for an entity that can make a fully simulated universe, but the situation quickly becomes untenable if we start creating hyper-advanced simulations as well, and we are maybe only a few decades away.

Human instances still run on the same underlying physics. No further RAM is needed.

As our tech goes up, this has to be simulated as well

Everything is made up of atoms/photons/etc. If every particle is tracked for all interactions, it doesn’t matter how those particles are arranged, it’s always the same memory.

Atoms and photons wouldn’t actually exist, they would be generated whenever we measure things at that level.

Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the big bang is fun but doesn’t make for good conversation, since it would be indistinguishable from reality.

I was thinking more of a video game like simulation, where the sim doesn’t render things it doesn’t need to.

where the sim doesn’t render things it doesn’t need to.

That can’t work unless it’s a simulation made personally for you.

I don’t follow. If there are others it would render for them just as much as me. I’m saying it wouldn’t need to render at an atomic level except for the few that are actively measuring at that level.
Everything interacting is “measuring” at that level. If the quantum levels weren’t being calculated correctly all the time for you, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.

If it was a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.

None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were being used, and then they would be properly simulated.

We wouldn’t know the difference.

If it was a simulation, there would be no need to go that far

But you already said you have to go that far whenever someone is doing something where they could notice microscopic effects.

So it’s not a simulation so much as a mind-reading AI that continuously reads every sentient mind in the entire universe, so as to know whether they are doing a microscopic observation that needs the fine-grained result or whether an approximation can be returned.

There would be no need to go that far at all times is what I’m saying. It’s the equivalent of a game rendering stuff far away only when you use a scope. Why render everything at all times if it isn’t being used and does not affect the experience? It would increase the overhead by an insane amount.

This is also just a thought exercise.

Why render everything at all times if it isn’t being used and does not affect the experience?

But how does the simulation software know when it needs to calculate that detail? If you are the only person in the simulation, it’s obvious because everything is rendered from your perspective. But if there’s more than one person in the universe, an AI program has to look at the state of mind of everyone in the universe to make sure they aren’t doing something where they could perceive the difference.

Am I microwaving a glass of water to make tea, or am I curious about that YouTube video where I saw how you can use a microwave to measure the speed of light? Did I just get distracted and not follow through with the measurement? Only something constantly monitoring my thoughts can know. And it has to be doing it for everyone everywhere in the entire universe.

The way I see it, it would be coupled with the tool and not the intention someone has with it. So every microwave would render it properly at all times, as would most electronics just by their very nature, regardless of what the person plans to do with it.

Actually, I think they can probably just approximate the microwave stuff and keep only precise electrical tools, like oscilloscopes, fully rendered.

They only need to render properly for things that give an exact measurement; the microwave trick has a 3% tolerance, which is huge in the scheme of things.

It seems like a lot but it’s less than simulating every single atom imo.
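For reference, the microwave trick mentioned above: hotspots in melted cheese or chocolate sit half a wavelength apart, and c = λf. A back-of-envelope check with typical numbers (the spacing here is an assumed reading, not a real measurement):

```python
# Hotspots stand half a wavelength apart, so wavelength = 2 * spacing,
# and the speed of light estimate is wavelength * frequency.
hotspot_spacing_m = 0.06     # ~6 cm between melted spots (assumed reading)
frequency_hz = 2.45e9        # standard magnetron frequency, printed on the door
c_estimate = 2 * hotspot_spacing_m * frequency_hz
c_true = 299_792_458.0
error = abs(c_estimate - c_true) / c_true
print(f"c estimate: {c_estimate:.2e} m/s ({error:.1%} off)")
```

With these numbers the estimate lands within about 2% of the true value, consistent with the roughly 3% tolerance mentioned above.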

It’s more than electronics. Every piece of diffraction grating could be used to make a wave-interference measurement. Every fiber-optic line in the world, because if you bend it too much the wave doesn’t stay bound inside.

But that still doesn’t get rid of the AI part, because you need something watching to know when an electronic device is created by anyone anywhere in the universe, and to understand that it’s the kind of device that could be used to reveal detailed measurements.

Without knowing the nature of the simulation, we don't even know if there is an analogue for RAM or limited memory. Maybe you could walk in and out a door repeatedly and then glitch into a locked room. Maybe the whole thing would crash - our programs tend to do this when memory runs out. Maybe everything would just get paused or "adjusted down" to fit the restriction. The crash, pause or throttle wouldn't be apparent to us "on the inside" at all if it were happening.

One word

Alzheimer’s

Why do you think our admins wouldn’t use autoscale, when they’ve obviously built it into the simulation?
Starts with erectile dysfunction and ends with the little blue pill…
That’s why history repeats itself. It’s doing that more frequently these days because there’s more people remembering more things.
They take some users offline to free up some memory for everyone else
Have you not played Dwarf Fortress? The frame rate goes way down, a situation imperceptible to the dorfs. Then eventually the operator of the machine loses interest, or a pandemic makes the pop count drop, or a combo of those.
Data in memory will be offloaded to swap space. I doubt we’d notice any fluctuations since we’re part of the simulation, but externally it could slow to a crawl and basically be useless. They might shut it down, hopefully just to refactor. But again we probably wouldn’t notice any downtime, even if it’s permanent.
That would be the most pleasant way to go :)

Not sure you've experienced the end of many SimCity games if you think this is the case. 😂

If anything, the earth lately kinda feels like someone's gotten bored with the game.

12 meteors, 8 volcanoes and 10 tornadoes incoming you say?
A landscape full of Arcos and waves of boom and bust?
iirc this is a plot point in the book “Fall; or, Dodge in Hell” by Neal Stephenson (sequel to Reamde). At some point the virtual world slows to a crawl so much that people outside of it cannot really track what is going on but it’s transparent to those inside the world. I might be misremembering exactly how it was implemented.
once we run out of RAM, the universe starts running on Skyrim physics