SimulationTheory


A place for serious discussion of simulation theory.

Rules:

  1. No hate speech.
  2. Treat others with respect, no matter your agreement or disagreement.
  3. No low quality participation.
  4. Posts must clearly tie in with simulation theory or a submission statement must be added to explain the relevance to the topic.

What if the universe is simulated and relativistic time dilation is caused by a lower FPS/TPS in regions with high amounts of mass/energy (perhaps to save on computation)?

You know how time passes more slowly near a black hole? What if that's because the universe is updating/processing stuff more slowly in such regions compared to emptier areas?

Let's imagine the universe has a framerate. What if that framerate drops significantly near the event horizon? For example, for each update/tick/frame there, many thousands or millions of frames happen in the rest of the universe. If you were near a black hole, your own framerate would still feel normal, but the rest of the universe would seem to be running at a much faster framerate, and stuff out there would be happening super fast from your perspective.

Maybe the framerate drops so much near the singularity/event horizon that, from the perspective of the rest of the universe, stuff that falls in essentially stands still: the framerate there asymptotically approaches zero and the whole thing grinds to a halt. In other words, the stuff never really reaches the singularity because it's barely getting updated/processed anymore (it still is, but so rarely that it would take an essentially infinite amount of time to get there).
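
For what it's worth, general relativity already has a formula for exactly this picture: for a clock hovering at radius r outside a non-rotating black hole, one local tick corresponds to 1/sqrt(1 - r_s/r) ticks of a far-away clock, where r_s is the Schwarzschild radius. Here's a minimal Python sketch (the 10-solar-mass black hole and the radii are just example values) showing how that ratio blows up as you approach the horizon, which is the "framerate approaches zero" picture:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def outside_frames_per_local_tick(r, mass):
    """How many far-away 'frames' pass per local tick for a stationary
    observer at radius r outside a non-rotating (Schwarzschild) black hole."""
    r_s = 2 * G * mass / C**2          # Schwarzschild radius
    if r <= r_s:
        raise ValueError("observer must be outside the event horizon")
    return 1.0 / math.sqrt(1.0 - r_s / r)

mass = 10 * M_SUN  # example: a 10-solar-mass black hole
r_s = 2 * G * mass / C**2
for factor in (10.0, 1.1, 1.001, 1.000001):
    print(f"r = {factor} * r_s -> "
          f"{outside_frames_per_local_tick(factor * r_s, mass):,.1f} outside frames per local tick")
```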

This is obviously just my fun lil speculation that's probably wrong, but what do you guys think? Does it make sense, and if it doesn't, why not?

[–] kromem 2 points 10 months ago* (last edited 10 months ago)

Yeah, it's a great thought: time dilation at higher densities of mass (and thus information) is like the frame rate drops you get when a lot of objects are loaded in.

The speed of light as the maximum speed at which local information propagates (gravitational waves travel at the same speed) is also similar to how games cap travel speeds because of memory throughput constraints (one of the big design changes enabled by the move from HDD to SSD in current-gen consoles was the potential for faster movement speeds).
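
(Purely as a toy illustration of that constraint, with made-up numbers rather than anything from a real engine: if the world streams in as you move, the speed cap is roughly the streaming bandwidth divided by how much data each meter of world costs.)

```python
def max_traversal_speed(stream_bandwidth_mb_s, world_data_mb_per_m):
    """Rough cap on player speed (m/s) so asset streaming can keep up.
    Toy model only; real engines also lean on caching, LOD, prefetching, etc."""
    return stream_bandwidth_mb_s / world_data_mb_per_m

# Hypothetical budgets: a slow spinning disk vs. a fast NVMe SSD.
print(max_traversal_speed(50, 2))     # 25.0 m/s
print(max_traversal_speed(2500, 2))   # 1250.0 m/s
```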

Time dilation as you move more quickly goes hand in hand with this model too. As you move through the environment closer and closer to the speed limit for local information, your "frame rate" drops.
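
The rule here is the Lorentz factor from special relativity: a clock moving at speed v ticks at sqrt(1 - v^2/c^2) of the at-rest rate. A quick Python sketch (the speeds are just example values):

```python
import math

C = 2.998e8  # speed of light, m/s

def local_tick_rate(v):
    """Fraction of the at-rest tick rate experienced by a clock moving at
    speed v, per special relativity: 1/gamma = sqrt(1 - v^2/c^2)."""
    if abs(v) >= C:
        raise ValueError("speed must be below c")
    return math.sqrt(1.0 - (v / C) ** 2)

for frac in (0.1, 0.9, 0.99, 0.9999):
    print(f"v = {frac} c -> ticking at {local_tick_rate(frac * C):.2%} of the at-rest rate")
```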

So moving too quickly or going near too dense of an area leads to managed frame rate drops - the rate remains steady (no spikes or abrupt drops) but decreases according to fixed rules.

It would be fun to see something like a Digital Foundry look at relativity, done in partnership with a cosmologist, trying to estimate the maximum memory throughput and/or processing power of the universe from how the frame rate falls off as movement speed or density increases.