Applied to our world[1], it implies that the entities simulating us have access to a form of computation at least as powerful as a quantum computer. Quantum mechanics is thought to be extremely hard to simulate on a classical computer, which means that our simulators have access either to powerful quantum computers or to an even more powerful form of computation.
[1] specifically, quantum mechanics and BQP != BPP, which could be overturned by a better theory of physics or a breakthrough in complexity theory, respectively.
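To make "extremely hard to simulate" concrete, here is a toy sketch (my own illustration, not from this thread; the function name `state_vector_bytes` is made up): brute-force state-vector simulation of n qubits stores 2^n complex amplitudes, so the memory cost alone grows exponentially.

```python
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store a full n-qubit state vector, assuming one
    complex128 amplitude (16 bytes) per computational basis state."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))
# 10 qubits fit in ~16 KB, 30 need ~16 GB, and 50 already need ~18 PB.
```

This exponential blow-up is only the naive approach, of course; the footnote's caveat stands — a proof that BQP = BPP would mean cleverer classical algorithms exist.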
I don't think this answer is in any way related to my question.
This is my fault, because I didn't explain exactly what I mean by "simulation", and my meaning is different from the most popular one. Details are in the EDIT in the main post.
Let's say you decided the best way to achieve some goal is to create a simulation. You'll almost certainly have to balance:

- the accuracy of the simulation, and
- the computational resources required to run it.
Note that this is true regardless of the purpose and complexity of the simulation - be it weather modelling, or a civ-like computer game.
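As a minimal sketch of that trade-off (my own toy example, not from the post): integrating dx/dt = x from t=0 to t=1 with forward Euler, where each step is one unit of computational cost and smaller steps buy more accuracy.

```python
import math

def euler(steps: int) -> float:
    """Forward-Euler integration of dx/dt = x on [0, 1], x(0) = 1."""
    x, dt = 1.0, 1.0 / steps
    for _ in range(steps):
        x += x * dt  # one "unit" of computational cost per step
    return x

exact = math.e  # the true value of x(1)
for steps in (10, 100, 1000):
    approx = euler(steps)
    print(steps, approx, abs(approx - exact))
```

Every simulation, from weather models to games, picks some point on this cost/error curve; a simulator of a universe would have to as well.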
Consider the following assumption:
We live in a simulation created by entities who faced the trade-off described above and decided that perfect accuracy was not necessary.
This gives us a sort-of answer to the question "what is behind the laws of physics?": they are approximations of some other laws that would be harder to compute. Therefore, maybe we could try to devise:

- candidate underlying laws that would be expensive to compute exactly, and
- approximation techniques a simulator might plausibly apply to them,
and observe how they together lead to the laws of physics we see?
I know and understand how speculative this is. Nevertheless, I find this really interesting. So, the question: is there any literature approaching the simulation hypothesis from this direction? Anything I could read?
EDIT: The meaning of "simulation" in this context differs - I think - from the default one. In this POV, the world we live in is the "hardware", built with the purpose of simulating something even more complex. In other words, the question is not, e.g., "how do they simulate our quantum mechanics?" but rather "why did they implement quantum mechanics the way they did?"