It’s been a while so I’m a little hazy on the details, but in one of the Culture books by Iain M. Banks there’s a part where a bunch of Minds (for those unfamiliar: kind of beyond-godlike artificial intelligences that run a utopian civilization, the eponymous Culture) are talking about how they can create simulations within simulations so perfect that it would be impossible to tell if you were in one. So what if their entire reality was just one link in a long chain of nested, perfect simulations? But in the end they come to the conclusion that there’s no way to tell and nothing they could do about it anyway, so they might as well just get on with it lol.