For more on Boltzmann brains, see here.
Yes, and also no.
That is, there are Boltzmann brains that represent my current mental state, and there are also 'normal' universes containing 'normal' brains doing the same thing, and there are probably a bunch of other things too.
All of them are me.
No, because that's a meaningless claim about external reality. The only meaningful claims in this context are predictions.
"Do you expect to see chaos, or a well-formed world like you recall seeing in the past, and why?"
The latter. Ultimately that gets grounded in Occam's razor and Solomonoff induction, which make the latter hypothesis simpler.
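The simplicity-prior argument can be sketched numerically. Under Solomonoff induction, a hypothesis with a shortest-program description of K bits gets prior weight proportional to 2^−K. The description lengths below are made up purely for illustration (real Kolmogorov complexity is uncomputable); the point is only that a modest gap in bits produces an astronomical gap in prior weight.

```python
def prior_weight(description_length_bits):
    """Unnormalized Solomonoff-style prior weight 2^-K for a
    hypothesis whose shortest description is K bits long."""
    return 2.0 ** -description_length_bits

# Hypothetical description lengths, for illustration only:
# compact physical laws plus a low-entropy initial condition vs.
# a fluctuation that must encode an entire brain state directly.
lawful_universe_bits = 100
boltzmann_brain_bits = 150

ratio = prior_weight(lawful_universe_bits) / prior_weight(boltzmann_brain_bits)
print(f"lawful universe favored by a factor of {ratio:.3e}")  # 2**50
```

A 50-bit difference already favors the lawful-universe hypothesis by a factor of about 10^15, which is why the simplicity prior dominates the prediction.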
I basically still endorse this, but have shifted even more in the direction of endorsing the simplicity prior: https://www.lesswrong.com/posts/yzrXFWTAwEWaA7yv5/boltzmann-brains-and-within-model-vs-between-models
This is a question similar to "am I a butterfly dreaming that I am a man?". Both statements are incompatible with any other empirical or logical belief, or with making any predictions about future experiences. Therefore, the questions and belief-propositions are in some sense meaningless. (I'm curious whether this is a theorem in some formalized belief structure.)
For example, there's an argument about B-brains that goes: simple fluctuations are vastly more likely than complex ones; therefore almost all B-brains that fluctuate into existence will exist for only a brief moment and will then chaotically dissolve in a kind of time-reverse of their fluctuating into existence.
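The "simple fluctuations are vastly more likely" step follows from standard statistical mechanics: the probability of a spontaneous fluctuation that lowers the entropy of a system in equilibrium by $\Delta S$ is suppressed as

$$P \sim e^{-\Delta S / k_B}.$$

A fluctuation that assembles only a bare, short-lived brain requires a far smaller $\Delta S$ than one that assembles a brain plus a persisting, ordered environment, so the bare-brain fluctuations exponentially outnumber the stable ones.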
Should a B-brain expect a chaotic dissolution in its near future? No, because the very concepts of physics and thermodynamics that cause it to make such predictions are themselves the results of random fluctuations. It remembers reading arguments and seeing evidence for Boltzmann's entropy theorem, but those memories are false, the result of random fluctuations.
So a B-brain shouldn't expect anything at all (conditioning on its own subjective probability of being a B-brain). That means a belief in being a B-brain isn't something that can be tied to other beliefs and questioned.
Because "thinking" is an ability that implies being able to predict future states of the world based on previous states of the world. This is only possible because the past has lower entropy than the future, and both are well below the maximum possible entropy. A Boltzmann brain (on average) arises in a maximally entropic thermal bath, so "thinking" isn't a meaningful activity a Boltzmann brain can engage in.
Non-mathy answer:
Unlike the majority of LW readers, I don't buy into the MWI or mathematical realism, or generally any exotic theory that allows for super-low-probability events. The universe was created by a higher power, has a beginning, middle, and end, and the odds of a Boltzmann brain arising in that universe are basically zero.
In addition to what DanArmak said:
Even if you, in the moment, do not have good reason to be confident that you are not a Boltzmann brain, you do have much better reason to believe that any entity you create in the future is not a Boltzmann brain.
If you wish to improve the accuracy of that entity's beliefs, you can do so by instilling that entity with a low prior of being a Boltzmann brain.
Among the entities you will create in the future is your own future self.
No, I don't. I think the argument for their existence is pretty weak at best, and if they exist and are common, so what? It's the sort of hypothesis for which no possible evidence can be given for or against and no action can be taken in any event.
Even granting the (in my opinion pretty unlikely) hypotheses of their existence and ubiquity, what's the point of considering whether you're one of them? Such "observers", stretching the term to cover entities almost certainly incapable of forming thoughts, lacking any consistent memories, and hallucinating through a mean lifetime of less than a millisecond, can't do anything about it anyway.