Gunnar_Zarncke

Software engineering, parenting, cognition, meditation, other
Linkedin, Facebook, Admonymous (anonymous feedback)

Comments

I saw this in Xixidu's feed

“The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at an enormous rate, no less than 1 gigabits/s. The stark contrast between these numbers remains unexplained.” https://arxiv.org/abs/2408.10234

The article has a lot of detail about human information-processing rates. Worth reading. But I think it is equating two different things:

  • The information processing capacity (of the brain; gigabits per second) is related to the complexity of the environment in which the species (here: humans) lives.
  • What they call information throughput (~10 bits/s) is really a behavior expression rate, which is related to the physical possibilities of the species (you can't move faster than your motor system allows); a rough back-of-the-envelope sketch follows below.
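
As a back-of-the-envelope illustration (my own rough numbers, not taken from the paper): a behavior channel like typing lands in the single-digit-bits-per-second range, while the quoted sensory figure is about eight orders of magnitude higher.

```python
# Rough sketch only; the figures (words per minute, chars per word, bits per char)
# are generic textbook estimates I am assuming, not values from the paper.

words_per_minute = 90        # a fast typist
chars_per_word = 5           # common average for English
bits_per_char = 1.0          # Shannon's classic estimate for English text, ~1 bit/char

behavior_bits_per_s = words_per_minute * chars_per_word * bits_per_char / 60
sensory_bits_per_s = 1e9     # "no less than 1 gigabit/s" per the quoted abstract

print(f"expressed behavior: ~{behavior_bits_per_s:.1f} bits/s")   # ~7.5 bits/s
print(f"sensory intake:     ~{sensory_bits_per_s:.0e} bits/s")
print(f"ratio:              ~{sensory_bits_per_s / behavior_bits_per_s:.0e}")  # ~1e8
```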

Using air purifiers in two Helsinki daycare centers reduced kids' sick days by about 30%, according to preliminary findings from the E3 Pandemic Response study. The research, led by Enni Sanmark from HUS Helsinki University Hospital, aims to see if air purification can also cut down on stomach ailments. https://yle.fi/a/74-20062381

See also the Air Quality tag.

I think you misunderstood what Questions on LW are for. They are not meant to solve your small problem but to pose big questions or issues of wider interest. You could have asked your question as a Shortform without getting downvoted.

You may wonder why you were downvoted. I guess it is because this post feels like a knowledge-base fragment without context. LW is not a dictionary. Why is this relevant to LW readers? What is your angle?

No? With normal probabilities, I can make bets and check my calibration. That's not possible here.

So far, we have a decisive argument against classical reference machine and Copenhagen.

My problem with this type of probabilistic proof is that it predictably leads to wrong results for some (maybe very few) observers. The simplest example is the Doomsday Argument: assume everybody reasons like that; then very early (or very late) people who apply the argument will arrive at the wrong conclusion that the end is near (or far, respectively). A small simulation below makes this concrete.
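
Here is a minimal simulation sketch of that point (the 20× bound and the population size are just the standard textbook form of the argument, chosen for illustration): doomsday-style reasoning is calibrated across all observers, yet the failures are concentrated entirely in the earliest ones.

```python
# Minimal illustration: each observer with birth rank r asserts the 95% doomsday-style
# bound "total number of observers <= 20 * r", and we check who ends up wrong.
import random

N = 1_000_000            # assumed true total number of observers (hypothetical)
TRIALS = 10_000          # sampled observers

wrong_all = wrong_early = early = 0
for _ in range(TRIALS):
    r = random.randint(1, N)     # my birth rank, uniform over all observers
    is_wrong = 20 * r < N        # the bound fails exactly for the earliest ~5%
    wrong_all += is_wrong
    if r <= N // 20:             # the "very early" observers
        early += 1
        wrong_early += is_wrong

print(f"wrong overall:           {wrong_all / TRIALS:.1%}")           # ~5%, i.e. calibrated
print(f"wrong among earliest 5%: {wrong_early / max(early, 1):.0%}")  # essentially 100%
```

So the reasoning looks fine in aggregate, but the particular observers for whom it fails are exactly the very early ones.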

I think Bostrom is aware of your point that you can't fully simulate the universe and addresses this concern by looking only at observable slices. Clearly, it is possible to solve some quantum equations - physicists do that all the time. It should be possible to simulate those few observed or observable quantum effects.

The original Simulation Argument doesn't require a simulation at the atomic level:

The argument we shall present does not, however, depend on any very strong version of functionalism or computationalism. For example, we need not assume that the thesis of substrate-independence is necessarily true (either analytically or metaphysically) – just that, in fact, a computer running a suitable program would be conscious. Moreover, we need not assume that in order to create a mind on a computer it would be sufficient to program it in such a way that it behaves like a human in all situations, including passing the Turing test etc. We need only the weaker assumption that it would suffice for the generation of subjective experiences that the computational processes of a human brain are structurally replicated in suitably fine-grained detail, such as on the level of individual synapses. This attenuated version of substrate-independence is quite widely accepted.

Neurotransmitters, nerve growth factors, and other chemicals that are smaller than a synapse clearly play a role in human cognition and learning. The substrate-independence thesis is not that the effects of these chemicals are small or irrelevant, but rather that they affect subjective experience only via their direct or indirect influence on computational activities. For example, if there can be no difference in subjective experience without there also being a difference in synaptic discharges, then the requisite detail of simulation is at the synaptic level (or higher).

But more crucially:

Simulating the entire universe down to the quantum level is obviously infeasible, unless radically new physics is discovered. But in order to get a realistic simulation of human experience, much less is needed – only whatever is required to ensure that the simulated humans, interacting in normal human ways with their simulated environment, don’t notice any irregularities.

I do agree that we cannot perceive 3D through the senses and have to infer the 3D structure and build a mental model of it. And a model composed mostly of surfaces is probably much more common.

Have you seen 3C's: A Recipe For Mathing Concepts? I think it has some definitions for you to look into, esp. the last sentence:

If you want to see more examples where we apply this methodology, check out the Tools post, the recent Corrigibility post, and (less explicitly) the Interoperable Semantics post. 
