When I see the question, I know I am on LW. That lets me deduce that the "arcane runes" part is not important, but an LLM doesn't have this context. Maybe it sounds like a crackpot/astrology question to it?
e^3 ≈ 20, so if each attempt succeeds with probability 1/n, then for large n you get a ~95% chance of success by making 3n attempts.
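Spelling out the step (under the assumed model of independent attempts, each succeeding with probability 1/n): the failure probability after 3n attempts is (1 − 1/n)^(3n) → e^(−3) ≈ 1/20. A quick sanity check in Python:

```python
# Probability of at least one success in 3n independent attempts,
# each succeeding with probability 1/n (assumed model).
import math

for n in (10, 100, 1000):
    p_success = 1 - (1 - 1 / n) ** (3 * n)
    print(f"n={n}: {p_success:.4f}  (limit: {1 - math.exp(-3):.4f})")
```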
Thinking about responsible gambling: something like an up-front long-term commitment should solve a lot of problems? You have to decide right away and lock up the money you are going to spend this month, and that separates the decision from the impulse to spend.
I tried to derive it, and it turned out to be easy: BC is the wheel pair, CD is the surface, with the slow medium above. Equal travel time gives AC/Vfast = AB/Vslow, and at the critical angle D touches the small circle (the inner wheel is on the verge of leaving the medium), so ACD is a right triangle. Then AC·sin(∠ACD) = AD (and AD is the same as AB), so sin(∠ACD) = AB/AC = Vslow/Vfast. Checking the wiki, it is the same angle (BC here is the wavefront, so the velocity vector is normal to it). Honestly, I am a bit surprised this analogy works so well.
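For reference, the same derivation written out (my transcription of the steps above, with the right angle at D at the critical angle):

```latex
% Both wheels roll for the same time, one in each medium:
\frac{AC}{v_{\text{fast}}} = \frac{AB}{v_{\text{slow}}}
\quad\Longrightarrow\quad
\frac{AB}{AC} = \frac{v_{\text{slow}}}{v_{\text{fast}}}

% Right triangle ACD (right angle at D) with AD = AB:
\sin(\angle ACD) = \frac{AD}{AC} = \frac{AB}{AC}
  = \frac{v_{\text{slow}}}{v_{\text{fast}}} = \sin\theta_c
```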
I read about a better analogy a long time ago: use two wheels on an axle instead of a single ball; then refraction comes out naturally. Also, I think that instead of a difference in friction it is better to use a difference in elevation, so things slow down when they go up into an area of higher elevation and speed back up going down.
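To make the elevation version quantitative (my own framing, treating the rolling object as a sliding point mass and ignoring friction and rotational energy): energy conservation fixes how much things slow down on a plateau of height h:

```latex
% Point-mass energy conservation across the step in elevation:
\tfrac{1}{2} m v_{\text{slow}}^2 + mgh = \tfrac{1}{2} m v_{\text{fast}}^2
\quad\Longrightarrow\quad
v_{\text{slow}} = \sqrt{v_{\text{fast}}^2 - 2gh}
```

One quirk that follows from this formula: the ratio v_fast/v_slow (the analog of the refractive index) depends on the launch speed, unlike real refraction.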
It is defecting against cooperate-bot.
From an ASI's standpoint, humans are a type of rock: not capable of negotiating.
This experience-based primitivity also means inter-temporal self-identification only goes one way. Since there is no access to subjective experience from the future, I cannot directly identify which person would be my future self. I can only say which person is me in the past, as I have the memory of experiencing from that person's perspective.
While there is a large difference in practice between recalling a past event and anticipating a future one, on a conceptual level there is no meaningful difference. You don't have direct access to past events either; memory is just an especially simple and reliable case of inference.
It would be funny if the hurdle presented by tokenization were somehow responsible for LLMs being smarter than expected :) Sounds exactly like the kind of curveball reality likes to throw at us from time to time :)
I see some discussion about the worth of a bee vs. a human, and I feel there is a big piece missing there. Why do we think we are the only consciousness inhabiting our body? If bees are expected to be conscious, why not subsystems of the human brain? Possibly a lot of them?