Yes, I’m not so sure about the Stockfish-pawns point either.
In Michael Redmond’s AlphaGo vs AlphaGo series on YouTube, he often finds that the winning AI carelessly gives up points in the endgame. It might have a lead of 1.5 or 2.5 points 20 moves before the game ends; but by the time the game actually ends, it has played enough suboptimal moves to win by only 0.5, the smallest possible margin.
It never causes itself to lose with these lazy moves; it only reduces its margin of victory. Redmond theorizes, and I agree, that this is because the objective is to win, not to maximize point differential, and at such a late stage of the game its victory is certain regardless.
This is still a little strange: the suboptimal moves don’t sacrifice points to reduce variance, so it’s not as though they raise p(win). The AI just doesn’t care either way; a win is a win.
There are Go AIs that are trained with the objective of maximizing point difference. I’m told they are quite vicious, in a way that AlphaGo isn’t. But the most famous Go AI in our timeline turned out to be the more chill variant.
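As a minimal sketch of the distinction (this is not AlphaGo’s actual training setup, just an illustration of the two objectives; both function names are mine):

```python
def win_loss_reward(score_margin: float) -> float:
    """Win/loss objective: only the sign of the final margin matters.

    A 0.5-point win and a 20-point win earn the same reward, so once a
    game is safely won there is no pressure to defend every last point.
    """
    return 1.0 if score_margin > 0 else -1.0


def margin_reward(score_margin: float) -> float:
    """Margin-maximizing objective: every point still counts.

    An agent trained on this keeps playing sharp endgame moves even in
    a clearly won position, which is presumably where the "vicious"
    style comes from.
    """
    return score_margin
```

Under the first reward, every move that keeps p(win) at ~1 is equally good, which is consistent with the lazy-looking endgame moves Redmond points out.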
The quip about souls feels unnecessary and somehow grates on me. Something about putting an atheism zinger into the tag for cooking… feels off.
Would you be willing to share your ethnicity? Even as simple as “Asian / not Asian”?
I do think it has some of that feeling to me, yeah. I had to re-read the entire thing 3 or 4 times to understand what it meant. My best guesses as to why:
I felt whiplash at transitions like “be motivated towards what's good and true. This is exactly what Marc Gafni is trying to do with Cosmo-Erotic Humanism”, since I don’t know who he is or what that type of Humanism is, yet the sentence structure suggests I’m expected to. A possible rewrite could be something like “There are two projects I know of that aim to create a belief system that works with, instead of against, technology. The first is Marc Gafni’s; he calls it ‘Cosmo-Erotic Humanism’…”
There are some places where I feel a colon would work better than a comma. I’m not sure how important these are, but they would help slow down the pace of the writing:
“increasingly let go of faith in higher powers as a tenet of our central religion: secular humanism.” “But this is crumbling: the cold philosophy”
While minor punctuation differences like this usually aren’t too important, the way you write gives me a sense of too much happening too fast: “wow, this is a ton of information delivered extremely quickly, and I don’t know what Apollonian means, I don’t know who Gafni is, or what dataism is…” So maybe slowing down the pace with stronger punctuation like colons matters more here than it otherwise would?
Also, phrases like “our central religion is secular humanism” and “mystical true wise core” read as very Woo. I can see where both are coming from, but then I’ve read a lot of Woo; I think many readers would bounce off these phrases. The same ideas can still be communicated, but perhaps with something like “in place of religion, many have turned to Secular Humanism. Secular humanism says that X, Y, Z, but has no concept of a higher power. That means the core motivation that…”
(To be honest I’ve forgotten what secular humanism is, so this was another phrase that added to my feeling of everything moving too fast, and me being lost).
There are some typos too.
So maybe I’d advise making the overall piece of writing slower, by giving more set-up each time you introduce a term readers are likely to be unfamiliar with. On the other hand, that’s a hassle, and probably annoying to do in every note, if you write on this topic often. But it’s the best I’ve got!
I read this book in 2020, and the way this post serves as a refresher and different look at it is great.
I think there might be some mistakes in the log-odds section?
The orcs example starts:
We now want to consider the hypothesis that we were attacked by orcs, the prior odds are 10:1
Then there is a 1/3 wall-destruction rate, so orcs should be more likely in the posterior, but the post says:
There were 20 destroyed walls and 37 intact walls… corresponding to 1:20 odds that the orcs did it.
We started at 10:1 (likely that it’s orcs?), then saw evidence suggesting orcs, and ended up with a posterior strongly against orcs, which doesn’t seem right. I was thinking maybe the “10:1” prior should be “1:10”, but even then, going from 1:10 in the prior to 1:20 in the posterior when the evidence favors orcs doesn’t work either.
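For a quick sanity check in odds form (using only the numbers quoted above, so I may be missing context from the post), Bayes’ rule says

$$\text{posterior odds} \;=\; \text{prior odds} \times \frac{P(\text{evidence} \mid \text{orcs})}{P(\text{evidence} \mid \text{not orcs})}.$$

If the evidence favors orcs, that likelihood ratio is greater than 1, so the posterior odds for orcs should only go up from the prior. Whether the prior is 10:1 or 1:10 in favor of orcs, multiplying by a number above 1 can never land on 1:20 against orcs.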
All that said, I just woke up, so it’s possible I’m all wrong!
In Korea, every convenience store sells “hangover preventative” and “hangover cure” drinks with pop idols on the label. Then you come back to America, and the instant you say “hangover preventative”, people look at you like you’re crazy, as if no such thing could possibly exist or help. I wonder how we got this way!
Thanks for your review! I've updated the post so the medications warning is in italicized bold in the third paragraph, and I've stated the nutrient warning more explicitly as well.
“(although itiots might still fall for the "I'm an idiot like you" persona such as Donald Trump, Tucker Carlson, and particularly Alex Jones).”
This line is too current-culture-war for LessWrong. I began arguing with it in this comment, before deleting what I wrote and limiting myself to this.
The integral was incorrect! Fixed now, thanks! I've also added (f * g)(x) to the equality for those who find that notation clearer (I've just discovered that GPT-4o prefers it too). Cheers!
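(For other readers: assuming the equality in question is the standard convolution, the (f * g)(x) notation is shorthand for

$$(f * g)(x) = \int_{-\infty}^{\infty} f(t)\, g(x - t)\, dt.)$$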