To me, the light analogy sounds more like separate parts appearing as one mixed whole.
> Above, I said "For heaven's sake, don't use AI to help you write." I don't share that sentiment for AI visual art and AI music. At this time, getting good results from them still requires human skill.
Couldn't one argue that getting good writing from an LLM would require skill as well?
> clunky AI style will come through
The same is true of visual art -- you can usually tell when art is in the DALL-E / Midjourney "house style" (though not always). (Edit: my guess is you know this but don't have the same reaction to LLM smell in text as to DALL-E smell in visual art. I find that pretty interesting, because my artistically inclined friends have a uniformly negative reaction to both.)
Also, I'm curious what factors you consider when judging someone's "performance in life".
It's an interesting exercise to take your last paragraph and switch the roles of love and ambition. I wonder if there's anything productive to be explored between those two perspectives, or if those are just terminal value differences.
> most people either have or want children
What group of people is this claim supposed to refer to -- LessWrong readers? The world population?
Being motivated by getting real-world results ≠ being motivated by the status and power that often accrue from those results. The interestingness of a problem doesn't exist in a vacuum, apart from its relevance. Even in theoretical research, I find problems that lead toward resolving a major conjecture more interesting, which could be construed as a payoff-based motivation.
> the opener in John Psmith's review of Reentry by Eric Berger: "My favorite ever piece of business advice comes from a review by Charles Haywood of a book by Daymond John..."
I found this nesting very funny. Bravo if it was intentional.
On the "all positions" page, why does the second sentence of most summaries refer to a "detail" or "full description"? I see no way to access anything like that.
It's worth highlighting that the two expectations do not condition on the same event. This explains why we can have E[A | all even] < E[B | even] even though A ≥ B almost surely: the two "all even"s actually refer to different events.
This sounds to me like a moderately positive score on the desire dimension (1st) and a negative score on the capability dimension (2nd) of extroversion.
I had such scores, and my social situation got much better after I put some conscious effort into resolving my social anxiety. (Positive on the first, negative on the second is the only problematic pairing.)