I agree that this seems like a likely effect.
It seems like the quality of short-form writing that displaces what would otherwise have been full posts will generally be lower. On the other hand, people might feel more willing to publish at all, because they don't have to assess whether their writing is good enough to justify a bid for other people's attention.
FWIW, this was the basic take of CFAR and the milieu around CFAR at least as early as 2015, though there are additional operational details of how best to go about implementing this approach.
Yeah, the main reason is link rot.
Some months ago, I suggested that there could be a UI feature, which authors could turn on or off, to automatically promote shortforms to proper posts if they get sufficient karma.
various weird obsessions like the idea of legalizing r*pe etc that might have alienated many women and other readers
Sidenote: I object to calling this a weird obsession. This was a minor-to-medium plot point in one science fiction story that he wrote, and (to my knowledge) he has never advocated for it or even discussed it beyond its relevance to the story. I don't think that counts as an obsession.
- The early effective altruists would have run across these ideas and been persuaded by them, though somewhat more slowly?
I doubt this particular point. That EA embraced AI risk (to the extent that it did) seems to me like a fairly contingent historical fact, due to LessWrong being one of the three original proto-communities of EA.
I think early EA could have grown into several very different scenes/movements/cultures/communities, in both form and content. That we would have broadly bought into AI risk as an important cause area doesn't seem overdetermined to me.
it's washing something that we don't yet understand and should not pretend to understand.
Washing? Like safetywashing?
I burst out laughing at this.
I'm glad what you're doing is working for you!?