I'm an admin of LessWrong. Here are a few things about me.
Randomly: if you ever want to talk with me about anything you like for an hour, I'm happy to do that for $1k.
Here is Eliezer Yudkowsky's recounting of what happened:
Epstein asked to call during a fundraiser. My notes say that I tried to explain AI alignment principles and difficulty to him (presumably in the same way I always would) and that he did not seem to be getting it very much. Others at MIRI say (I do not remember myself / have not myself checked the records) that Epstein then offered MIRI $300K; which made it worth MIRI's while to figure out whether Epstein was an actual bad guy versus a random witchhunted guy, and ask if there was a reasonable path to accepting his donations without causing harm; and the upshot was that MIRI decided not to take donations from him. I think/recall that it did not seem worthwhile to do a whole diligence thing about this Epstein guy before we knew whether he was offering significant funding in the first place, and then he did, and then MIRI people looked further, and then (I am told) MIRI turned him down.
Epstein threw money at quite a lot of scientists and I expect a majority of them did not have a clue. It's not standard practice among nonprofits to run diligence on donors, and in fact I don't think it should be. Diligence is costly in executive attention, it is relatively rare that a major donor is using your acceptance of donations to get social cover for an island-based extortion operation, and this kind of scrutiny is more efficiently centralized by having professional law enforcement do it than by distributing it across thousands of nonprofits.
In 2009, MIRI (then SIAI) was a fiscal sponsor for an open-source project (that is, we extended our nonprofit status to the project, so they could accept donations on a tax-exempt basis, having determined ourselves that their purpose was a charitable one related to our mission) and they got $50K from Epstein. Nobody at SIAI noticed the name, and since it wasn't a donation aimed at SIAI itself, we did not run major-donor relations about it.
This reply has not been approved by MIRI / carefully fact-checked, it is just off the top of my own head.
I understand many in this world think guilt-by-association is valid, but that doesn't mean it is. Talking with a felon is not itself a crime (nor a bad thing) and you should generally not ostracize people for who they talk to.
Furthermore, accepting a donation from a felon is not inherently bad. Being paid off to launder their reputation is a bad thing, and insofar as you lend your reputation to them in exchange for money, that's unethical, but I think it's clear that it's not healthy for all felons to be barred from donating to charities/non-profits. The money is not itself tainted; it's their reputation that must be kept straight.
My guess is that it's a relatively common occurrence for Founders/CEOs to believe that their product is going to do wondrous things and take over the world, and that investors mostly see this as a positive.
Like, I don't think VCs are especially trying to be intellectuals, and don't mind much if people around them seem to believe inconsistent or incoherent things. I expect many founders around him believe many crazy things and he doesn't argue with them about it.
Edit: Seems I was explaining something that wasn't true! Points awarded to Eli's model that was confused.
My first guess is that Amodei simply treats the board meeting like that of a relatively standard for-profit company: talks about revenue, growth, new features, new deals, etc.
Curated! A successful and well-written investigation into the causes of a recent and somewhat concerning event, gathering the information to figure out what's really happening behind a popular news story & easy narrative. I love having this example, and will hopefully remember to link to it when people ask me why I don't believe popular narratives about why things happened.
Yeah, on reflection I'd be willing to bet that the designer had an LLM look at some of the sites you mentioned and then asked it to imitate the style closely, as a core part of the web design.
Yeah, I think that good web design is a costly signal that you're invested in the quality of the writing, like publishing a book was in the past. However, it is fakeable.
What do you mean by this? I didn't see a mention of web design in the OP, it seems largely about the content and the epistemic status of the content.
shrug. I think sharing the info is good & fine. I think you have some responsibility for the hypotheses you privilege. Having an algorithm where you spam people with low-quality moral accusations just because a different social scene is generating them is kind of attention-wasting.