Austin Chen

Hey there~ I'm Austin, currently building https://manifund.org. Always happy to meet LessWrong people; reach out at akrolsmir@gmail.com!

Posts

Austin Chen's Shortform · 3y

Comments
The Inkhaven Residency
Austin Chen · 1mo

This looks awesome, congrats on announcing this! I would be extremely tempted myself were it not for a bunch of other likely obligations. Approximately how large do you expect this fellowship to be?

Also, structuring Inkhaven as a paid program is interesting; most fellowships (eg Asterisk, FLF, MATS) instead pay their participants. I wonder if this introduces a minor adverse selection effect, in that only writers who are otherwise financially stable can afford to participate. Famously, startup incubators that charge (like OnDeck) perform much worse than incubators that pay for equity (like YC or hf0).

I imagine you've thought about this a lot already, and you do offer need-based scholarships, which is great; events like LessOnline and Manifest have also shown some success with charging attendees. But maybe there's some other way of finding sponsors or funders for these writers? For example, I think Manifund would be quite happy to sponsor 1-3 "full rides" at $5k+ each for a few bloggers interested in topics like AI safety funding, impact evaluations, and new opportunities, which we could occasionally crosspost to the Manifund newsletter. And I imagine other orgs like GGI might be interested too!

Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity
Austin Chen · 2mo

I agree with the paper that paying here probably has minimal effects on devs; but even if it does have an effect, it doesn't seem likely to change the results, unless somehow the AI group was more incentivized to be slow than the non-AI group.

Are Intelligent Agents More Ethical?
Austin Chen · 3mo

Minor point of clarity: I briefly attended a talk/debate where Nate Soares and Scott Aaronson (not Sumner) were discussing these topics. Are we thinking of the same event, or was there a separate conversation with Nate Soares and Scott Sumner?

New Endorsements for “If Anyone Builds It, Everyone Dies”
Austin Chen · 3mo

If you're looking to do an event in San Francisco, lmk, we'd love to host one at Mox! 

Information-Dense Conference Badges
Austin Chen · 3mo

Thanks Ozzie - we didn't invest that much effort into badges this year, but I totally agree there's an opportunity to do something better. Organizer-wise it can be hard to line up all the required info before printing, but having a few sections where people can sharpie things in or pick stickers seems like low-hanging fruit.

This could also extend beyond badges - for example, one could pick differently colored swag t-shirts to signal affiliation (eg academia vs lab vs funder) at a conference.

I'll also send this to Rachel for the Curve, who I expect will enjoy it as a visual and event-design challenge.

Corbent – A Master Plan for Next‑Generation Direct Air Capture
Austin Chen · 4mo

Huh, seems pretty cool and big-if-true. Is there a specific reason you're posting this now? Eg asking people for feedback on the plan? Seeking additional funders for your $25m Series A? 

Please Donate to CAIP (Post 1 of 7 on AI Governance)
Austin Chen · 4mo

My guess btw is that some donors like Michael have money parked in a DAF, and thus require a c3 sponsor like Manifund to facilitate that donation - until your own c3 status arrives, ofc. 

(If that continues to get held up but you receive an important c3 donation commitment in the meantime, let us know and we might be able to help - I think it's possible to recharacterize same-year donations after c3 status arrives, which could unblock the c4 donation cap?)

Please Donate to CAIP (Post 1 of 7 on AI Governance)
Austin Chen · 4mo

From the Manifund side: we hadn't spoken with CAIP previously but we're generally happy to facilitate grants to them, either for their specific project or as general support. 

A complicating factor is that, like many 501c3s, we have a limited budget we can send towards c4s; eg I'm not sure we could support their maximum ask of $400k on Manifund. I am happy to commit at least $50k of our "c4 budget" (which is their min ask) if they do raise that much through Manifund; beyond that, we should chat!

Which journalists would you give quotes to? [one journalist per comment, agree vote for trustworthy]
Answer by Austin Chen · May 07, 2025

Kevin Roose

Austin Chen on Winning, Risk-Taking, and FTX
Austin Chen · 5mo

Thanks to Elizabeth for hosting me! I really enjoyed this conversation; "winning" is a concept that seems important and undervalued among rationalists, and I'm glad to have had the time to throw ideas around here.

I do feel like this podcast focused a bit more on some of the weirder or more controversial choices I made, which is totally fine; but if I were properly stating the case for "what is important about winning" from scratch, I'd instead pull examples like how YCombinator won, or how EA has been winning relative to rationality in recruiting smart young folks. AppliedDivinityStudies's "where are all the successful rationalists" is also great.

Very happy to answer questions ofc!

Questions about animal welfare markets · 2mo
Manifund 2025 Regrants · 5mo
Fundraising for Mox: coworking & events in SF · 5mo
AI for Epistemics Hackathon · 6mo
5 homegrown EA projects, seeking small donors · 11mo
If far-UV is so great, why isn't it everywhere? · 11mo
Announcing the $200k EA Community Choice · 1y
What are you getting paid in? · 1y
Podcast: Elizabeth & Austin on "What Manifold was allowed to do" · 1y
Episode: Austin vs Linch on OpenAI · 1y