Comments

I have similar feelings; there's no clear path for someone in an adjacent field. I chose my current role largely based on expected QALYs, and I'd gladly move into AI Safety now for the same reason.

This post gives the impression that finding talent is not the current constraint, but I'm confused about why the listed salaries are so high for some of these roles if the pool is so large.

I've submitted applications to a few of these orgs, with cover letters that basically say "I'm here and willing if you need my skills". One frustration is recognizing Alignment as our greatest challenge yet having no path to go work on it. Another is that the current labs look somewhat homogeneous and a lot like academia, which is not how I'd optimize for speed.

I once came home to find ants carrying rainbow sprinkles (left out from cake making) across my apartment wall. I found it entertaining once I understood what I was seeing.

There's a difference between "what would you do to blend apples" and "what would you do to unbox an AGI". It's not clear to me whether that's just a difference of degree or something deeper.