tl;dr. Lens Academy is creating scalable superintelligence x-risk education with several USPs. Current team: Luc (full-time founder, technical generalist) and several part-time contributors. We have users and funding. Looking for a cofounder who's either a nontechnical generalist or a technical generalist. (Drafted by human; edited by AI, re-edited...
Most ideas fail. Successful startup founders don't get their first idea right. So they try things out, iterate relentlessly, and pivot when needed[1] – until they find something that works. In AI Safety, most ideas won't be impactful either. However, most people don't know how to systematically move quickly and...
Your project might be failing without your even knowing it. It's hard to save the world. If you're launching a new AI Safety project, this sequence helps you avoid common pitfalls. Your most likely failure modes along the way: you never get started. Entrepreneurship is uncomfortable, and AI Safety is...
The number of people who deeply understand superintelligence risk is far too small. There's a growing pipeline of people entering AI Safety, but most of the available onboarding covers the field broadly, touching on many topics without going deep on the parts we think matter most. People come out having...