Mati Roy
Alignment seems like a really hard problem, and having smarter humans could be really important.

Building AGI doesn't seem to require as much intelligence: just scale what we already have and experiment with various tweaks.

We could have highly enhanced adult humans within 30 years (although promising projects and teams are currently significantly underfunded).

This too requires us to buy serial time, because you can't make a baby in one month by having nine people pregnant.