P(misalignment x-risk|AGI) is small #[Future Fund worldview prize]
The question (from the FTX Future Fund) is: what is P(misalignment x-risk|AGI)? That is, conditional on AGI being developed by 2070, will humanity go extinct or drastically curtail its future potential due to loss of control of AGI? I'm new to this community, but I don't think we should be computing...