Suppose astronomers detect an asteroid and suggest a 10% chance of it hitting the Earth on a near-pass in 2082. Would you regard this assessment of risk as optimistic or pessimistic? How much of our resources would you dedicate to solving the problem?
My understanding is that 10% isn't actually far removed from what many people who are deeply concerned about AI think - or, for that matter, from what many people who aren't that concerned think. It's quite remarkable how differently people can see the same 10%. Those who are concerned simply regard a 10% chance of total extinction as a pretty bleak prospect, one that ought to get our full attention. Indeed, I'd bet there's somebody around here who is deeply concerned about AI risk and yet assesses the risk at 1%. Remember that a 10% risk of total human annihilation is greater than the risk COVID posed to any one individual, and our society underwent massive upheaval to limit that risk.
Which is to say: I don't think FHI or Toby Ord are significantly more optimistic than people who are deeply concerned about AI risk.