That's a twist on a standard LW argument, see e.g. here:

> Fragility of value is the thesis that losing even a small part of the rules that make up our values could lead to results that most of us would now consider as unacceptable
>
> It seems to me that fragility of value can lead to massive suffering in many ways.

You're basically dialing that argument up to eleven. From "losing a small part could lead to unacceptable results" you jump to "losing any small part will lead to unimaginable hellscapes":

> with a tall sharp peak (FAI) surrounded by a pit that's astronomically deeper

S-risks: Why they are the worst existential risks, and how to prevent them
by Kaj_Sotala, 20th Jun 2017