It is probably worth noting here that an AI's ability to evaluate how well an outcome matches your wish, and which consequences you care about, is in turn limited by its own ability to evaluate the consequences of its actions (if we apply the constraint you are describing to the AI itself). That can easily turn into a requirement to build a Maxwell's demon, or into the AI admitting (huh..) that it is doing something without knowing whether it will match your wish or not.
"The mechanisms which would usually kick in when you were doing something wrong will be engaged even when you're doing nothing wrong, and they will be in overdrive, taking apart everything in your life in order to find ways by which you are screwing up."
Sounds like a good observation to me. You can have problems in life that have NOTHING to do with your actions. For example, being hit by a car and then deciding it was your responsibility because you underestimated the driver's stupidity while you were optimizing some other, more important decision as you crossed the street.
There's one question I'm thinking about: does feeling responsible for something (i.e. searching, logically forward and backward in time, for a way you could have done a better job) always play a key role in such a depression, or not?