Actually, I find this ending pretty nice. This AI does not seem to transform the universe into paperclips; it shows progress and values intelligence, and as a bonus humans are still alive. A tyrant AI still seems human enough for me.
Or rather, the derivative of subjective progress with respect to time. In other words, effort is the derivative of progress.
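A minimal way to write that down (the symbols are my own, not the post's): if $P(t)$ denotes subjective progress at time $t$, then the felt effort would be

$$E(t) = \frac{dP}{dt},$$

so effort reads as high exactly when perceived progress is changing fastest.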
Actually, I don't think that AI companions are going to have this specific flaw. If anything, they will be too agreeable to what they think is your opinion. If the goal of the model is to provide a pleasant experience, or a long conversation, or something similar, then changing someone's mind is among the worst things it can do. For example, ChatGPT often tries to identify your opinion on a specific topic and then argue in favour of it. I would expect this to radicalise society, because now everyone will be really convinced that their opinion is the best one. Only the small fraction of people who, for some strange reason, feel satisfied after changing their minds might actually move closer to the truth.
After reading this post, I would expect that the most probable explanation for waking up and seeing a tentacle instead of one's hand would be a dream, in which such a specific tentacle is conjured because the person fell asleep while reading this specific post.
I just wanted to contribute by saying that in one of his lectures (I cannot remember the exact name), Feynman said that it is important for a physicist to know many interpretations that give the same predictions but are different computationally. The simplest example that comes to my mind is Newtonian and Lagrangian mechanics. I do not know which one is simpler in the technical sense of the word, but I am sure that every physicist is expected to know and understand both of them.
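To make the "same predictions, different computation" point concrete, here is a standard textbook example (my choice, not necessarily one Feynman used): the one-dimensional harmonic oscillator described both ways.

Newtonian mechanics starts from the force law:
$$m\ddot{x} = -kx.$$

Lagrangian mechanics starts from $L = \tfrac{1}{2}m\dot{x}^2 - \tfrac{1}{2}kx^2$ and the Euler-Lagrange equation:
$$\frac{d}{dt}\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x} = 0 \quad\Rightarrow\quad m\ddot{x} + kx = 0.$$

The two routes yield the identical equation of motion, but one goes through forces and the other through an energy-like functional, which is the kind of computational difference I take the point to be about.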
I guess we will eventually get there. Machines are going to become better friends/partners at some point, because humans are flawed. Not simply "flawed like anything real", but too flawed: they have limited time, limited emotional energy, and so on. If the majority of humanity ends up spending their time with AI that is capable of making them happier/more motivated/more productive than humans can, won't the benefits overshadow the downsides?
P.S. There is a creepy feeling coming from a world of people disconnected from each other, but I think it comes from the fear of the unknown (or of the known but really weird, like dating a rock). But I may simply be too biased.