I think you're right, but I also think I can provide examples of "true" scaffolding skills:
This comment made me subscribe to your posts. I hope to read more on your attempts in the future! (no pressure)
I felt like I should share my uninformed two cents.
I also found this hard to parse. I suggest the following edit:
Omega will send you the following message whenever it is true: "Exactly one of the following statements is true: (1) you will not pull the lever; (2) the stranger will not pull the lever." You receive the message. Do you pull the lever?
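To spell out why this phrasing is easier to parse: the message is true exactly when one, and only one, of the two "will not pull" statements holds, i.e. an exclusive or. A minimal sketch (variable names are mine, not from the puzzle):

```python
from itertools import product

# Enumerate all four combinations of choices and check when
# Omega's message would be true (exclusive or of the two
# "will not pull" statements).
for you_pull, stranger_pull in product([True, False], repeat=2):
    message_true = (not you_pull) != (not stranger_pull)  # XOR
    print(f"you_pull={you_pull}, stranger_pull={stranger_pull}, "
          f"message_true={message_true}")
```

So receiving the message rules out exactly the two cases where you and the stranger make the same choice.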
And even when the AGI does do work (The Doctor), it’s been given human-like emotions. People don’t want to read a story where the machines do all the work and the humans are just lounging around.
I am taking the opportunity to recommend the Culture series by Iain M. Banks; here is a good entry point to the series, and the books can be read in almost any order. It's not that they find no space for human-like actors, but I still think these books show, by being reasonably popular, that there is an audience for stories about civilizations where AGI does all the work.
Of course, your original point still stands if you say "most people" instead.
I think I found another typo:
I have two theses. First of all, the Life Star is a tremendous
For anyone wondering, TMI almost certainly stands for "The Mind Illuminated," a book by John Yates, Matthew Immergut, and Jeremy Graves. Full title: The Mind Illuminated: A Complete Meditation Guide Integrating Buddhist Wisdom and Brain Science for Greater Mindfulness.
Thank you
As I understand it, that point feels wrong to me. There are many things that I would be sad not to have in my life, but only over the vaguely long term, and that are easy to replace quickly. I have only one fridge, and I would probably be somewhat miserable without one (or maybe I could adapt), but it would be absurd for me to buy a second one.
I would say most of the things that I would be sad to miss and that are easy to duplicate are also easy to replace quickly. The main exception is probably data, which should indeed be backed up regularly and safely.
Small nitpick: the "if and only if" is false. It is perfectly possible to have an AI that doesn't want any moral rights and is misaligned in some other way.