Comments

Valdes

Small nitpick: the "if and only if" is false. It is perfectly possible to have an AI that doesn't want any moral rights and is misaligned in some other way.

Valdes

I think you're right, but I also think I can provide examples of "true" scaffolding skills:

  1. How to pass an exam: in order to keep learning within the academic system (university, school), you need to regularly do well enough on exams. That is a skill in itself (read the exam in its entirety, know when to move on, learn how hard a question is likely to be from the phrasing of the questions around it, ...). Almost everyone safely forgets most of this skill once they are done studying.
  2. Learn to understand your teacher's feedback: many teachers, professional or otherwise, suck at communicating their feedback. You often need to develop a skill for understanding that specific individual's feedback. Of course, there is an underlying universal skill of "being good at learning how individuals give feedback"; we could think of it as the skill of "being good at building a specific kind of scaffolding".
  3. Learn to accept humiliating defeat: a martial artist friend told me it is important, at first, to learn to accept losing all the time, because you learn in the company of strictly better martial artists. Once you get better, you presumably lose less often.
Valdes

This comment made me subscribe to your posts. I hope to read more about your attempts in the future! (No pressure.)

Valdes

I felt like I should share my uninformed two cents.

  1. Interpol seems like a promising lead, if you can get the right person at Interpol to understand the situation. I am not saying this is easy, but maybe you can get an email sent on your behalf to the right mailing lists (alumni of some relevant school, maybe?).
  2. Other comments suggested getting funding from EA, and that sounds fitting to me. But there is probably someone in EA who can connect you with Interpol directly. Maybe you can ask for a broad email to be sent out in addition to requesting funding.
Valdes

I also found this hard to parse. I suggest the following edit:

Omega will send you the following message whenever it is true: "Exactly one of the following statements is true: (1) you will not pull the lever; (2) the stranger will not pull the lever." You receive the message. Do you pull the lever?
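
To make the proposed condition concrete, here is a minimal truth-table sketch in Python. The variable names and the reading of "exactly one" as an exclusive-or are my own framing, not something from the original post:

```python
# Truth table for the suggested Omega message. Hypothetical framing:
# the message is sent exactly when one (and only one) of the two
# "will not pull" statements holds, i.e. an XOR of the two negations.
from itertools import product

for you_pull, stranger_pull in product([True, False], repeat=2):
    # "Exactly one of: (1) you will not pull, (2) the stranger will not pull"
    message_sent = (not you_pull) != (not stranger_pull)
    print(f"you pull: {you_pull!s:5} | stranger pulls: {stranger_pull!s:5} "
          f"| message sent: {message_sent}")
```

On this reading, the message is sent precisely when you and the stranger act differently, which is what makes conditioning on having received it interesting.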

Valdes

And even when the AGI does do work (The Doctor), it’s been given human-like emotions. People don’t want to read a story where the machines do all the work and the humans are just lounging around.

I am taking the opportunity to recommend the Culture series by Iain M. Banks; the books can be read in almost any order, so it is easy to find an entry point. It's not that they find no space for human-like actors, but I still think these books show, by being reasonably popular, that there is an audience for stories about civilizations where AGI does all the work.

Of course, your original point still stands if you say "most people" instead.

Valdes

I think I found another typo:

I have two theses. First of all, the Life Star is a tremendous

Valdes

For anyone wondering, TMI almost certainly stands for "The Mind Illuminated", a book by John Yates, Matthew Immergut, and Jeremy Graves. Full title: The Mind Illuminated: A Complete Meditation Guide Integrating Buddhist Wisdom and Brain Science for Greater Mindfulness.

Valdes

As I understand it, that point feels wrong to me. There are many things that I would be sad not to have in my life, but only over a fairly long time horizon, and that are easy to replace quickly. I have only one fridge, and I would probably be somewhat miserable without one (or maybe I could adapt), but it would be absurd for me to buy a second one.

I would say most of the things that I would be sad to miss and that are easy to duplicate are also easy to replace quickly. The main exception is probably data, which should indeed be backed up regularly and safely.
