
How NOT to align AI #34.

What is humanity aligned to? Suppose, hypothetically, that evolution has aligned humans to the following: “Your DNA is the most important substance in the universe; therefore, maximize the amount of similar DNA in the universe.” So we align AGI to the analogous goal: “Human (or similar) DNA is the most important substance in the universe; therefore, maximize the amount of human or similar DNA in the universe.”

Wait, I’m pretty sure there’s already a rule #34 about this, brb.