Program Den

A jester unemployed is nobody's fool.

Comments

I would probably define AGI first, just because, and I'm not sure about the idea that we are "competing" with automation (which is still just a tool, conceptually, right?).

We cannot compete with a hammer, or a printing press, or a search engine.  Oof.  How to express this?  Language is so difficult to formulate sometimes.

If you think of AI as a child, it is uncontrollable.  If you think of AI as a tool, of course it can be controlled.  I think a corp has to be led by people, so that "machine" wouldn't be autonomous per se…

Guess it's all about defining that "A" (maybe we use "S" for synthetic or "S" for silicon?)

Well and I guess defining that "I".

Dang.  This is for sure the best place to start.  Everyone needs to be as certain as possible (heh) that they are talking about the same things.  AI itself as a concept is, like, a mess.  Maybe we should even use ML and whatnot instead?  Get real specific as to the type and everything?

I dunno, but I enjoyed this piece!  I am left wondering: what if we prove AGI is uncontrollable but not that it is possible to create?  Is "uncontrollable" enough justification to not even try, and more so, to somehow [personally I think this is impossible, but] dissuade people from writing better programs?

I'm more afraid of humans and censorship and autonomous policing and what have you than of "AGI" (or ASI).

Yes, it is, because it took like five years to understand minority-carrier injection.

LOL!  Gesturing in a vague direction is fine.  And I get it.  My kind of rationality is for sure in the minority here; I knew it wouldn't be getting updoots.  Wasn't sure that was required or whatnot, but I see that it is.  Which is fine.  Content moderation separates the wheat from the chaff, and the public interwebs from personal blogs or whatnot.

I'm a nitpicker too, sometimes, so it would be neat to suss out further why the idea that "everything in some way connects to everything else" is "false" or technically incorrect, as it were.  I probably didn't express what I meant well (it's not a new idea, really; maybe as old as questions about trees falling in forests, and about as provable, I guess).

Heh, I didn't even really know I was debating, I reckon.  Just kind of thinking, I was thinking. Thus the questioning ideas or whatnot… but it's in the title, kinda, right?  Or at least less wrong? Ha!  Regardless, thanks for the gesture(s), and no worries!

I love it!  Kind of like Gödel numbers!
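
(Rough sketch of what I mean by the Gödel numbers thing, in Python; the prime-power encoding is the classic one, and the function names here are just made up for illustration:)

```python
# A toy sketch of Gödel numbering: encode a sequence of symbols (small
# positive integers here) as a single natural number via prime powers,
# so a whole "statement" can be referenced as one object.

def primes():
    """Yield 2, 3, 5, 7, ... by naive trial division (fine for a toy)."""
    found, n = [], 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_encode(symbols):
    """Encode [s1, s2, s3, ...] as 2**s1 * 3**s2 * 5**s3 * ..."""
    number = 1
    for p, s in zip(primes(), symbols):  # zip stops when symbols run out
        number *= p ** s
    return number

print(godel_encode([3, 1, 2]))  # 2**3 * 3**1 * 5**2 = 600
```

The encoding relates the statement to a number and the number back to the statement, which is sorta the "everything relates" thing again.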

I think we're sorta saying the same thing, right?

Like, you'd need to be "outside" the box to verify these things, correct?

So we can imagine potential connections (I can imagine a tree falling, and making sound, as it were), but unless there is some type of real reference (say the realities intersect, or there's a higher dimension, or we see light/feel gravity or what have you) they don't exist from "inside", no?

Even imagining things connects or references them to some extent… that's what I meant about unknown unknowns (if I didn't edit that bit out)… even if that does go to extremes.

Does this reasoning make sense?  I know defining existence is pretty abstract, to say the least. :)

My point is that complexity, no matter how objective a concept, is relative.  Things we thought were "hard" or "complex" before turn out not to be so much, now.

Still with me?  Agree, disagree?

Patterns are a way of managing complexity, sorta, so perhaps if we see some patterns that work to ensure "human alignment[1]", they will also work for "AI alignment" (tho mostly I think there is a wide, wide berth betwixt the two, and the latter can only exist after the former).

We like to think we're so much smarter than the humans that came before us, and that things — society, relationships, technology — are so much more complicated than they were before, but I believe a lot of that is just perception and bias.

If we do get to AGI and ASI, it's going to be pretty dang cool to have a different perspective on it, and I for one do not fear the future.

  1. ^

    assuming alignment is possible— "how strong of a consensus is needed?" etc.

For something to "exist", it must relate, somehow, to something else, right?

If so, everything relates to everything else by extension, and to some degree, thus "it's all relative".

Some folk on LW have said I should fear Evil AI more than Rogue Space Rock Collisions, and yet, we keep having near misses with these rocks that "came out of nowhere".

I'm more afraid of humans humaning, than of sentient computers humaning.

Is not the biggest challenge we face the same as it has been— namely spreading ourselves across multiple rocks and other places in space, so all our eggs aren't on a single rock, as it were?

I don't know.  I think so.  But I also think we should do things as much as a group as possible, and with as much free will as possible.

If I persuade someone, did I usurp their free will?  There's strength in numbers, generally, so the more people you persuade, the more people you persuade, so to speak.  Which is kind of frightening.

What if the "bigger" danger is the Evil AI?  Or Climate Change?  Or Biological Warfare?  Global Nuclear Warfare would be bad too.  Is it our duty to try to organize our fellow existence-sharers, and align them with working towards idea X?  Is there a Root Idea that might make tackling All of the Above™ easier?

Is trying to avoid leadership a cop-out?  Are the ideas of free will, and group alignment, at odds with each other?

Why not just kick back and enjoy the show?  See where things go?  Because as long as we exist, we somehow, inescapably, relate?  How responsible is the individual, really, in the grand scheme of things?  And is "short" a relative concept?  Why is my form so haphazard?  Can I stop this here[1]?

  1. ^

    lol[2], maybe the real challenge, and Key Root Idea®, relates to self-control and teamwork…

  2. ^

    At least I crack me up. :) "not it!" FIN

It's a weird one to think about, and perhaps paradoxical.  Order and chaos are flip sides of the same coin, with some amorphous third as the infinitely varied combinations of the two!

The new patterns are made from the old patterns.  How hard is it to create something totally new, when it must be created from existing matter, or existing energy, or existing thoughts?  It must relate, somehow, or else it doesn't "exist"[1].  That relation ties it down, and by tying it down, gives it form.

For instance, some folk are mad at computer-assisted image creation, similar to how some folk were mad at computer-aided music.  "A Real Artist does X— these people just push some buttons!" "This is stealing jobs from Real Artists!" "This automation will destroy the economy!"

We go through what seem to be almost the same patterns, time and again:  Recording will ruin performances.  Radio broadcasts will ruin recording and the economy.  Pictures will ruin portraits.  Video will ruin pictures.  Music Video will ruin radio and pictures.  Or whatever.  There's the looms/Luddites, and perhaps in ancient China the Shang were like "down with the printing press!" [2]

I'm just not sure what constitutes a change and what constitutes a swap.  It's like that Ship of Theseus we often speak of… thus it's about identity, or definitions, if you will.  What is new?  What is old?

Could complexity really amount to some form of familiarity?  If you can relate well with X, it generally does not seem so complex.  If you can show people how X relates to Y, perhaps you have made X less complex?  We can model massive systems (like the weather, poster child of complexity) more accurately than ever.  If anything, everything has tended towards less complex, over time, when looked at from a certain vantage point.  Everything but the human heart. Heh.

I'm sure I'm doing a terrible job of explaining what I mean, but perhaps I can sum it up by saying that complexity is subjective/relative?  That complexity is an effect of different frames of reference and relation, as much as anything?

And that ironically, the relations that make things simple can also make them complex?  Because relations connect things to other things, and when you change one connected thing it can have knock-on effects and… oh no, I've logiced myself into knots!

How much does any of this relate to your comment?  To my original post?

Does "less complex" == "Good"?  And does that mean complexity is bad?  (Assuming complexity exists objectively of course, as it seems like it might be where we draw lines, almost arbitrarily, between relationships.)

Could it be that "good" AI is "simple" AI, and that's all there is to it?

Of course, then it is no real AI at all, because, by definition…

Sheesh!  It's Yin-Yangs all the way down[3]! ☯️🐢🐘➡️♾️

  1. ^

    Known unknowns can be related, given shape— unknown unknowns, less so

  2. ^
  3. ^

    there is no down in space (unless we mean towards the greatest nearby mass)
