
So the real question is: "How will one's credibility be affected in the environment where the idea is presented?" The answer most likely depends on one's current credibility.

As of now, I don't have much karma, so putting out poor ideas is more of a risk to this screen name. Eliezer could probably sneak in an entire subtly ludicrous paragraph that might go unnoticed for a while.

He has a history in readers' minds, as well as the karma metric, to make people ignore that flash in the back of their minds that something was off. They are more likely to think it was their own aberrant thinking, or that they had a flawed interpretation of a non-ludicrous idea he was trying to convey.

So I guess it just depends on how solid you think your idea and reputation are when deciding when to release an idea to a particular audience.

This kind of brings up the quality of thought that is spent on a subject. Someone with a strong ability to self-criticize can find flaws more effectively and reach better conclusions more quickly. Those who contemplate ideas with wrong but unshakeable (or rather, invisible) assumptions will stew in poor circles until death. The idea of a comforting or powerful deity, unfortunately, sticks so hard when indoctrinated early and consistently.

While I'd have a difficult time pinning myself as either introvert or extrovert, I notice that when I'm with a comfortable crowd, ideas will fall out of my mouth with so little processing that many sentences end with "... wait, never mind, scratch that." I'll use my close acquaintances as easy parallel processing, or to quickly look at ideas from obvious viewpoints that I tend to overlook.

When I'm in an unfamiliar group or setting, I'll often spend so long revising what I want to say that the conversation moves on and I've hardly said a word for 20 minutes.

This reminds me of an idea I had after first learning about the singularity. I assumed that once we are uploaded into a computer, a large percentage of our memories could be recovered in detail, digitized, reconstructed, and categorized, and then we would have the opportunity to let other people view our life history (assuming that minds in a singularity are past silly notions of privacy and embarrassment or whatever).

That means all those 'in your head' comments you make when having conversations might be up for review, or to be laughed at. Every now and then I make comments in my head that are intended for a transhuman audience watching a reconstruction of my life.

The idea actually has roots in my attempt to understand a heaven that existed outside of time, back when I was a believer. If heaven was not bound by time and I 'met the requirements', I was already up there looking down at a time-line version of my experience on earth. I knew for sure I'd be interested in my own life so I'd talk to the (hopefully existing) me in heaven.

On another note, I've been wanting to write a sci-fi story where a person slowly discovers they are an artificial intelligence led to believe they're human, being raised on a virtual earth. The idea is that they are designed to empathize with humanity in order to create a Friendly AI. The person starts gaining either superpowers or super-cognition as the simulators become convinced the AI person will use their power for good over evil. Maybe even have some evil AIs from the same experiment to fight. If anyone wants to steal this idea, go for it.

Assuming I understood this correctly, you're saying a true AI might find our morality as arbitrary as we find pebble heap sizes, say bugger the lot of us, and turn us into biomass for its nano-furnace.

Could you not argue for Occam's Razor from the conjunction fallacy? The more components that are required to be true, the less likely it is that they are all simultaneously true. Propositions with fewer components are therefore more likely, or does that not follow?
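As a rough sketch of that intuition (my own notation, not anything from the original post): if a proposition requires n independent components, each true with probability less than 1, then

$$P(A_1 \land A_2 \land \dots \land A_n) = \prod_{i=1}^{n} P(A_i) \le \min_i P(A_i)$$

so every extra required component can only lower the joint probability, which seems to be the probabilistic core of preferring simpler hypotheses.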

(Deuteronomy 13:7-11)

Talk about a successful meme strategy! No wonder we still have this religion today. It killed off its competitors.

As scary as anosognosia sounds, we could be blocking out alien brain slugs for all we know.

We're really only just now able to identify these risks and start posing theoretical solutions to attempt. Our ability to recognize and realistically respond to these threats is catching up. I think saying that we lack good self-preservation mechanisms is a little unfair as criticism.


You wouldn't give up one IQ point for, say, 10 million dollars? It would be a painful decision, but I'm convinced I could have a much better effect on the world with a massive financial head start and only the slightest detriment to my intelligence. A large enough sum of money would let me stop working and study and research for the rest of my life, probably leaving me more intelligent in the long run. Right now, I have to waste my time with a superior level of intelligence just to pay for food, shelter, and student loans.
