Transfuturist

Comments

Lesswrong 2016 Survey

Most of those who have never been on Less Wrong will provide data for exactly that distinction. It isn't noise.

Lesswrong 2016 Survey

This is a diaspora survey, for the pan-rationalist community.

Lesswrong 2016 Survey

I have taken the survey. I did not treat the metaphysical probabilities as though I had a measure over them, because I don't.

Open Thread Feb 22 - Feb 28, 2016

I guess the rejection is more based on the fact that his message seems like it violates deep-seated values on your end about how reality should work than his work being bullshit.

Lumifer rejects him because he thinks Simon Anhold is simply a person who isn't serious but a hippy.

How about you let Lumifer speak for Lumifer's rejection, rather than tilting at straw windmills?

Open Thread Feb 22 - Feb 28, 2016

The equivocation on 'created' across those four points is enough reason to ignore them entirely.

Open Thread Feb 22 - Feb 28, 2016

I'm curious why this was downvoted. Was it the last statement, the one with political context?

Open Thread Feb 22 - Feb 28, 2016

Are there any egoist arguments for (EA) aid in Africa? Does investment in Africa's stability and economic performance offer any instrumental benefit to a US citizen who does not terminally care about the welfare of Africans?

The Ultimate Testing Grounds

We don't need to describe the scenarios in precise physical detail. All we need to do is describe them in terms of the agent's epistemology, with the same sort of causal surgery as described in Eliezer's TDT. Full epistemological control means you can test your AI's decision system.
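As a rough illustration of what causal surgery over an agent's epistemology could look like, here is a minimal Python sketch. Everything in it is a hypothetical toy: the CausalModel class, its do and evaluate methods, and the lever/luck/reward nodes are illustration-only names, not anything specified by TDT itself.

```python
# Hypothetical sketch: a tester holds the agent's world-model as a causal
# graph and performs "causal surgery" -- severing a node's incoming edges
# and pinning its value -- before querying the counterfactual outcome.

class CausalModel:
    def __init__(self, parents, functions):
        self.parents = parents      # node -> list of parent nodes
        self.functions = functions  # node -> function of the parents' values

    def do(self, node, value):
        """Return a surgically modified copy with `node` forced to `value`."""
        surgered = CausalModel(dict(self.parents), dict(self.functions))
        surgered.parents[node] = []               # cut incoming causal edges
        surgered.functions[node] = lambda: value  # pin the node's value
        return surgered

    def evaluate(self, node, cache=None):
        """Compute a node's value from its (possibly surgered) ancestors."""
        cache = {} if cache is None else cache
        if node not in cache:
            args = [self.evaluate(p, cache) for p in self.parents[node]]
            cache[node] = self.functions[node](*args)
        return cache[node]

# A toy world-model: the agent believes "reward" depends on "lever" and "luck".
model = CausalModel(
    parents={"luck": [], "lever": [], "reward": ["lever", "luck"]},
    functions={
        "luck": lambda: 1,
        "lever": lambda: 0,
        "reward": lambda lever, luck: lever * luck,
    },
)

# The tester probes the decision system: which action wins under surgery?
best = max((0, 1), key=lambda a: model.do("lever", a).evaluate("reward"))
print(best)  # -> 1: pulling the lever is counterfactually better
```

The point of the sketch is only that the intervention happens inside the agent's own model of the world, so the tester never has to simulate the underlying physics.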

This is a more specific form of the simulational AI box. The rejection of simulational boxing I've seen relies on the AI being free to act and sense with no observation possible, treating it like a black box: the AI somehow gains knowledge of the parent world through inconsistencies and probabilities, then escapes through bugs in its containment program. White-box simulational boxing, by contrast, can completely compromise the AI's apparent reality and its actual abilities.
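To make the black-box vs. white-box contrast concrete, here is a minimal sketch of a white-box harness under toy assumptions; WhiteBoxHarness, ScriptedWorld, and EchoAgent are hypothetical names invented for this example. Since every percept the agent receives and every effect of its actions is authored by the tester, there is no channel through which the agent can observe the parent world.

```python
# Hypothetical sketch: the harness fully mediates the boxed agent's
# perception and action, so the tester controls its apparent reality.

class ScriptedWorld:
    """A tester-authored environment; the agent never sees anything else."""
    def __init__(self, script):
        self.script = list(script)  # percepts the tester chooses to present
        self.log = []               # every agent action, recorded for audit

    def render(self):
        return self.script.pop(0) if self.script else "blank"

    def apply(self, action):
        self.log.append(action)     # actions affect only this sandbox

class EchoAgent:
    """Stand-in for the boxed AI's decision system under test."""
    def act(self, percept):
        return f"respond:{percept}"

class WhiteBoxHarness:
    """Routes all perception and action through tester-controlled code."""
    def __init__(self, agent, world):
        self.agent = agent
        self.world = world

    def step(self):
        percept = self.world.render()     # tester decides what is "seen"
        action = self.agent.act(percept)  # agent acts on that percept alone
        self.world.apply(action)          # tester decides what actions "do"
        return action

harness = WhiteBoxHarness(EchoAgent(), ScriptedWorld(["red door", "green door"]))
for _ in range(2):
    harness.step()
print(harness.world.log)  # -> ['respond:red door', 'respond:green door']
```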

Open thread, Oct. 5 - Oct. 11, 2015

Stagnation is actually a stable condition. It's "yay stability" vs. "boo instability," and "yay growth" vs. "boo stagnation."

Digital Immortality Map: How to collect enough information about yourself for future resurrection by AI

(Ve could also be copied, but it would require copying the whole world.)

Why would that be the case? And if it were the case, why would that be a problem?
