topynate

topynate's Comments

The Useful Idea of Truth

The first image is a dead hotlink. It's in the Internet Archive, and I've uploaded it to imgur.

LINK: Superrationality and DAOs

That very much depends on what you choose to regard as the 'true nature' of the AI. In other words, we're flirting with the reification fallacy by regarding the AI as a whole as 'living on the blockchain', or even being 'driven' by the blockchain. It's important to fix in mind what makes the blockchain important to such an AI and to its autonomy. This, I believe, is always the financial aspect. The on-blockchain process is autonomous precisely because it can directly control resources; it loses autonomy insofar as its control of resources no longer fulfils its goals. If you wish, you can consider the part of the AI which verifies correct computation and interfaces with 'financial reality' as being its real locus of selfhood, but bear in mind that even the goal description/fulfilment logic can be zero-knowledge-proofed and so exist off-chain. From my perspective, the on-chain component of such an AI looks a lot more like a combination of robotic arm and error-checking module.

Confound it! Correlation is (usually) not causation! But why not?

There's an asymptotic approximation in the OEIS: a(n) ~ n!2^(n(n-1)/2)/(M*p^n), where M and p are constants. So log(a(n)) = O(n^2), as opposed to log(2^n) = O(n), log(n!) = O(n log(n)), and log(n^n) = O(n log(n)).
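The growth rate is easy to check numerically. A minimal sketch, assuming the sequence in question is the count of labeled DAGs on n nodes (which matches the quoted asymptotic a(n) ~ n!2^(n(n-1)/2)/(M*p^n)), using the standard inclusion–exclusion recurrence for that count:

```python
from math import comb, log2

def labeled_dags(n_max):
    """a(n) = number of labeled DAGs on n nodes, via the standard
    inclusion-exclusion recurrence over the set of source nodes."""
    a = [1]  # a(0) = 1 (the empty DAG)
    for n in range(1, n_max + 1):
        a.append(sum((-1) ** (k - 1) * comb(n, k) * 2 ** (k * (n - k)) * a[n - k]
                     for k in range(1, n + 1)))
    return a

a = labeled_dags(12)
print(a[:5])  # [1, 1, 3, 25, 543]

# log2(a(n)) / n^2 tends toward 1/2, consistent with log(a(n)) = O(n^2):
# the dominant term in log2(a(n)) is n(n-1)/2.
for n in (4, 8, 12):
    print(n, log2(a[n]) / n ** 2)
```

The ratio log2(a(n))/n^2 converges slowly (there's an n log n correction from the n! factor), but it's already visibly bounded near 1/2 for small n, unlike log(2^n)/n^2 or log(n!)/n^2, which go to zero.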

War and/or Peace (2/8)

I want a training session in Unrestrained Pessimism.

Learning languages efficiently.

As someone who moved to Israel at the age of 25 with very minimal Hebrew (almost certainly worse than yours), went to an ulpan for five months and then served in the IDF for 18 months while somehow avoiding the 3 month language course I certainly should have been placed in based on my middle-of-ulpan level of fluency:

Ulpan (not army ulpan, real ulpan) is actually pretty good at doing what it's supposed to. I had a great time - it depends on the ulpan but I haven't heard of a single one that would be psychologically damaging. Perhaps your experience with a less intensive system as a minor has coloured your views? I know that I got put off Hebrew by the quality of teaching I had around the age of 11-13. I'm not sure if you could get benefits to do a free course (it would depend on your status) but that would certainly take off the pressure to learn Hebrew quickly. You'd have to delay your draft date, which is usually possible.

'Army ulpan' is, according to my friends, a bit of a joke, but that's three months you'd be with a bunch of Anglos, being taught by 19 year old girls, and going on semi-regular day trips, which is fun, rather than jumping straight into basic training, which sucks. It's also three months less time being bored to tears at the end of your service doing the same thing you've been doing the last two years.

You can't learn spoken Hebrew by reading. No way. Not only do you need grammatical knowledge to know which vowels should be used, but the spoken and written forms become quite divergent above the most basic level. You need to speak and hear Hebrew for most of the day, every day - which could be a pretty lonely experience in the US. Think Hebrew pop music, armed with a copy of the lyrics and the translation. Learn the songs and what they mean - it's just repetition - and you'll automatically pick up the most common vocabulary. Hebrew grammar isn't that hard for an English speaker; verb conjugation is traditionally considered the hard part, and that's mostly just memorization. Genders are a pain, but not knowing the gender of a word won't impair comprehension if you guess wrongly.

Recreational Cryonics

Then perhaps my assessment was mistaken! But in any case, I wasn't referring to the broad idea of cryonics patients ending up in deathcubes, but to their becoming open-access in an exploitative society - cf. the Egan short.

I Will Pay $500 To Anyone Who Can Convince Me To Cancel My Cryonics Subscription

It is likely that you would not wish for your brain-state to be available to all and sundry, subjecting you to the possibility of being simulated according to their whims. However, you know nothing about the ethics of the society that will exist when the technology to extract and run your brain-state is developed. Thus you are taking a risk of a negative outcome that may be less attractive to you than mere non-existence.
