dripgrind


Comments

Coordination Problems in Evolution: The Rise of Eukaryotes

Nick Lane's book The Vital Question has a great discussion of endosymbiosis in terms of metabolism. The book's central point is that all metabolism is powered by a proton gradient. Maintaining that gradient becomes very inefficient in a larger cell, so having smaller subcompartments within the cell where metabolism can take place (like mitochondria) is vital for getting bigger. (There are some giant bacteria, but they have unusual metabolic adaptations.) He also discusses why mitochondria need to retain the key genes for metabolism - I think it's to do with timely regulation.

Thwarting a Catholic conversion?

[Executive summary: solve the underlying causes of your problem by becoming Pope]

I think it's a mistake to focus too much on the case of one particular convert to Catholicism simply because you know her personally. To do that is to fall prey to the availability heuristic.

The root cause of your problem with your friend is that the Catholic Church exists as a powerful and influential organisation which continues to promote its weird dogma, polluting Leah's mind along with millions of others. Before investing time and effort trying to flip her back to the side of reason, you should consider whether you could destroy the Church and dam the river of poison at its source. I will now outline a method…

On the unpopularity of cryonics: life sucks, but at least then you die

When I said "you assume people have to invest their own money to ensure their health" I was obviously referring to preventative medical interventions, which is what you were actually asking about, not cryonics.

The breast/ovarian cancer risk genes are BRCA1 and BRCA2 - I seem to remember reading that half of carriers opt for some kind of preventative surgery, although that was in a lifestyle-magazine article called something like "I CUT OFF MY PERFECT BREASTS", so it may not be entirely reliable. I'm sure it's not just a tiny minority who opt for it, though; there are probably better figures on Google Scholar.

If you consider the cost of taking statins from age 40 to 80, that adds up to a pricey intervention in total.

Maybe few people use expensive preventative measures because few such measures exist - or because few have benefits that outweigh the side effects, pain, and costs - not because people don't want them in general. If there were a pill that cost $30,000 and made you immune to all cancer with no side effects, I'm sure everyone would want it.

I think the real issue is that people don't consider cryonics to be "healthcare". That seems reasonable, because it's a mixture of healthcare and time travel into an unknown future where you might be put in a zoo by robots for all anybody knows.

The $125,000 Summer Singularity Challenge

Only on this site would you see perfectly ordinary charity fundraising techniques described as "dark arts", while in the next article over, the community laments the poor image of the concept of beheading corpses and then making them rise again.

On the unpopularity of cryonics: life sucks, but at least then you die

Women with a high hereditary risk of breast cancer sometimes opt to have both their breasts removed pre-emptively. People take statins and blood pressure drugs for years to prevent heart attacks. Don't you have eye tests and dental checkups on a precautionary basis? There's plenty of preventative medical care.

Maybe the availability and marketing varies between countries - the fact that you assume people have to invest their own money to ensure their health suggests you're from the US or another country with a bad healthcare system. My country has a national health service which takes an interest in encouraging preventative medicines like statins, helping people give up smoking, and so on, since that saves it money overall. I'm sure the allocation of preventative care is far from ideal and shaped by political and social factors and drug company lobbying, but it does exist.

It would be a bad tradeoff to go through a painful appendectomy to prevent the small chance that you might get appendicitis - you can have your appendix removed when it's actually infected, the appendix may have an evolutionary function as a reservoir of gut bacteria, and it can also be used to reconstruct the bladder.

My true rejection

You're right that the motivation would be obvious today (to a certain tiny subset of geeky people). But what if there had been a decade of rising anti-AI feeling amongst the general population before the assassinations? Marches, direct actions, carried out with animal-rights style fervour? I'm sure that could all be stirred up with the right fanfiction ("Harry Potter And The Monster In The Chinese Room").

I understand what ethical injunctions are - but would SIAI be bound by them given their apparent "torture someone to avoid trillions of people having to blink" hyper-utilitarianism?

My true rejection

> To build a superintelligence that actually maximizes IBM's share price in a normal way that the CEO of IBM would approve of would require solving the friendly AI problem but then changing a couple of lines of code.

That assumes that being Friendly to all of humanity is just as easy as being Friendly to a small subset.

Surely it's much harder to make all of humanity happy than to make IBM's stockholders happy? An FAI that does the latter is far less constrained, but it's still not going to convert the universe into computronium.

My true rejection

I'm not seriously suggesting that. Also, I am just some internet random and not affiliated with the SIAI.

I think my key point is that the dynamics of society are going to militate against deploying Friendly AI, even if it is shown to be possible. If I do a next draft I will drop the silly assassination point in favour of tracking AGI projects and lobbying to get them defunded if they look dangerous.

My true rejection

OK, what about the case where there's a CEV theory which can extrapolate the volition of all humans, or a subset of them? It's not suicide for you to tell the AI "coherently extrapolate my volition/the shareholders' volition". But it might be hell for the people whose interests aren't taken into account.
