
In the same spirit, some questions on the post itself:

- Could you be flattering rationalists here by telling them all their debates and disagreements are signs of their healthy culture?
- Could you be using contrarianism as a badge of identity yourself, a way to find community with fellow contrarians?
- Are you sure you're not using your description of 'woke' culture as a way to attack what is here an outgroup, rather than as a fair description of a purity phenomenon that happens in many ideologies?

Not saying I know the answers to these questions, but it's always worth turning the lightbeam inwards now and then.

I guess it'd be helpful to understand more about why you think class consciousness is in conflict with using "reason, negotiation, justice, goal factoring and pulling the rope sideways".

I would think (decent) trade union activity was precisely interested in reasonable negotiations targeted at justice for a group of people.

Automating much of the economy is more than a little way off, and is highly likely to bring its own problems, which I would expect to cross-cut with all these issues. I personally doubt that (in the event humans are not sidelined altogether) advances in AI would make demographic transition much economically easier, but I think that's in the realm of speculation either way.

I replied before your edit so a bit more:

I agree that civilisational progress is fairly fragile. But it is fragile in both directions. Climate change and resource wars seem about as likely to lead to global conflict as internecine ethnic strife to me.

I say this partly because immigration seems to me like a force for mutual cultural understanding and trade. Without it we would probably see more closed-off nations, more likely to go to war. With too much of it, however, there can be bad side effects and cultural rifts if not managed very wisely. Where that line lies is no simple question.

I also want to advance the simple main idea that drives my views on this issue, which is that population growth HAS to level off eventually unless we colonise space. The side effects on the economy will equally have to be managed at one time or another.

Will they be easier to manage in the future? Or could growing populations make it even harder? Could managing a fall in population rates be easier if done more slowly?

Maybe. But I don't feel that's the tenor of the arguments I am hearing from rationalist and adjacent people right now.

Do you think that a large population that was reducing slowly would be something Zvi, Robin Hanson and others taking this stance would celebrate? (As opposed to what we have: a large population that is growing but showing signs of falling relatively fast in geographical/cultural pockets?)

Currently global population growth is positive but decelerating; I guess a more gradual deceleration would be less disturbing to them? But what if world population growth very gradually moved from positive to negative? Would they be happy with that?

I had assumed not but I am trying to understand what good looks like.

So is the target to keep the population as it is? Has an argument been made as to why the current population is 'correct'? Isn't it a bit arbitrary?

All the same thoughts here. I also want to understand what the plan is if we keep growing the population. Is the idea that we keep going until we reach a higher stable number, or that we literally keep growing always? If the former, what's the number and why? If the latter, does that mean the whole strategy is 100% dependent on us inhabiting space? And if that's the case, shouldn't this rather big element in the plan be made explicit?

No, I think gene manipulation can be the right thing to do but that we should face harsh legal consequences if we cause harm by doing it with anything less than extreme levels of care and caution (I think the idiot god should be put on trial frequently as well, but he is sadly hard to handcuff).

I don't disagree with any of this. But if someone commits crimes against humanity in the name of eugenics, even if by accident, the fact that the blind, amoral god's actions are even worse is in no way exculpatory. In other words, he can get away with it; you can't.

Don't you think someone whose bike has been stolen realises afterwards, without you telling them, that they should have locked it? Saying so may be fine, but it depends on how you tell them: I can imagine "Shoulda locked it" being a pretty annoying comment.
