The numbers you find on the internet suggest that there are currently more slaves in the world than there were in America at the height of American slavery.

It's not merely a historic problem.

To me, your post looks like you lay out your own position without really engaging with why people hold the opposite position, and you strawman people by saying that they lack imagination.

Quite recently jeffk wrote Consent Isn't Always Enough.

There's no need for anything to be covert. NetDragon Websoft already has a chatbot as CEO. That chatbot can get funds wired by giving orders to employees.

If the chatbot were a superintelligence, that would allow it to outcompete other companies.

I think the news matters more as a signal that people are thinking about making an AI a CEO than for what happens in this particular company.

There are various different experiences that people have that they consider valuable.

I can read or consume a LessWrong post and consider that experience valuable. On the other hand, I might also write or produce a LessWrong post and consider that experience valuable. 

For every person, you can look at what percentage of the experiences they value are consumption and what percentage are production. There are also other experiences, like having a conversation with a friend, that are neither consumption nor production.

One way to define the term consumerism would be to say that if most of the experiences people value are consumption experiences, that's consumerism.

The issue with a lot of consumerist experiences is that while they are nice, they don't feel deeply meaningful. It mostly feels more meaningful to write a LessWrong post than to read one, yet it's easier to read than to write. If you look at metrics such as the number of close friends the average American has, they seem to be going down.

When fewer people find the experience of tinkering and building new things intrinsically valuable and spend a lot of time on it, that's in turn bad for innovation in society overall.


As an aside, the interview discusses David Graeber's bullshit job thesis. David Graeber defines a bullshit job as a job where the person doing the job thinks it's bullshit. When I hear about environmental impact assessments taking years, I would expect that it's easy for the person writing the assessment to think it's bullshit. 

I have a friend who does some number crunching for creating ESG numbers for a bank and who sees his work as bullshit work. There are a lot of bureaucratic rules that create bullshit work, and for the progress movement it's important to reduce that bullshit work that rules like NEPA produce. 

From a progress perspective, David Graeber was much better than your average person on the left. If you haven't seen it, the debate where David Graeber and Peter Thiel agree on most points about the Great Stagnation is worth watching.

Yes, you need human cooperation, but human cooperation isn't hard to get. You can easily pay people to do what you want.

With time, more processes can use robots instead of humans for physical work, and if the AGI already has all the economic and political power, there's nothing to stop it from making that switch.

The AGI might then repurpose land that's currently used for growing food and step by step reduce the amount of food that's available. There never needs to be a point where a human thinks they are working toward the destruction of humanity.

Most actions by which actors increase their power aren't directly related to weapons. Existential danger comes from one AGI actor getting more power than human actors. 

A lot of the resources invested into "fighting misinformation" are about censoring non-establishment voices, and that often includes putting out misinformation like "Hunter's laptop was a Russian misinformation campaign" to facilitate political censorship.

In that environment, someone proposing a new truthseeking project might be interested in creating a project to strengthen the ruling narrative, or they might be interested in actual truthseeking that affirms the ruling narrative when it's right and challenges it when it's wrong.

In a world with so much political pressure, it probably takes strong conviction to run a project that does actual truthseeking instead of being co-opted for narrative control.

Given that Politics is the Mind-Killer, there's no good reason to lead with examples that are this political.

Do we know whether both use the same amount of compute?
