Viliam

9 · Viliam's Shortform · 5y · 239

Comments (sorted by newest)
How I Became a 5x Engineer with Claude Code
Viliam · 1d

I use Claude to generate code that I fix by hand. It is still less work than writing it myself. I am mostly using it on hobby projects (example) that probably otherwise wouldn't get done.

Narcissism, Echoism, and Sovereignism: A 4-D Model of Personality
Viliam · 1d

The latter. Although different people draw the lines elsewhere, and also different people can do the same thing for different reasons.

Is it pathological if someone sends 50% of their salary to charity? Many people would say yes. Some people would say no.

Is it pathological to work yourself to death for a socially accepted reason, such as making enough money to get your children to an expensive university?

From my perspective, the causes of the harmful behavior for the items in the list are:

  • can't test reality
  • can't feel worthy
  • can't accept shame
  • can't accept being loved
  • can't accept other people dying for preventable reasons
  • can't feel preferences
  • can't test reality
  • can't accept immodesty
  • can't accept reality

That one item feels significantly different, although maybe it is just a matter of opinion.

Halfhaven virtual blogger camp
Viliam · 2d

Yes, if the Discord invite link still works, please join. If it does not, tell me and I will post a new one later.

What is Lesswrong good for?
Viliam · 3d

Yeah, "collective sense-making" feels right to me. Individual aspiring rationalists sometimes say crazy things, but the rest of the group usually corrects them when they do.

As opposed to (in my opinion typical) situations outside of Less Wrong where:

  • low-status people do not dare to say unexpected things
  • when high-status people say something, no one dares to contradict them

So either the truths do not appear, or the falsehoods do not disappear. Basically a group of normies is usually approximately as smart/sane as its highest-status member. Aspiring rationalists do better (although not perfectly) at merging individual knowledge into a smarter whole.

Thus you can have rare specialists talk about e.g. covid or crypto and have an impact on the community at large, as the community collectively evaluates it as probably correct. But this is not the same as "open-mindedness" as usually understood, because the community can also collectively reject various things; otherwise we would see all kinds of scams and hype here.

(In this context, it is worth paying special attention to various rationalist cults, but it seems to me that they all happened in the periphery of the community, at places isolated from the collective feedback. Again, some of us are individually quite insane, but we are collectively sane. Insanity prevails when a charismatic person succeeds at creating an isolated bubble of wannabe rationalists.)

Narcissism, Echoism, and Sovereignism: A 4-D Model of Personality
Viliam · 3d

The person who proudly works themselves to death at a charity, because they value their own life as much as someone else’s and someone else is dying.

This one feels different from the remaining options, because there is no exploitative bad actor, no factual mistake, and in a hypothetical universe where no one is dying, this behavior would stop.

Dalmert's Shortform
Viliam · 4d

I think the official Catholic version is that there is... some kind of "lesser heaven"... for the unborn children. So they kinda win the lottery in avoiding hell, but lose the lottery in experiencing the full heaven.

AdamLacerdo's Shortform
Viliam · 4d

If that's what will make solar panels more popular in the USA, I say let's do it!

How do we know when something is deserving of welfare?
Viliam · 5d

Not sure if this is obvious, but maybe instead of just yes and no we should assign numbers, like "X matters 10 times more than Y", and then it is obvious that Y matters, but also that it does not matter as much as X.

That would solve some philosophical issues, like we could decide that each particle in the universe has an inherent moral worth, it's just that their moral worths, even taken together, are negligible.

And it would open other issues, like how to calculate it specifically, and whether it can even be added linearly (maybe 100 A's and 100 B's are worth more than 200 A's alone or 200 B's alone, because there is a bonus for diversity), etc.
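A minimal sketch of that non-linear idea, where the concave exponent and the specific counts are arbitrary illustrative assumptions rather than anything argued for above:

    def total_worth(counts, exponent=0.9):
        # Concave per-type aggregation: each additional individual of the
        # same type adds a bit less, so a mixed population scores higher
        # than a uniform one of the same total size (a "diversity bonus").
        return sum(n ** exponent for n in counts.values())

    print(total_worth({"A": 200}))            # ~117.7
    print(total_worth({"A": 100, "B": 100}))  # ~126.2, beats 200 of one kind

With exponent = 1 the aggregation is linear and the bonus disappears, which is exactly the "can it even be added linearly" question.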

Don't Mock Yourself
Viliam · 5d

Sometimes people make a mistake because they desperately try to avoid making a different mistake. That's what sometimes locks them in the bad place: "but if I stop doing X... wouldn't that make me Y?"

There is another group of people who approximately never think a negative thought about themselves, and those are the narcissists. They know that it's everyone else who sucks and is responsible for everything bad.[1]

That could be an (unspoken) obstacle to getting rid of the self-negativity: "but won't that make me a narcissist?" or "but won't that make my parents/friends believe that I am a narcissist?".

Ironically, this behavior could have started in the past as an attempt to appease some narcissist in the victim's environment. "If I keep acknowledging that I suck, maybe they will stop attacking me so much?"

But there is a third option, which is simply to abandon the negative thoughts, without redirecting them.

[1] Some people insist that actually, deep down, the narcissists are deeply insecure, and their outward behavior is merely their desperate attempt to push that internal negativity away. Unless I get some data to support this, I am going to assume that this is just another case of the typical mind fallacy: someone who has negative thoughts about themselves failing to imagine that someone else might simply not have them. If it is possible for a healthy person to have no negative feelings about themselves, why wouldn't it also be possible for the right kind of unhealthy person?

Adele Lopez's Shortform
Viliam · 6d

Thank you, the description is hilarious and depressing at the same time. I think I get it. (But I suspect there are also people who were already crazy when they came.)

I am probably still missing a lot of context, but the first idea that comes to my mind is to copy the religious solution and do something like Sunday at church, to synchronize the community. Choose a specific place and a repeating time (could be e.g. every other Saturday or whatever) where the rationalists are invited to come and listen to some kind of news and lectures.

Importantly, the news and lectures would be given by people vetted by the leaders of the rationality community. (So that e.g. Ziz cannot come and give a lecture on bicameral sleep.) I imagine e.g. 2 or 3 lectures/speeches on various topics that could be of interest to rationalists, and then someone gives a summary of what interesting things have happened in the community since the last event, and what is going to happen before the next one. Afterwards, people either go home, or hang out together in smaller groups unofficially.

This would make it easier to communicate stuff to the community at large, and also draw a line between what is "officially endorsed" and what is not.

(I know how many people are allergic to copying religious things -- making a huge exception for Buddhism, of course -- but religions do have a technology for handling some social problems.)

90 · Halfhaven virtual blogger camp · 18d · 9
32 · Wikipedia, but written by AIs · 2mo · 9
36 · Learned helplessness about "teaching to the test" · 4mo · 16
27 · [Book Translation] Three Days in Dwarfland · 5mo · 6
43 · The first AI war will be in your computer · 6mo · 10
110 · Two hemispheres - I do not think it means what you think it means · 8mo · 21
26 · Trying to be rational for the wrong reasons · 1y · 9
32 · How unusual is the fact that there is no AI monopoly? [Question] · 1y · 15
37 · An anti-inductive sequence · 1y · 10
30 · Some comments on intelligence · 1y · 5