All of Wes F's Comments + Replies

Nevermind. Another comment explained it. I would greatly appreciate that option also being put under settings! I would have found it much easier.

Putting it under settings does sound reasonable.

I would love an option to say "I don't want to read another word about AI alignment ever"

What? How? I've found like 3 different "customize" options, and none of them are this.

Side note: I've noticed that web & app developers these days try to make settings "intuitive" instead of just putting them all in one place, which I think is silly. Just put all settings under settings. Why on Earth are there multiple "customize" options?


I read via RSS, and I mostly just skip LW articles because it's almost all AI-related, and I'm not interested in that. It would be very nice if I could get a non-AI RSS feed (or even better - a customizable RSS feed where I can include or exclude certain tags). 

It really does feel to me, though, that LessWrong is not for me because I'm not interested in AI alignment. LW doesn't strike me as a rationality-focused site. It's a site for talking about AI alignment that offers some rationality content as an afterthought. It sounds like you don't really want to change that, though, so it is what it is.

I disagree with this comment. Veaux seems awful, but a bad messenger doesn't make the message bad, and his coauthor (one of his ex-partners and accusers) still, AFAICT, endorses the book. In any case, I continue to believe it contains good relationship advice.

Can you say more about what you think is good about the advice, and why that book in particular is the best source of it? The co-author didn't immediately retract the book, but has since said that at least one of its core models strips people of defenses against abuse, and that that wasn't necessarily the worst thing in the book, just the easiest thing to point to.

For anyone who wants to see what postrat twitter is all about but doesn't know where to start, I made a list of the people I was able to identify as ingroup, and I regularly add people. You can follow it here:

The LW podcast list seemed a little outdated and didn't provide much information, so I made a list of rationalist podcasts. If you have a podcast and would like it added to the list, please let me know. For now I'm only including podcasts that are currently updating regularly.

Yoav Ravid (2y):
Are you talking about this? If so, why didn't you just edit it?

I'm not understanding your disagreement. Of course popularity is just a prior. The less popular a given position, the stronger your prior should be against the other person holding it. Doing that will lead you to be less wrong about what the other person means by what they're saying.

What is the difference between what you said and "your prior for whether someone believes an unpopular position should be lower the less popular the position is, and you should update your prior based on how clear their statement was"?

I think "Survival Day" evokes the themes you're going for.

I love this idea. Some of my thoughts:

I would like if the script for the event had some fill in the blanks where people can express themselves. Reading from a script can build connection, but more so if people are encouraged to show some of their own uniqueness as part of it.

Perhaps there could be a tradition where 20-30 or so people are invited and attend, but anyone is welcome to observe (i.e., join the video stream, but read-only or with their mic muted). The worries about event size are real, but it would also be nice if people who didn't have an event to attend could still participate somehow, and technology allows that to happen in a non-intrusive way.

Rana Dexsin (3y):
I would be very worried about that last idea turning into a “performance” in terms of social instinct. In what other social contexts do we have twenty or thirty people actively doing something while any number of people watch? When they're on stage…

I do not fully understand the point you are making in (1). I don't see anything specifically to disagree with, but also don't see how it's in conflict with anything in the OP. I hold that my feelings are my basic unit of value because that's what I care about. If a different person cares about different things, that's their decision. My feelings are in constant flux, and will often change. Is that somehow in conflict with something I've said? My thoughts on egoism are more fully fleshed out in the linked post.

I'm mostly ...

I don't know how correct this post is, but I highly approve of it as an effort to understand and humanize a pretty strong outgroup.

Theory 4: people only care about poor people in the in-group. People like welfare when it helps people similar to themselves, and hate it when it "enables" stupid lazy people in the out-group. This is often heavily racial: the countries with the most generous welfare systems are also the most racially homogeneous. So whether people vote in favor of welfare is determined by who they imagine being helped. This can change rapidly, leaving public opinion on welfare a confusing mess.

This is a good post about how to become a more reliable person. I often find it incredibly frustrating when people flake on me, and I know very few people who don't consistently flake on plans.

I think this is good advice for people who want to become more reliable. However, I often suspect that the people who consistently flake don't want to be more reliable. My suspicion is that people want to signal the behavior by *saying* they're going to do something without actually doing it. Much like a politician making campaign promises, there ...

This seems true, but I suspect that advocating for people to become more accurate predictors (instead of telling them to "show up on time") would make it feel more reasonable (to both parties) to hold people to a higher standard.