I don't think I ever ran into that when I was younger. Meeting in houses is the original way Christians met, so I think it would be weird to complain about it. I found it pretty common for people to make fun of the opposite. If you're spending your church money on a big fancy building, does that really show your dedication to church teachings like charity*?

Also, people might accuse a really small church group of being culty, but a small church group with a big fancy building feels much cultier than the same group meeting in a house.

I was only really exposed to Evangelical Christianity so it's possible this is very different among other groups like Catholics.

* Churches typically justify this in terms of practicality (more space to work with) and evangelism (a prominent building helps attract newcomers).

This is evidence against the claim that debating opinions which are not widely held, or even considered "conspiracy theories", just gives them a platform and strengthens their credibility in the eyes of the public.

I'm not sure if this really applies here, since Lab Leak was never really treated as a crazy/fringe idea among rationalists. In fact, it looks like it was the majority opinion before the debate and ACX posts.

I think historically frying would have used olive oil or lard though.

Don't forget the standard diet advice of avoiding "processed foods". It's unclear what exactly the boundary is, but I think "oil that has been cooking for weeks" probably counts.

Having 1.6 million identical twins seems like a pretty huge advantage though.

Just curious, but if you found a big group house you liked where everyone had kids, would you be interested? I guess it would have to be a pretty big house.

You should probably take reverse causation into account here. I doubt the effect of the school is nearly as strong as you think, since people who want finance jobs are drawn to the schools known for getting people finance jobs. Add to that that the schools known for particular career paths are outliers: if you go to a random state school, the students will have much more varied interests.

Any chance you can link to that discussion? I'm really curious.

When people talk about p(doom), they generally mean the extinction risk directly from AI going rogue. The way I see it, that extinction-level risk comes mostly from self-replicating AI. An AI that can design and build silicon chips (or some equivalent) can also build guns, but an AI designed to operate a gun doesn't seem any more likely to be good at building silicon chips.

I do worry that AI in direct control of nuclear weapons would be an extinction risk, but for standard software engineering reasons (all software is terrible), not for AI-safety reasons. The good news is that I don't think there's any compelling reason to put nuclear weapons directly in the hands of AI. The practical nuclear deterrent is submarines, and they don't need particularly fast reactions to be effective.

Answer by Brendan Long, Apr 07, 2024

While military robots might be bad for other reasons, I don't really see the path from this to doom. If AI powered weaponry doesn't work as expected, it might kill some people, but it can't repair or replicate itself or make long-term plans, so it's not really an extinction risk.
