Hey, thanks for posting this! It was a really interesting read. I do want to add, though, that church planting is not just an evangelical thing. My family goes to a centuries-old church that is part of the Free Methodist denomination, and it has sent out multiple church plants over the past couple of years. However, the method of planting tends to be somewhat different from what you described. The church has just one main pastor but around 6-7 other pastors (although they might be called deacons in other denominations). They preach occasionally but mostly help out with one specific thing; for example, there is a youth pastor, a communications pastor, and a social justice pastor. Every so often, one of these pastors will, with the support of our church, decide that they would be more fulfilled leading their own church, or that they want to set up a church in a nearby area they see as underserved. When this happens, they typically start a new church some distance away and bring along some families from our church. However, because of the way the Free Methodist denomination works, the new church is in no way subordinate to its parent church and usually grows into its own separate entity (although the two churches stay close because the original planters still have relationships with the people from our church). Knowing how it is done in the Free Methodist denomination made this more interesting because of the contrast, so I thought I would share it for other people to think about the differences!
This metaphor was pretty great; I did the same thing as a child!
I find this hilarious, but also a little scary. As in, I don't base my choices/morality on what an AI says, but I see in this article a possibility that I could be convinced to do so. It also makes me wonder, since LLMs are basically curated repositories of almost everything that humans have written, whether the true decision theory is just "do what most humans would do in this situation".
I guess my point is that there are diminishing diplomatic/power rewards from increasing the number of nuclear weapons in your stockpile. While having nuclear capability is certainly important to being considered a superpower, the advantage the US gains over China by having a nuclear arsenal way bigger than the Chinese one is, in my view, relatively small. China still has enough nuclear weapons to make launching missiles at it a really bad idea for a US president who wants to keep his job/his party's political power/his citizens safe (even accounting for the possible incompetence of China's nuclear force - see this report). Also, having a no-first-use policy would matter more if China's leader were bound by his country's laws, which he unfortunately is not.
On the other hand, China is definitely trying to build those alliances and the global influence that you speak of. One example would be the Belt and Road Initiative, through which China is pouring money into low-income countries in Asia and Africa. Also, the fact that China has a smaller nuclear arsenal and less advanced delivery systems for its warheads is somewhat irrelevant, since it still has an arsenal that could destroy all of the major American population centers more than twice over.
I took the Dark Factor test and got a very low score, but I kept second-guessing myself on the answers. I did that because I wasn't sure what my actions in a real-life scenario would be. Even though I had good intentions and I believe that other people's well-being has inherent value, I would put a high probability on getting at least a slightly higher score if this were a real-world test I didn't know I was taking. That makes me pessimistic about the data that the authors cite in this article. If (for example) "over 16% of people agree or strongly agree that they 'would like to make some people suffer even if it meant that I would go to hell with them'" when they know they are being tested for malevolent traits, how many people actually would do that given the choice? Also - for people who believe in hell, I hope this question reflects a scope insensitivity problem, since infinite time being tortured seems to me to have infinite negative utility, so you would need to value harming others more than helping yourself to agree with that statement.
I agree, but so many other things are different in this fan-fic, and Eliezer is smart enough that I wouldn't be surprised if it turns out to be like that for a reason.
This is a good example of a time when it would actually be worthwhile to know about the philosopher's zombie debate.
It says two comments for me before I posted this, so it looks like it has been fixed.
It may be that the denomination as a whole is classified as evangelical, but our specific church definitely doesn't feel like it (speaking as someone with experience). All the pastors went to seminary, even the youth pastor and the social justice pastor, and there is less of an established hierarchy than in most evangelical churches. The pastor is not in charge of the church; rather, the elected board is. There is also less emphasis on spreading the Bible and more on caring for the community the church is in.