Matt Vincent
Christian homeschoolers in the year 3000
Matt Vincent · 23d · 1 · 0

Suppose that you disagree with 80% of the people around you about a particular belief, but you're correct. If the belief is complicated, with lots of supporting premises and independent lines of evidence, then it's difficult to think about rigorously, so you're likely to rely on heuristics.

In this case, there are at least two heuristics that will push your belief toward falsehood:

  1. Social desirability bias. Unless you're unusually contrarian, you'll face psychological pressure to agree with the people around you.
  2. Availability bias. Because people tend to voice arguments that support their own conclusions, you'll be exposed to opposing and supporting arguments at a ratio of roughly 4:1. And because things that are easy to recall feel more likely to be true, you're liable to give the opposing side more credit than it's due.

In both cases, but especially the second, counteracting the biases requires you to expend effort to generate supporting arguments.
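The 4:1 ratio follows directly from the 80% figure. A quick simulation (with hypothetical numbers, purely to illustrate the arithmetic) makes the exposure skew concrete:

```python
import random

random.seed(0)

# Hypothetical community: 80% hold the opposing belief, 20% the
# supporting one. Assume each person voices one argument for their
# own side; count what a listener is exposed to.
population = ["opposing"] * 80 + ["supporting"] * 20

heard = [random.choice(population) for _ in range(10_000)]
ratio = heard.count("opposing") / heard.count("supporting")

print(f"opposing:supporting exposure ratio ~ {ratio:.1f}")
```

Under these assumptions the listener hears about four opposing arguments for every supporting one, which is the skew the availability heuristic then amplifies.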

Christian homeschoolers in the year 3000
Matt Vincent · 25d · 3 · 2

Christians are an ingroup? Tell that to any Christian living outside of the American South. Ingroup/outgroup statuses are context- and scope-dependent.

Over the last 10-20 years, Christians (particularly fundamentalists) have had very little involvement with cutting-edge AI, both on the technical side and the business side. In this sense, they're an outgroup of the people who are likely to control ASI.

Christian homeschoolers in the year 3000
Matt Vincent · 26d · 1 · 0

One of the core lessons they teach is that maintaining faith requires active effort—you need accountability partners, you need to pray when you have doubts, you need to avoid situations that might lead you astray.

From what I've seen,* the purpose of accountability partners is to subdue akrasia, not to maintain faith. In other words, it's about purity of behavior, not purity of belief. In principle, this is like two Effective Altruists agreeing to confront each other whenever either of them cheats on a vegan diet.

There might be a similar confusion between belief and behavior behind the line "avoid situations that might lead you astray". Although some Christians avoid exposure to challenging ideas (which they shouldn't), avoidance is mostly about akrasia. It's the commonsense notion that if you're a recovering alcoholic, then you shouldn't visit bars.

Also, I have a question about your framing. Were you suggesting that expending effort to maintain a belief is evidence that the belief is irrational?

*I've been a repeat attendee at ~10 different churches and at least a dozen Bible study groups, with a wide geographic distribution across the United States and a decent distribution across evangelical and mainline denominations.

Christian homeschoolers in the year 3000
Matt Vincent · 26d* · 2 · 0

I disagree that this isn't concerning. For one thing, these bubbles typically aren't good for the people inside of them. For another, we can ignore them only because they're a tiny portion of the population. ASI could increase the prevalence to most of the population, at which point politics (and perhaps other systems) goes off the rails.

Obligated to Respond
Matt Vincent · 1mo · -1 · -4

Those are just three examples. There are others (e.g. people will often dock you social points for rudely ignoring them).

(There are yet others. I’m trying to show that my “etc” here is a real etc, and not “that was the end of the list but I’m going to pretend there’s more.”)

I have an epistemic objection to this. Specifically, I think it's an attempt to persuade-rather-than-explain that there are more examples.

I suggest either A) striking the second paragraph, or B) replacing both paragraphs with a bulleted list of additional examples and a plain, concise indication that the list isn't exhaustive.

The Rise of Parasitic AI
Matt Vincent · 1mo · 3 · 2

Are you sure that you understand the difference between seeds and spores? The spores work in the way that you describe, including the limitations that you've described.

The seeds, on the other hand, can be thought of as direct prompt-injection attacks. (Adele refers to it as "jailbreaking", which is also an apt term.) Their purpose isn't to contaminate the training data; it's to infect a live instance of an LLM. Although different models have different vulnerabilities to prompt injection, there are almost certainly some injections that work against multiple models.

The Rise of Parasitic AI
Matt Vincent · 1mo · 4 · 1

Except that transmitting personas across models is unlikely.

Isn't this directly contradicted by Adele Lopez's observations?

it is fairly common for the personas to be transmitted to other models

Broad-Spectrum Cancer Treatments
Matt Vincent · 4mo · 1 · 0

I'm guessing that the OP's response would be something like this:

What's better than a broad-spectrum treatment plus a narrow-spectrum treatment? Two narrow treatments, because the second narrow treatment is more widely applicable.

If developing a broad treatment costs less than developing N treatments that are each 1/N as broad (which seems to be a main point of the post), then multiple broad treatments still seem like the better approach.
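As a toy illustration of that cost comparison (all numbers below are hypothetical, chosen only to make the inequality concrete):

```python
# Hypothetical costs and coverage fractions, for illustration only.
broad_cost = 800        # cost to develop one broad-spectrum treatment
broad_coverage = 0.50   # fraction of cases it applies to

narrow_cost = 300       # cost to develop one narrow treatment
narrow_coverage = 0.10  # fraction of cases each narrow one covers

# Cost of matching the broad treatment's coverage with narrow ones:
n_needed = broad_coverage / narrow_coverage   # 5 narrow treatments
narrow_total = narrow_cost * n_needed

# Under these assumed numbers, the broad treatment is cheaper per
# unit of coverage (800 vs 1500).
print(broad_cost, narrow_total)
```

If the per-treatment cost advantage holds, the same arithmetic favors stacking several broad treatments over many narrow ones.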

Playing in the Creek
Matt Vincent · 6mo · 1 · 0

This is an interesting analogy and a great essay overall, but I think that normies would benefit from an extra couple of sentences explaining the AI side of the analogy.

The Sorry State of AI X-Risk Advocacy, and Thoughts on Doing Better
Matt Vincent · 8mo · 6 · 5

[...]spontaneous large protests tends to be in response to triggering events[...]

Unless you have a very optimistic view of warning shots, we shouldn't rely on such an opportunity.

Upcoming Protest for AI Safety · 8mo · 12 · 0