I recently gave a talk at Chicago Ideas Week on adapting Turing Tests to have better, less mindkill-y arguments, and this is the précis for folks who would prefer not to sit through the video (which is available here).

Conventional Turing Tests check whether a programmer can build a convincing facsimile of a human conversationalist. The test has turned out to reveal less about machine intelligence than about human intelligence. (Anger is really easy to fake, since fights can end up a little more Markov chain-y, where you only need to reply to the most recent rejoinder and can ignore what came before.) Since normal Turing Tests made us think more about our model of human conversation, economist Bryan Caplan came up with a way to use them to make us think more usefully about our models of our enemies.

After Paul Krugman disparaged Caplan's brand of libertarian economics, Caplan challenged him to an ideological Turing Test, where both players would be human, but would be trying to accurately imitate each other. Caplan and Krugman would each answer questions about their true beliefs honestly, and then would fill out the questionnaire again in persona inimici - trying to guess the answers given by the other side. Caplan was willing to bet that he understood Krugman's position well enough to mimic it, but Krugman would be easily spotted as a fake!Caplan.

Krugman didn't take him up on the offer, but I've run a couple iterations of the test for my religion/philosophy blog. The first year, some of the most interesting results were the proxy variables people were using, which weren't as strong indicators as the judges thought. (One Catholic coasted through to victory as a faux atheist, since many of the atheist judges thought there was no way a Christian would appreciate the webcomic SMBC.)

The trouble was, the Christians did a lot better, since it turned out I had written boring, easy-to-guess questions for the true and faux atheists. The second year, I wrote weirder questions, and the answers were a lot more diverse and surprising (and a number of the atheist participants called out each other as fakes or just plain wrong, since we'd gotten past the shallow questions from year one, and there's a lot of philosophical diversity within atheism).

The exercise made people get curious about what it was their opponents actually thought and why. It helped people spot incorrect stereotypes of an opposing side and fault lines they'd been ignoring within their own. Personally (and according to other participants), it helped me argue less antagonistically. Instead of just trying to find enough of a weak point to discomfit my opponent, I was trying to build up a model of how they thought, and I needed their help to do it.

Taking a calm, inquisitive look at an opponent's position might teach me that my position is wrong, or has a gap I need to investigate. But even if my opponent is just as wrong as zie seemed, there's still a benefit to me. Having a really detailed, accurate model of zer position may help me show them why it's wrong, since now I can see exactly where it rasps against reality. And even if my conversation isn't helpful to them, it's interesting for me to see what they were missing. I may be correct in this particular argument, but the odds are good that I share the rationalist weak-point that is keeping them from noticing the error. I'd like to be able to see it more clearly so I can try to spot it in my own thought. (Think of this as the shift from "How the hell can you be so dumb?!" to "How the hell can you be so dumb?").

When I get angry, I'm satisfied when I beat my interlocutor.  When I get curious, I'm only satisfied when I learn something new.


Documenting my mental processes after reading this post (disclaimer: human introspection sucks, and mine is probably no exception):

  1. Huh, this is one of the better versions of the Devil's advocate game I've ever encountered... Immediate upvote.

  2. Huh, the poster analyzed their mistakes, learned from them and improved the challenge. Too bad I only have one upvote.

  3. Clicking on the links... WTF, this is the girl who converted to Christianity (Catholicism? Really? Out of all the options available?) from Atheism a year or so ago... Anything she posts deserves a downvote...

  4. Stop! What the hell am I doing? This is, like, falling prey to several biases at once. At least I should notice that I am confused. Unable to reconcile the "obviously dumb" conversion move with this quite clever post.

  5. Wait, this is the substance of her post, to begin with!

  6. Deciding to definitely keep the upvote and reserve judgment until after looking through the linked posts.

[-][anonymous]11y220

Even God can quote Bayes when it suits him.

Still upvoted for raw cleverness, though.

Bayes was a priest, after all. Now divine quote of gay Turing would be a different feat altogether.

8fubarobfusco11y
... or polyamorous agnostic Russell, maybe? (Also, Bayes was a Presbyterian minister — not a priest, which (in England) would imply Catholic or Anglican. It was the family trade; his father was also a minister.)
2Qiaochu_Yuan11y
I'm not sure I know how to parse this.
5wedrifid11y
Showing results for: Divine quotation of gay Turing * God quoting Turing would be more remarkable than God quoting Bayes because the latter was a priest (and so already affiliated with God) while the former is notoriously homosexual (while God is allegedly violently homophobic).
0Eugine_Nier11y
So? God is still willing to work with (and through) sinners.
1wedrifid11y
It isn't my position. Merely one I translated into well formed English. Any questions should be directed to the original source.
0Qiaochu_Yuan11y
The word I had trouble parsing was "of." I think ESRogs' hypothesis is probably correct, though.
2wedrifid11y
That seems highly unlikely: it would make prase's comment not fit the context. I think you have been misled.
0Qiaochu_Yuan11y
Oh, hmm. I got confused about what ESRogs' hypothesis actually implied. Never mind. Anyway, I agree with your interpretation but still think the original phrasing was quite confusing.
2wedrifid11y
Very much so. Without context the intended meaning would definitely not be the top of the hypothesis list.
2prase11y
Wedrifid's interpretation is the intended one. I agree that the chosen formulation wasn't particularly clear.
0ESRogs11y
I think that should be read as 'by' rather than 'of'.
2shminux11y
You mean the Devil, surely.

Potato potato.

6pedanterrific11y
Huh, it works even better in text with undifferentiated spelling. I'll have to remember that one.
-11Eugine_Nier11y

Ha!

I think the post is excellent, and I appreciated shminux's sharing his mental walkthrough.

On that same front, I find the Never-Trust-A-[Fill-in-the-blank] idea just bad. The fact that someone's wrong on something significant does not mean they are wrong on everything. This goes the other way; field experts often believe they have similar expertise on everything, and they don't.

One quibble with the OP: I don't think a computer can pass a Turing Test, and I don't think it's close. The main issues with some past tests are that some of the humans don't try hard to be human; there should be a reward for a human who gets called a human in those tests.

Finally, I no longer understand the divide between Discuss and Main. If this isn't Main-worthy, I don't get it. If we're making Main something different... what is it?

7palladias11y
There is a reward for Most Human Human (and a book by that same title I cite from in the longer talk I gave linked at the top). The computers can pass sometimes, and the author makes basically the same argument as you do -- the humans aren't trying hard enough to steer the conversation to hard topics.
6ESRogs11y
The difference between Discussion and Main is that Main is hard to find. If it's in Main and not Recently Promoted, I don't know how you're supposed to ever see it -- is everybody else using RSS feeds or something?
6John_Maxwell11y
I look at the sidebar on the right or visit http://lesswrong.com/r/all/recentposts/
0palladias11y
Yeah, I use an RSS for Main.
6[anonymous]11y
It remains evidence, however; to ignore such is the fallacy of gray.
7Qiaochu_Yuan11y
Yes, but it's almost certainly evidence that people on LW overweight relative to other evidence because atheism is an excessively salient feature of the local memeplex.
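Qiaochu's point about overweighting can be made concrete in odds form: Bayes' rule says posterior odds = prior odds × likelihood ratio, so treating a modest likelihood ratio as if it were a huge one yields an overconfident verdict. A toy sketch (the numbers here are purely illustrative, not drawn from the actual test data):

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

# Suppose an SMBC reference is twice as likely from a real atheist (LR = 2),
# but a judge treats it as near-conclusive (as if LR = 20).
prior = 1.0                                 # even prior odds: 1:1
calibrated = posterior_odds(prior, 2)       # 2:1 odds
overweighted = posterior_odds(prior, 20)    # 20:1 odds

p_calibrated = calibrated / (1 + calibrated)        # ~0.67 confidence
p_overweighted = overweighted / (1 + overweighted)  # ~0.95 confidence
```

The evidence still counts either way; the failure mode is only in the exaggerated likelihood ratio.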
5Eugine_Nier11y
Interesting, I was under the impression that most people around here were fairly good about not doing this. However, it's possible I haven't been paying attention recently.
-2Epiphany11y
Thanks for being so real. That was refreshing.

I'm having trouble determining the best strategy in these kinds of games, but I'm worried that it isn't quite "actually sound like a member of the group you're pretending to be."

For example, a liberal Christian complained that her (honest!) Christian answer did very poorly, because people associated liberalism with atheism. This suggests that the best strategy isn't necessarily to honestly list what you believe, but to list what you think a typical member of the group involved believes.

And if (for example) atheists know that the average Christian is writing about what they think the average Christian believes, then atheists in their fake entries will also write about what they think the average Christian believes.

Yes, if overdone, this is a sign of dishonesty; for example, anyone who was too stereotypical ("Yeah, I get up each day, read a selection from the Bible, check the Pope's Twitter account, then go to church, then go bomb an abortion clinic...") would be obviously fake. So the best strategy seems to be to write something kind of stereotypical, but to depart from stereotype in a few places so as to signal you're talking about a real person rather than a straw man.

But this str... (read more)

The atheists and Christians were told to be honest when writing their own responses. So they shouldn't have been trying to game it in this way.

For year three, I've been thinking of doing just this:

I'd be interested in seeing differences between this test and one in which, say, Christians were just asked to discuss their opinions on some topics without it being part of a Turing Test, and then atheists were asked to fake Christian opinions on those same topics

On the topic of marriage: people conceive of the institution as having really different purposes, but usually get bogged down in the question of what laws should exist. I thought the question of "How should a couple decide whether to get married?" would provoke interesting responses.

The atheists and Christians were told to be honest when writing their own responses. So they shouldn't have been trying to game it in this way.

"Honest" leaves a lot of wiggle room. If I were trying to write my honest atheist entry, what do I emphasize? That I hate scholastic philosophy and think religion set ethics back five hundred years? Or how I love C.S. Lewis and G.K. Chesterton and find many religious works to be among the most sublime creations of humankind? Both would be "honest".

Even if someone genuinely sets out not to present themselves at all, I still would expect presentation to be their main concern. There's a certain class of things which are impossible to do naturally. For example, if you try to count your natural respiratory rate, you will fail miserably; the fact that you're thinking about your breath immediately shifts it to consciously deciding what it is going to be. In my case, it makes it slower than normal. I can try to then consciously adjust by speeding it up, but since I don't know how much to speed it up, attempting to breathe naturally is basically just me trying to fake my natural breathing rate, probably badly.

I think self-presentation attempts of this sort raise some of the same problems.

5ChristianKl11y
It depends how you define poorly. Her answer demonstrated something useful about inaccurate stereotypes of Christianity. If the goal of the whole exercise is to convince others that Christianity is right, then her answer might be good because it teaches people about their misconceptions about Christianity.
4Paul Crowley11y
Yes. If you're faking it, the measure is how many people you fool. If you're guessing, the measure is how many you get right. But if you're writing honestly, there's no winning or losing; just write honestly, and if people guess you wrong more fool them.
2ChristianKl11y
I don't think you understand the point of the game. The goal of the game isn't to guess the teacher's password. palladias converted to Catholicism after running that game. That's a win for the Catholics in the game who honestly explained Catholicism to her. One of the Catholics wrote that he likes SMBC. That's one of the examples that stuck out to palladias. Even when it reduced the judging scores of the answer, I think that answer likely increased the chances of "turning" palladias.
3Paul Crowley11y
Ah, so you're saying that the goal of the honest participant is for the guessers to distinguish correctly, showing that their counterparts have a poor understanding of their beliefs?
1Ishaan11y
Wait, did that actually happen? Is there a place where I can read about how and why?
0Kindly11y
Your argument is too general: it applies to any game. If I play chess against a Catholic, who deliberately throws the game in order to make a clever argument that succeeds in converting me to Catholicism, that counts as a win of some sort... but not a win in chess.
1ChristianKl11y
I think that this game is inherently about showing that your ideology is better than the one of the people on the other side. Chess is generally not played with that intent.
0BlazeOrangeDeer11y
I think "poorly" in this case meant that it wasn't rated very believable by the judges.
0ChristianKl11y
Yes, I think that's a bad definition of poorly. The goal of the game isn't only to get high ratings from the judges but to ultimately show people that your beliefs are better than the beliefs of the other side.
2[anonymous]11y
I had read this, when it was originally posted. And then, I was referred to this, which was also written by you: http://slatestarcodex.com/2013/03/03/reactionary-philosophy-in-an-enormous-planet-sized-nutshell/ Which was sufficiently good at espousing Reactionary philosophy that I was STARTLED when I got to the end, because I had forgotten that you were only pretending to be Reactionary for the sake of an ideological Turing Test. You were well on your way to convincing me to take a hard look at my own progressive ideals and find out why I hadn't seen all of these obvious flaws, and then you said: Despite the fact that you had literally said, at the beginning: I seem to have forgotten that while reading the middle... So erm, yes, I understand that you don't hold those ideas, and I'm not angry at you. But I do apparently fail at reading comprehension. And at having justifications for my ideals. But reading this IN LIGHT of you saying a short time ago That's just weird. I'm having a hard time visualizing room for there to even be a better strategy than what you just did. It's rather embarrassing to admit that I failed at reading comprehension, but the contrast seems too great to not mention.
1shminux11y
Yvain might be a brilliant doctor, now or some day, but what he writes is already genius. If only he realized that he could help more people and make more money if he seriously considered this as a career. The case of an altruistic lawyer volunteering in a soup kitchen comes to mind.
4gjm11y
It isn't at all obvious to me that he could help more people and make more money by making his career in writing. (I mean, obviously it's possible that he would, but you can't mean that because it's pretty much always true for any pair of careers.) Just what sort of writing career do you envisage for him that's more lucrative and more world-enhancing than medicine? (For the avoidance of doubt: I agree that his writing is excellent.)
3shminux11y
Actually, I take it back. It's not a dichotomy. He can be both and he will probably be a better writer if he is also a practicing psychiatrist. He might decide to write professionally at some point, though.
1Eugine_Nier11y
One thing your analysis neglected is how the judges will adjust their strategy in response to these developments.
0DanArmak11y
In other words, the test should have blinded the participants.
0Luke_A_Somers11y
One of my only two errors on the Christian side of year 2 was to suspect that a stereotypical Christian was a faker who was aiming for dead center. The other was an atheist who nailed the periphery. So, the strategy of lying or selectively choosing topics to seem more typical within your group would not have worked on me. I do think the whole 'I went to seminary' thing might best in the future be ruled out. It's one thing to create a fictional persona. It's another to give them a position of authority.
2Scott Alexander11y
I don't even think it's about authority. Another person talked about how Christianity helped them through their drug addiction. Because there really are Christians who have been helped through drug addictions, but most contestants would have respected the spirit of the test too much to try the somewhat different exercise of making up a completely fake personality with a fake life history, this provided strong evidence of real Christianity.
0DanArmak11y
Isn't the spirit of the test to be as convincing as possible? Imagining and imitating a fake persona in detail is exactly what the test asks for.
0metatroll11y
As Angleton once told me, there's a W.C.Fields made every minute.

I seem to be simultaneously freakishly good and bad at this game - I have, on multiple occasions and for multiple mappings of "green" and "blue," been accused of being a green pretending to be a blue (I am in fact blue,) and somehow I regularly find myself discussing the finer shades of green with greens who assume I am green. (It is hard for me to think of things that are funner than this.)

On Will Newsome's IRC channel someone mentioned the idea that you could totally automate the ITT into a mass-league game with elo ratings and everything (assuming there was some way to verify true beliefs at the beginning.) Make it happen, somebody.
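The Elo idea sketched above is straightforward to make concrete: treat each judged entry as a match between an imitator and a judge, where the imitator "wins" if the judge misclassifies the entry. A minimal sketch of the rating update under standard Elo (K = 32; function names are my own, purely illustrative):

```python
def expected_score(r_a, r_b):
    """Elo model: probability that player A beats player B."""
    return 1 / (1 + 10 ** ((r_b - r_a) / 400))

def update_ratings(r_imitator, r_judge, fooled, k=32):
    """One round: the imitator wins if the judge misclassified the
    entry (fooled=True); otherwise the judge wins. Returns new ratings."""
    e = expected_score(r_imitator, r_judge)
    score = 1.0 if fooled else 0.0
    new_imitator = r_imitator + k * (score - e)
    new_judge = r_judge + k * ((1 - score) - (1 - e))
    return new_imitator, new_judge

# At equal ratings a successful deception shifts each rating by k/2 = 16:
new_imit, new_judge = update_ratings(1500, 1500, fooled=True)
# new_imit = 1516.0, new_judge = 1484.0
```

This only handles the guessing half of the game; verifying participants' true beliefs at registration (the hard part mentioned above) is left entirely open.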

8marchdown11y
Ooh, this would be so great!
5Viliam_Bur11y
Measuring the outcome is good, but I see a problem with the original data. How do you know who is really Green and who is really Blue? By their self-reports, right? Well, I see a problem here. What if someone insists on self-describing as a Blue, but most Blues disagree with him and say he is completely confused about what Blue-ness is? -- I know the definition of Blue is not exact, but it at least roughly corresponds to something in the idea-space, and a person can get it wrong and self-identify as a Blue despite being somewhere else. (Perhaps somewhere beyond both typical Blue and Green areas, so the person self-identifies as a Blue simply because they use Blue as a synonym for non-Green.) -- If other people fail to recognize such person as a Blue, is it really their fault? The question is not exactly "whom to blame?", but rather "if we use noisy inputs and then get noisy outputs, does it tell us something beyond the fact that there was a noise in input?" (To be specific, I remember someone in the ideological test saying that they self-identify as both Christian and Atheist. And it was 1 person in 13, so that has a non-trivial impact on the results. I don't think that majority of either Christians or Atheists would agree that an opinion like this is a valid representation of their opinions. So how exactly should guessing or not guessing this person's self-description influence the ratings? And should it influence the ratings if the same person would be forced to choose only one of the descriptions?)
7FeepingCreature11y
"Christ was not the Son of God, because there is no God, but we should follow his teachings anyways"?
2Eugine_Nier11y
Maybe Christianity is hermeneutically true.
1Viliam_Bur11y
I don't remember, but most likely something like this. (Maybe with some "cosmic law" or "cosmic energy" added for better effect.) Now this completely does not represent the Christian viewpoint (we should follow Christ precisely because he told us what God wants) or the atheist viewpoint (even if Christ was a good and smart person, it is unlikely he got everything right; and even if he got something right, we can discover and prove it independently).
2DanArmak11y
Sometimes there are many tinges of Blues. And for almost every tinge you pick, most other Blues will claim people of that tinge are not really Blue. (Religious and ideological movements get like this a lot.) But Greens have no problem classifying people as Blue and non-Blue, so it's not a wholly useless concept.
2Oligopsony11y
Well, that depends on what the test is testing for. If it's about metaphysics, Atheist, if it's about practice, Christian.
7Will_Newsome11y
puts on Hanson hat Atheism/theism isn't about metaphysics.
2ChristianKl11y
There's a 50% chance that God exists?
2orthonormal11y
Forget trying to use people's actual beliefs anywhere in the process; it's simpler just to let people play the ITT for a lot of disjoint positions, so that they only get the bonus for their actual beliefs at most once. This mildly penalizes people with extremely idiosyncratic beliefs, but such people wouldn't even be able to play the current ITT.
2Will_Newsome11y
The somebody could only be a few programmers hired/recruited by CFAR working with direction from Leah. Basically Leah would have to get some people Anna respects to agree the idea is good and then talk to Anna about it. But presumably Anna and CFAR generally are really busy, so, it probably won't go anywhere in any case.
4Pavitra11y
Not really relevant here, but I only just now got the pun in CFAR's acronym.
1[anonymous]9y
I'm from the future. Thanks for telling me this. I hadn't realized this despite seeing the name for years.

A concern regarding this kind of test when applied to groups (Christians vs Atheists, for instance) rather than individuals is that one umbrella term may take more views than another, making the guessing game more/less tricky.

Nevertheless, this is a neat idea, particularly for particular people rather than groups as a whole.

3Manfred11y
Hm. I think this actually also applies to individuals, doing it with groups just has enough statistical power to beat us over the head with it. Which is to say, it's a good exercise but not a very good competition.

Considering the different sizes of the targets, I'm not sure what all this means. Like, while there are hundreds of denominations of Christianity (though that seriously overrepresented Catholicism), atheism is barely more specific than 'Other' in terms of moral foundations and systems.

As long as you're comparing groups with different degrees of dispersion, it is going to be trickier for one side than the other. The more degrees of definition, the more opportunities to miss one as an outsider and slip up.

I find it very plausible that Christians are better able to pretend to be atheists than vice versa. But what follows from that?

Caplan claimed in his original piece:

the ability to pass ideological Turing tests—to state opposing views as clearly and persuasively as their proponents—is a genuine symptom of objectivity and wisdom.

Caplan gives little in the way of argument in support of this claim, and I'm not at all sure that it's true. "Genuine symptom of objectivity and wisdom", really? My objections follow.

First, there's only one way to be rig... (read more)

2Salivanth11y
"If you are Christian, then you probably know the Bible in detail, you are probably familiar with a range of theological and apologetic texts" I'll admit I don't have any statistics here, but from what I've seen heard, both first-hand and second-hand, Christians tend to be quite poor on average at knowing the Bible. I've never heard any evidence suggesting the average Christian has a detailed knowledge of the contents of the Bible, even if the kind of Christians who like to argue Christianity are more informed than most. (Similarly, argumentative atheists tend to have a better knowledge of the atheistic arguments than the average atheist.)
0ygert11y
But it's exactly the type that likes to argue religion that participates in such a test. The test is comparing argumentative atheists to argumentative theists. Non-argumentative atheists and non-argumentative theists are simply not involved. It is hard to test what non-argumentative folk believe, simply by the fact that they are not argumentative, and thus very unlikely to look at such tests.
0orthonormal11y
I instantly did a double-take at this statement. It depends a lot on context. I'd find it likely that the Christian readers of Patheos blogs are better at the Ideological Turing Test than the atheist readers of Patheos blogs. However, I'd find it incredibly unlikely if the samples were drawn from, say, all American Christians and all American atheists. (The typical Christian in America has listened to fewer atheists about atheism than vice versa.)
1fubarobfusco11y
Sure — if only for the same reason that the typical left-handed tennis player has played with more right-handed tennis players than vice versa. There are a lot more Christians!
0orthonormal11y
Yes, that's all I was saying.
0Eugine_Nier11y
That Blues understand Green arguments but aren't persuaded by them (presumably because they have counterarguments), whereas Greens don't understand Blue arguments and this makes it unlikely they have counterarguments. Now let's look at your three objections: near as I can tell, your first objection amounts to "sometimes the people defending the incorrect position are heterogeneous, this gives them a large advantage in the test", and your third objection amounts to "sometimes the people defending the incorrect position are homogeneous, this gives them a large advantage in the test". Now let's look at your second objection: much as it may seem that way, your opponents are not evil mutants whose position has no logic to it whatsoever; most positions actually held by humans, especially by intelligent humans, have a certain logic to them. (And if your opponents' position really has no logic to it beyond saying anything plausible-sounding that backs up their conclusion, that's very easy to imitate.) Thus, the two positions have different logic to them and it will be hard for a person only familiar with one of those logics to imitate the other. On the other hand, if someone is familiar with the logic of both positions A and B, the fact that he nevertheless holds position A is evidence that A is in fact correct.
5garethrees11y
This is a restatement of the hypothesis under discussion. (That inability to imitate convincingly is caused by lack of understanding.) You've failed to imitate my position. My third objection is about irrelevant detail, not homogeneity. (Perhaps you can suggest a better way I could have put it?) Again, you've failed to imitate my position. For concreteness, let's take Christopher Monckton as an example. It's not that I think he's saying "anything plausible-sounding". His arguments have a logical structure which is imitable but they are embedded in a rhetorical structure that I would find very hard to imitate convincingly due to lack of practice. (I guess you could characterize this as a form of irrelevant detail and merge it with my objection 3 but I think these two sources of irrelevant detail are sufficiently different in origin and aim to be worth separating.)
-2Eugine_Nier11y
I'm not sure where you're drawing the line between logical and rhetorical structure. The most obvious rhetorical structure is that he acts like he alieves his position in addition to believing it.
-3wedrifid11y
On the other hand, any Christian who pretends to be an atheist better than an atheist isn't a very good Christian. By doing so they are violating the teachings of their God.

Turing Tests

It's called the imitation game. The Turing Test is the imitation game when one player is a machine and the other player is a person.

Some of the response posts talk about "attractiveness scores", but I didn't find those in the data summaries. Did those ever happen? I think it'd be more interesting if people wrote their genuinely best arguments for each side, and we measured how much the reader is persuaded, instead of many participants (as far as I can tell) trying to pretend that they're average and are persuaded by average arguments.

Of course, it's already pretty interesting as-is, and it's nice that someone actually tried out the exercise!

ETA: the other thing is that I susp... (read more)

What's a good term for "being able to pass an ideological Turing test"? (Being able to pass an ITT is related to being able to argue both sides of a debate, being able to accurately explain your opponent's position, being able to summarize the strongest counterargument to your position, etc.)

Following the original analogy, is there a term for "a machine that's able to pass a Turing test"? My googling didn't turn up anything. But if there was ("a machine is called Turing-(blank) if it can pass a Turing test"), then it seems we could adapt it fairly easily to the ITT: someone is ideologically Turing-(blank) if they can pass an ITT.

Any suggestions to fill in the blank?

1Dan_Moore11y
passable?
0Michael Wiebe11y
Turing-capable?

I think the ITT may test more for personal experience than ability to model. For instance, how well would this group of Christians and atheists do trying to imitate Muslims, Buddhists, Hindus, Shintoists, Zoroastrians, and adherents of other religions even less well-known in the Western world? Most English speakers have some familiarity with a branch of Christianity. How many have explored Shamanism (I haven't)? Many atheists have gone through the motions of Christianity or some other religion in the past and have a comparably easier time writing about how that r... (read more)

2Randy_M11y
"I think the ITT may test more for personal experience than ability to model." Well, the latter is greatly enabled by the former.

I want to highlight the use of the above approach to argument for resolving mundane conflicts as a Bayesian.

  • Step 1: Run your Turing Test on the conflict. (This does need to run on both sides; it is your own missing info we will focus on.)
  • Step 2: Compare your results to their actual model. Highlight the individual differences.
  • Step 3: Item by item, quiz them for info on the differences. Why choose this? What alternatives did you consider? Why not take those alternatives? The goal is to identify their goals, values and beliefs that created their mode
... (read more)

BTW, I accidentally passed the gender Turing Test (which ISTR pre-dated and inspired the actual Turing Test). :-)

I think we've had discussions before on argumentative techniques, and usually the first step is being able to state the other person's argument as convincingly as they can - pass a Turing Test on imitating them.

I've read this three times now and am still not sure how to interpret:

I may be correct in this particular argument, but the odds are good that I share the rationalist weak-point that is keeping them from noticing the error.

I have concluded that it's vague, for the following reasons:

A. I don't know if "this particular argument" refers to your argument for atheism, for Christianity, or your arguments to convince the opponents that you are an atheist or a Christian.

B. Your phrase "the rationalist weak-point" is unspecified. At first... (read more)

5palladias11y
'This particular argument' was meant to be unspecific since I was talking about an aspect of Ideological Turing Tests (or fights generally) that doesn't hinge on what you're fighting about. If you think your interlocutor is obviously wrong, and there's nothing for you to learn by trying to model him more accurately, you may be wrong about that! The flaw in his thinking that's causing him to ignore data is probably native to you as well. Putting in the work to spot it and to observe what defensive strategies he's using to avoid spotting it may cause a queasy feeling of recognition that you used the same kinds of language/flinches/etc in a different recent argument, and now you should go back and check your data.
0FeepingCreature11y
I think the flaw is that humans copy from the culture they immerse themselves in. Do you think you would have come to convert to Catholicism without engaging with Catholicism over a large time span? Assuming this, shouldn't I avoid confronting/studying any religion in depth, that I'm not already immunized to? How is a rationalist to act when every intense engagement with a belief, true or not, makes it more likely they'll adopt that belief? PS: unrelatedly, would you let a FAI convince you that Catholicism is false? I think I'd let a FAI convince me that Catholicism is true, assuming it was built by an uninterested third party (neither the Catholic church nor, say, Dawkins). Should MIRI hire religious people to assure people that their FAI was not built unduly biased towards atheism?
3Larks11y
If it's 'unduly' anything it's not FAI.
0FeepingCreature11y
Of course, but people may not believe out of hand that a superintelligence built by atheists saying atheism is correct is not just parroting its creators. Might be important in the take-off phase to have that extra bit of public trust.
1Larks11y
Once you have FAI, you're set. There is nothing left you need to do. If something needs to be done, the FAI will know better than you what has to be done and how to do it. If it turns out it should have been written by Christians, it will tell some Christians how to write an FAI and make sure they do it correctly. Worrying about what to do after* running the program is like taking a cup of water with you as you flee your burning house, so that when the fire department arrive you can help out. *except those things the FAI judges it would be good for you to need to do, which are not relevant here.

I like this a lot. It's often said by conservative commentators that conservatives completely understand liberals, but liberals do not understand conservatives at all. I think there's truth to that, I would love to see some experiments like this, hehehe...

0[anonymous]9y
This reminds me of Yvain2's post I like to listen to various commentators and try to guess whether they are conservative or liberal. Recently I heard the charismatic and articulate Chris Uhlmann on television, who sounds like a conservative. Apparently, he's a political editor for the ABC (and presumably nonpartisan), so I wonder what that says about my ability to understand either side's ideologues if I mislabel that cluster of beliefs.