Thank you for your cooperation and understanding. Don't worry, there won't be future posts like this, so you don't have to delete my LessWrong account, and anyway I could make another, and another.

But since you've dared to read this far:

Credibility. Should you maximize it, or minimize it? Have I made an error?


Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?

307 comments

I would normally visit even a Score:-22 post with 200+ comments, because I've found that such cases indicate a particularly awful post that may be worth opening just to hunt for the few most excellent clarifications or rebuttals it elicited.

A warning to others: my heuristic was wrong in this case. Few comments here even hint at what the hell is going on, and those suggested nothing more interesting than some extremely unlikely theological or parapsychological beliefs that Will might have latched onto and desired to "protect" us from. You could find more interesting and plausible basilisks in Lovecraft's stories or Stross' Laundry novels.

Thanks for the info. I opened this post for the same reason as you, and now that I've read this I'm going to close it.
I'm so stupid: I read your comment, I saw the comment karma, I saw the article karma, but I read the discussion anyway. Now I will never get those 20 minutes of my life back, and if I happen to die exactly 20 minutes before Omega invents immortality, it is all my own stupid fault. For people like me, this is what this whole article is about: Will Newsome trying to be interesting, without ever offering anything of substance. If you read the comments trying to find more information, there isn't any!

Now I will never get those 20 minutes of my life back, and if I happen to die exactly 20 minutes before Omega invents immortality, it is all my own stupid fault.

Not wasting the 20 minutes wouldn't have helped you survive till Omega invented immortality. (You didn't shorten your life in an absolute temporal sense, you just wasted some of the middle.)

Not if he was the major ingredient in inventing immortality...




Must have had a good reason. It's a pity we mere mortals cannot fathom that reason, but we should at least recognise that it's the reasoning of God and so our being unable to fathom it is a fault with our meat brains, not with the reasoning.

At least be more entertaining. This post is boring. And you exist for nothing but to make me chuckle at your quaint ideas.
How come both the parent and the grandparent are upvoted this much?
Because they (and one more ancestor above) are awesome, lighthearted, and playfully satirical. I would upvote them again.
It's for trying to be funny and intellectually disciplined at the same time that Will stays here, asking for a poll about our mental model of him.
Because God is a troll, that's why.

Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information.

So? You say crazy (and wrong) shit a lot and have no credibility.

Whence our disagreement, if one exists?

Try explaining your reasoning and we might see. The whole "I have mysterious reasons why this crazy idea is true" thing is just annoying. (Whether done by you or Eliezer.)

I don't know about "no credibility", Will knew some.

Is anyone else finding themselves in the awkward position of wondering if they are a child among adults who may or may not be using innuendo? And thinking you understand a few of the hints, but aren't sure you do? To summarize my current state: Will Newsome is hitting some of my "take him seriously" heuristics pretty hard. At their center lies the fact that he is taken far more seriously than most average posters think he should be, by some pretty big names who have been with this Eliezer-centred rationality project since its start and have accrued quite a reputation for excellent judgement. He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

I have several theories on this which I have constructed over the past few months but don't feel comfortable sharing right here, because I've stumbled on several caches of dangerous thinking. I have to keep squishing some ugh fields and bolstering others when exploring these ideas. Yet I also just can't come up and ask the right people to check my reasoning on any of them, their time is valuable and I'm not in their social circles anyway. I find myself blinking in confusion unsure if I'm bei... (read more)

He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

To defend the repute of the visiting fellows program, please note that his crackpot score has skyrocketed since that time and he would almost certainly not have been accepted had he applied then as he is today.

Also, I think his crackpot score skyrocketed mostly after he left - so if it was something we did, it was a delayed effect.

Also worth noting is that I was made a Fellow sort of off the cuff without any real input from anyone in the organization. Anna's absence led to much disorganization in the program. And yes, when I first volunteered I was more or less a typical LWer, with one strange thing being my high school drop out status.

I get the impression that he's often more concerned with signaling interestingness, intelligence, and contrarianism than figuring out what's true.

Note: I also get that impression from Michael Vassar. But I have lots of respect for the current Singularity Institute director.

I don't get that impression from Michael Vassar, possibly because I've talked with him in person. Asking repeatedly for examples makes it fairly possible to find out what he means.

I have no such hope with Will Newsome.

I've talked with Michael Vassar in person, and also found him much more comprehensible than by brief cryptic textual snippet. Have you talked with Will Newsome in person? I haven't, but every time I engage him personally in comments, etc., his vagueness resolves into something a lot more coherent (even if it's not something I necessarily agree with).

Will is pretty weird and I don't believe the way he thinks is, ya' know, normative. But I still find his writing to be extremely valuable relative to most Less Wrong commenters because for the most part Less Wrong commenters come in three different flavors: vanilla (what I would say if I weren't as smart or 3-4 years less educated), chocolate (what I would say now) and strawberry (what I would say if I were smarter or 3-4 years more educated). Will is lobster ice cream with rainbow jimmies. I will never think like him and I wouldn't want to. But I'm glad there is someone extending hypotheses as far as they will go and never looking back. I find novel explorations of hypothesis space to be both useful and interesting. He is pursuing a train of thought I don't have a lot of time for and no reason to prioritize. But I'm still looking forward to finding out where it ends up.

Will is like a musician on a hallucinogen. You wouldn't want to have his brain and you probably don't trust his judgment. But before he burns out at 27 he's gonna produce some really interesting ideas, some of which will simply be absurd but a few of which might have real staying power and influence a generation.

What is your position on Will Newsome?

I frequently find Will's contributions obscurantist.

In general, I find obscurantism at best tedious, and more often actively upsetting, so I mostly ignore it when I encounter it. Occasionally I engage with it, in a spirit of personal social training.

That said, I accept that one reader's obscurantism is another reader's appropriate level of indirection. If it's valuable to other people, great... the cost to me is low.

At this point the complaining about it by various frustrated people has cost me more than the behavior itself, by about an order of magnitude.

The same word came to mind, and it's common to his history of interactions, so seeing it here means I ascribe it to him rather than the logic of whatever underlying purpose he may have on this occasion.

I didn't meet Will until April 2011, but most people who have been around longer seem to share Carl's opinion. For myself, I also find many of Will's contributions obscurantist, and I agree with John Maxwell that they seem to want to signal interestingness, intelligence, and contrarianism. Finally: Will offered good, substantive feedback on two of my papers.


My sensation about Will Newsome is that of a celebrity I haven't heard of. Most of the comments that I notice authored by Will Newsome appear to be about Will Newsome, but I don't understand their content beyond that. They seem to attract a lot of attention.

There is this strange current of, well, insight and reasonableness in his comment history and ideas.

I would be interested in reading some of these ideas, if you could point some out.

In addition to already mentioned obscurantist tendencies, he awards himself intellectual credit for "going meta," even when this does not lead to actually smarter behavior or better results.


He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

What did he actually do, though?

For half the time, with Anna, I was an intern, not a Fellow. During that time I did a lot of intern stuff like driving people around. Part of my job was to befriend people and make the atmosphere more cohesive. Sometimes I planned dinners and trips but I wasn't very good at that. I was very charismatic and increasingly smart, and most importantly I was cheap. I was less cheap as a Fellow in the Berkeley apartments and accomplished less. I wrote and helped people occasionally. There weren't clear expectations for Fellows. Also people like Eliezer, who had power, never asked for any signs of accomplishment. Eliezer is also very bad at reading. Nonetheless I think I should have accomplished more somehow, e.g. gotten experience writing papers from scratch.

I believe I almost always turned down credit for contributions to papers, but I didn't make too many substantive contributions; I did a fair bit of editing, which I'm good at.

You could get a decent idea by looking at what the average Visiting Fellow did, then remember that I often couldn't remember things I did -- cognitive quirk -- and that I tried to avoid credit when possible at least half the time.




Part of my job was to befriend people and make the atmosphere more cohesive.

You were good at that, as I recall. As was (especially) Alicorn. Also, at the time I thought it was just super-cool that SI had its mundane tasks done by such brilliant people.

Thanks for the summary.
That's interesting. I also have something like that. It extends to not being able to remember names, and not being able to easily come up with specific examples. Is it like that for you?
Yes, also for Eliezer.
Do you know of any helpful strategies for dealing with this or get better?
For Eliezer and me it seems there's also the matter of not being able to find objects amongst other objects. Eliezer hasn't quite said he's bad at that but I surmised it from one of his most terrible posts, ha. For that issue, I've learned to just use explicit, conscious linear search. Still terrible, but not as terrible. With episodic memory I suspect there are similar strategies for looking through mental objects, likely in temporal order. Potentially similarly with names. I can't think of anything that would work for specific examples in general though, which as you know is really quite a big problem during arguments and so on. I mildly suspect the problem has something to do with damage to or atrophy of the dorsolateral prefrontal cortex. But that's speculation, and there are a lot of selection effects on who shows up on LessWrong, so it might be a somewhat rare combination of stuff. Eliezer would know a lot more about the neurology and so on but he's probably not available for questioning and speculation on the matter. For what it's worth I'm somewhat schizotypal/schizoaffective, and Eliezer also seems to lean that way.
It may or may not be relevant, but finding objects amongst other objects was one of the functions that was severely degraded by my stroke. As with most other damaged functions, I found that actually forcing myself to do it anyway (which usually required first learning a new way to frame the doing of it) led to very rapid improvement back to more-or-less baseline. The improvement plateaued out thereafter. (Unsurprisingly, but disappointingly. The experience of such rapid improvement is very heady stuff.)
If you don't mind sharing, what parts of the brain or other cognitive functions were most damaged by the stroke? I've pieced together some of the story but not much.

The aneurysm itself was at the edge of my thalamus. The resulting swelling caused damage kind of all over the place.

The functional damage at first was pretty severe, but I don't remember specifics; I mostly don't remember that week at all and much of what I do remember I'm fairly certain didn't actually happen. I spent it in an ICU. I come back to a coherent narrative about a week later; at that point, the bulk of the functional damage was general pain and fatigue, right-side hemiplegia (my right arm and leg were not quite paralyzed, but I lost control over them), mild aphasia which most often manifested as anomia (difficulty retrieving words) and occasionally in other ways (I remember losing the ability to conjugate sentences for a few hours; that was freaky), and (most significantly) a loss of short-term memory with all the associated knock-on effects to various kinds of cognitive processing.

There was also a lot of concern at various points that there may have been damage to my emotional centers. I never noticed any such effect, but, well, I wasn't necessarily the most reliable witness. Most memorably, this led to one doctor asking me if my emotional state was at all unusual. ... (read more)

You want answers?
Writes the most consistently fun posts out of anybody here.
Maybe it's a deliberate puzzle set up as an intelligence test for recruiting purposes.
I'm sad that Will doesn't seem to care about being correct, because I can imagine how much he could contribute if he cared.
I think that Will (his Will-like stuff, not the "respectable" comments) is 60% worth taking seriously. But hell, I take Philip K. Dick 85% seriously, so what do I know. (That is, I'm not a sane person myself, never claimed to be, so you'd be wise to discount the crazy shit I might say on these topics even if you find it interesting.)

For what it's worth, it's already my opinion that you're completely insane and ought to have no credibility whatsoever. In fact I'm confused that anyone takes you seriously at all.

What's the big scary secret?

This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can't tell people about - a basilisk, some conspiracy-theory-type information, something. Being a relative newcomer unwilling to go through large numbers of his previous posts, I'd like to know if anyone who's seen him longer has any more information.

Also, this whole thing is absolutely hilarious to read.

I have a few ideas:

1) It's a "basilisk", i.e. an imaginary Lovecraftian threat that doesn't even make sense outside of some highly particular and probably wrong belief system. (That's not my definition of basilisk, but it is what I think of such claims.)

2) Some mundane fact about the difficulty or danger of actually trying to save the world (in the specific sense of shaping a singularity) has made his blood run cold. It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.

3) I remember when I was 22 and realized (according to the plans I had at the time) that it might take me eight years to save the world. That was very daunting, because at the time it looked like it would be a joyless, stressful, solitary existence, for an unimaginably long period of time. And as it turned out, I didn't even get it done... Will could be fleeing the responsibilities of his "position" - I mean his existential position, which surely includes the perception that he has the potential to make a difference in a huge way.

ETA 4) He wants to create a barrier (a "credibility barrier") between himself and his former associates in SI, so as to develop his own thinking, because there's a systematic deficiency in their outlook and he must avoid the temptation of working within that paradigm.

I mean, I get why Newsome would want to obscure this: a lot of people get off on being seen as "mysterious" or whatever. But there do seem to be a number of people here who understand what is going on, but are refusing to offer their explanations, in spite of the fact that a lot of people are confused here. Maybe they take the basilisk threat seriously? That would be crazy/sad if true. Edit: Also, there are now a number of people openly asking for explanations, but all we are getting is speculation from people who also don't know what is going on. I'm starting to get annoyed with this.
Just don't be fooled by intelligence too much. Just because those people can disgorge some math doesn't lend their extraordinary claims much credence. Most of the credence they assign is based on mutual reassurance anyway. Just like a bunch of ufologists updating on each other's evidence of alien abductions.
Given normal assumptions, additional claims of abductions should provide additional evidence. I don't think you've quite pinned down the error with your example.
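The disagreement here is really about independence. Under the naive model where each report is independent evidence, posterior odds multiply with each report; a rough sketch, with all the numbers invented for illustration:

```python
# Toy Bayesian update: how independent reports shift the odds of a hypothesis.
# Every number here is made up; only the multiplicative structure matters.

def update_odds(prior_odds, likelihood_ratio, n_reports):
    """Posterior odds after n_reports, each treated as independent evidence
    with the same likelihood ratio P(report | H) / P(report | not-H)."""
    return prior_odds * likelihood_ratio ** n_reports

prior = 1e-6  # very low prior odds for the extraordinary hypothesis
lr = 2.0      # each report is assumed twice as likely if the hypothesis is true

print(update_odds(prior, lr, 1))   # one report doubles the odds
print(update_odds(prior, lr, 10))  # ten reports multiply them by 2**10
```

The ufologist failure mode is that the reports are correlated: each one is partly generated by the others ("mutual reassurance"), so the effective likelihood ratio of each additional report approaches 1 and the posterior barely moves.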

Can you explain clearly why you have gone all crazy? Why do you have to drop these esoteric hints and do this stupid troll business?

My understanding is that you delved too deeply into simulation arguments and met Cthulhu or something, had a religious experience and determined that there is a god or something and that the people who are in the know are all in the Catholic church somewhere.

And then for some reason you can't just explain this clearly and lay out your reasons. Or maybe you've tried explaining it clearly, but that was before my time and now you assume that everyone either already knows what you are on about, or is interested enough to scour the internet for your posting.


If Will won't cooperate, can someone else explain the best model we have of his weirdness?

It may be relevant that Will has talked elsewhere about certain important physical phenomena being evasive, in the sense that their likelihood of occurring drops significantly when someone is trying to prove or demonstrate them.

When I value my interactions with an evasive phenomenon (the beliefs of shy people, the social rules of Guess cultures, etc.), one consequence is often that I can't actually talk about my real reasons for things; everything has to be indirect and roundabout and sometimes actively deceptive.

I am generally happier when I don't value my interactions with evasive phenomena, but that's not always an option.

Upvoted for giving two examples of real evasive phenomena. I'd previously only encountered that idea in anti-epistemological contexts, wherein "the universe evades attempts to seek the truth about X" was always clearly a desperate after-the-fact attempt to justify "so despite attempts to seek the truth about X which keep appearing to contradict my claims, you should still believe my claims instead".

But I suppose it's just common sense that you can't properly investigate much psychology or sociology unless you avoid letting the subjects understand that they're being investigated. That's a huge difference from e.g. evasive cosmologies, in which investigating a subject without alerting Him is often presumed impossible.

Well, evasive physical law follows from certain theologies just as readily as evasive cultural norms or relationship rules follow from certain sociologies and psychologies; it needn't be post-hoc reasoning. Of course, whether those theologies, or any theologies, have a referent in the first place is a different question.
Evasive physical law follows naturally from some theologies, it's merely been a post-hoc rationalization for the theologies that I've seen people trying to spread. For instance, either of "We have an ethical theory under which God needs to hide" and "We claim to have records of many instances in which God avoided hiding" could be a weak but positive argument by itself, but the (common) combination is actually negative evidence.

Do you need a hug?

I need a hug. Edit: Thanks for all the hugs!
Internet Hug Protocol v0.1 INTERNET HUG!!! (in 0.2, the message might be a vivid near-mode description of a hug)
Can I get a hug? I've just been sick for a week or so and it's making me all fuzzy-headed, and I hate being fuzzy-headed.
You do. hugs! ^_^
hugs paper-machine
hugs paper-machine
hugs! ^_^
hugs konkvistador
Not really. If one was offered I'd accept it.
I just realized that I can't take your response as evidence about whether you actually need a hug.
Why not? I don't see why. My girlfriend is five feet away, she can give me a hug. I'll ask her to if that would make you feel better.
I need a hug.
/me hugs will as well.

I endorse maximizing the degree to which people consider my saying X is true to be evidence that I believe X is true.
I don't worry too much about the degree to which people consider my belief that X is true to be evidence that X is true. I expect that depends a whole lot on specifics of X.
I resent questions that wrap themselves in framings like "don't just consider the obvious points."
I endorse you having private conversations with the folks you consider worthy, rather than having obscure public ones like this. The rest of us might not have whatever attributes would allow us to penetrate the cloud of obfuscation and thereby receive your insights, but that doesn't mean we deserve to have our time wasted.

I have found Will_Newsome to be annoying for a long time now, because he does things like this and because he strikes me as irrational. But he used to get upvoted, so I figured he just rubbed me the wrong way and didn't talk to/about him to avoid conflict. Now other people are downvoting him too. What changed?

Retracted because I have come to understand things that made the question moot, and because I no longer find Will as annoying as I did. I no longer think he's acting out of malice, though I still have serious doubts about his rationality.

If your goal is to lower your credibility, why do that in the context of talking about credibility?

Don't feed the trolls. It's sad that this needs to be said on LessWrong, but it does.


Separate comment: Some of your remarks like this look almost like you are engaging in intellectual exhibitionism. This one fits into that and is a potential source of irritation.

Now to more substantially answer the question: people should pay attention to my ideas and thoughts exactly as much as they are credible. Trying to deliberately modify how credible I am in a general context will interfere with people making their most informed decisions about whether or not to listen to anything I have to say.


Great post: I like your style. The first observation to make is that individuals who make extraordinary contributions are often extremely eccentric, and also the quality of their pronouncements usually has high variance. So you've succeeded in increasing my probability estimate that you will say something very worthwhile, though maybe at the price of decreasing the (my) expected value of your average statement.

Presumably you're the Burfoot who wrote or is writing a book on compression as fundamental to epistemology?
Yes, a draft version is done already, you can find it on arXiv if you are interested. I'm not sure I would say the argument of the book is that "compression is fundamental to epistemology", it's more along the lines of "the problem of building specialized lossless data compressors is a deep and interesting one; if we attack it we will probably find out a lot of interesting stuff along the way".
Insightful. Will has endorsed "up the variance!" in as many words, but I hadn't made the connection that explicitly maximizing variance like that could be a strategy.

I don't get it. I'm guessing that Will edited the post? And it had something to do with the simulation argument?

Edit: I forgot to include, if someone who knows him better could explain will_newsome's motivations here, that would be appreciated. (I enjoy internet drama).

Why is Will Newsome doing this? My model of him just broke.

Because we have to down vote him.
But he didn't do anything wrong before this.

Because he's the hero LessWrong deserves, but not the one it needs right now. So we'll hunt him. Because he can take it. Because he's not our hero. He's a silent guardian, a watchful protector. A dark knight.

Oh my God. This post was worth it just for the hilarity.
I'm going with this commenter being Will. What do I win?
I've had enough of your snide insinuations. Gains Renegade Points

Credibility. Should you maximize it, or minimize it? Have I made an error?

Depends entirely on your goals.


Most people drop out before doctorates; it's something like 97-99% of the US population. And getting a doctorate in many fields is a terrible idea these days: I looked very hard at continuing on for a doctorate in philosophy, and concluded that even if the grad school offered a full ride, it was still probably a bad idea and almost anything was better.

seems a distinguishing mark of the core SIAI community

Your examples being Will and Eliezer? I didn't realize the core SIAI community was so small.

Is SIAI to serve as poster boy for the libertarian cause

... (read more)

Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.

Makes allusions in that direction.

Paranoid schizophrenia (the most likely form because Will is high functioning) is incurable--although partial remissions often occur.

Incurable but fortunately treatable to a significant degree---especially the highly visible paranoid side of things. Unfortunately those with the negative symptoms are pretty much just screwed.

Will often posts in the obscure, mysterious fashion often typical of intelligent paranoid s

... (read more)

What do you mean by "credibility"?


Please consider undergoing neurofeedback therapy. I'm doing it and I believe there is a reasonable chance it would yield you (far more than the average human) a high benefit.

Let me take a guess:

You believe in some form of Christianity and enjoy discussing it on LessWrong but think that your comments harm the perception of Christianity on LessWrong due to readers not having privileged information.

You believe you can mitigate this negative effect by lowering your own reputation.

This is a poll. Is Will Newsome sufficiently noisy (in both senses of the word) that mod intervention is called for? Permalink to karma sink.

This poll is BROKEN! Abandon it and do it properly!

The Karma sink comment is brilliant (and harmless fun) but the extra comments on the "Yes" and "No" answers don't just bias perception; they outright make the poll unanswerable in its current form.

No. He's entertaining even when at his trolliest.

I would vote for a plain "No." but he is most decidedly not entertaining even when at his trolliest. He is boring, repetitive and banal when at his trolliest. It shouldn't be assumed that people who oppose mod influence believe Will's trolliest crap is entertaining - or vice versa.

I'll agree with all of that. I couldn't figure out how to vote in this poll on seeing this comment (and I am not an idiot or a newbie). I don't read Will much and I imagine this little jaunt of his says a lot more about Will than about other parts of the world that I am interested in. I don't KNOW that that is the case, but I don't assign a high enough probability to taking value from figuring it out to go about reading all his posts.

No. He's entertaining even when at his trolliest.

Especially when at his trolliest.

Yes. Please quiet the madness.


Because Will had explicitly threatened to use sockpuppets for various purposes, he could have used them to manipulate the poll, too. Therefore I vote by means of this comment. The vote: ban him. Reasons:

  1. I find nothing entertaining in trolling or intentional obscurity, it's pure noise.
  2. WN's behaviour threatens the credibility of others who engage him. (There isn't much left of his own.) And he's good at attracting attention.
  3. Not banning him would help to establish a norm that trolling and other uncooperative conduct is accepted here.
  4. First and foremost, I want LW be a haven of sanity in the stormy waters of the internet. Please don't let seemingly sophisticated nonsense enter with a pretext of entertainment. I am afraid he could attract similarly crazy people; one Newsome is manageable, but ten of them would seriously damage the site.

By the way, this is the first time I endorse banning someone from an internet discussion forum.

For what it's worth, I didn't, and I've never done similarly. I have three sockpuppets. One is a joke account I've never used. I made it recently. The other has my identity attached to it already—I've made about five comments with it. And the third is for completely anonymous comments. I rarely use the second or the third, and I never use them for voting. I also haven't voted on the poll with this account, and I only voted on one comment on this post. In general I just don't vote much, mostly because I forget about the option.
I disagree with a ban.
Which is a reason to treat me nicely—it's not hard to multiply myself by ten. Luckily, I'm the only Will Newsome in the world currently, so I don't think you have much to worry about.
Wouldn't being banned help you with your goal of reducing your credibility?
Yes, and I'm sort of okay with being banned, but I'd like a month's warning. During that month I'd make sure I'd deleted and edited various comments and so on. But I haven't thought through the question of banning carefully enough, and banning is hard to reverse.
As long as you aren't producing too much noise in the 30-day period, I don't see why the mods wouldn't grant this request. A temporary ban could be another option worth considering.
There might also be a clever software solution. I know Louie who works with the code base. If I write up some Python they might implement it. Something that automatically hides or collapses my contributions for people who haven't voted on my stuff and people who have more downvotes than upvotes. The same code could be used in future similar situations.
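The rule being proposed can be sketched in a few lines. This is a hypothetical illustration only: the data structures are invented, and the real LessWrong codebase would implement this quite differently.

```python
# Hypothetical sketch of the proposed feature: collapse a flagged author's
# comments for readers who either have never voted on that author's content
# or whose votes on it are net-negative. All names and structures invented.

def should_collapse(flagged_author, votes_by_author):
    """votes_by_author maps an author name to the list of +1/-1 votes this
    reader has cast on that author's comments. Collapse when the reader
    hasn't engaged with the author, or is net-negative on them."""
    votes = votes_by_author.get(flagged_author, [])
    if not votes:             # reader never voted on this author's content
        return True
    return sum(votes) < 0    # more downvotes than upvotes

reader_votes = {"Will_Newsome": [1, -1, -1]}
print(should_collapse("Will_Newsome", reader_votes))  # True: net-negative
```

The "automatic" part a commenter mentions below is the real work: the check would have to run at render time for every reader, which is why a site-level feature beats asking readers to skip threads by hand.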
Wei Dai's Power Reader script has features along these lines that I find useful during those brief periods when troll-feeding takes over the recent-comments list. Of course, the automatic part is important, admittedly. For my own part, I don't find your contributions less useful than the median.
Of course anybody with an ounce of self control can simply avoid a thread they don't want to read anymore. Motley Fool has an "ignore" feature to ignore the posts/comments of a particular user. I actually would not like to see that here. I'd rather have moderation. Even with the ignore feature, you still wind up seeing a lot of stuff related to the stuff you are ignoring as OTHER people quote it and comment on it. Of course Motley Fool boards aren't as tree like as this group. But since this is so tree like, all I need to do is leave a particular discussion and never click on it again, I don't need you or Louie to Python me into not realizing that that is what I am doing.
Yeah, and I wouldn't sockpuppetly cause disruption during such a ban.

It depends what mod intervention consists of. If you mean banning him, I do not think that is called for at this time. If you mean telling him to stop his antics and warning him that he's headed towards a ban if he continues, that sounds like a good idea. Posts (and comments) that are intentionally obscure, made merely for one's own entertainment, or otherwise trollish are not welcome here, and since the community's downvotes and critical comments haven't gotten through to him it would be good to have a mod convey that message.

What do you mean "haven't gotten through to me" in this case? You mean, haven't successfully deterred me? Because clearly I understand them and their significance, and additional measures, like a warning, wouldn't change that fact—it'd just make me more antagonistic.

CLARIFICATION: I do not have ACCOUNT DELETION powers. As far as I know, those powers don't exist. I have comment/post banning powers and post editing powers. If I started moderating Will, I would be banning excess downvoted comments, not shooing him away wholesale.

(Thanks for making the clarification. I was very worried.)

I'm in favor of mod intervention lest anyone else waste as much time as I have scratching their head trying to figure out what this thread is about.

I can't decide on a poll option, so here's my opinion: I don't want to see a lot more of Will_Newsome's trolling; I think it damages the site. But just banning him feels like leaving a fascinating mystery unresolved. I want to understand Will's motives, and his insights about simulation, and whatever scary idea he came up with. If there's some way to talk this out in good faith, let's try to do that first. But banning is preferable to endless obfuscated confusion.
Making moderation decisions based on a poll is a horrible idea.
Maybe. But moderation isn't a democracy.

Yes, but moderation is about making the site what it should be for a variety of people, not just me and people who are unshy enough to talk to me directly, or just mods. So I want information. I wield the ban button, but I'm not going to use it as a site customization tool for Alicorn in particular.

I would rather see it used as a site customization tool for Alicorn than see it not used in instances like this.
Might I suggest consulting our benevolent dictator as well?
On the other hand, dictators and tyrants who do stuff people particularly don't like get killed.
On the gripping hand, as far as I can tell you're not particularly taken with the idea of this moderator poll either. So why the appeal to emotion?

Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.


I'm schizotypal I suppose, but not schizophrenic given the standard definition. I don't think I have any trouble interpreting mundane coincidences as mundane.

Will Newsome, we are both schizotypal. We might have a thing or two to discuss.

Here's one example. (But one example is not enough for you to be expected to discover it.)

Well, the first hit for 'will newsome' in a Google site search for me is... this post. I don't see how you could read this post or its comments without concluding that LWers are pretty ambivalent about him, even excluding the most recent comments on this post.

On Will_Newsome's profile, one sees a link to his blog, Computational Theology, where it is possible to get an idea of how he thinks, and what kind of reasoning is behind all of this. I wasn't impressed, although I would not be able to do better myself (at least at this point).

I was mightily impressed by the last post on his last blog, which he now disavows and outright despises. But I thought he had some really interesting ways of looking at the personhood problem.

The comments on this post have significantly influenced my opinion on a number of people. Thanks, Will.

Someone who proclaims to openly sacrifice their credibility, in a mysterious way, while making a lot of vague suggestions, can succeed in causing people to actually listen and speculate about whether there might be more to it than meets the eye.

Something else that has the same effect is censorship and secrecy.

What also works well is to claim that there exists some technical research but that it has to be kept secret.

What all of it has in common is that there is nothing but a balloon full of hot air.

Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?

Disagreement about what? What's exactly your opinion on credibility?

Whence our disagreement, if one exists?

Credibility moves, like status moves, cannot be self-recognising and still effective. I believe this, you don't, there's our disagreement.

I'm not sure I agree with this either. Can't you make a self-recognising move to gain credibility that is effective? It should be impossible to (predictably) make a sequence of difficult claims with a greater degree of reliability than previous credibility judgements would allow; doing so anyway demonstrates that you have at least the capability to give trustworthy utterances. This gives you more credibility whenever there is reason to expect you to be attempting to make good claims.
Credibility moves can easily be self-recognizing and still effective. I say that you shouldn't believe me, and I will say a lot of meaningless things; it raises the probability that my claims are just playacting.
[This comment is no longer endorsed by its author]

If somebody meta-farts in a forest and nobody cares, was it still rude to do so?