MondSemmel

Recommendation for everyone who reported bugs in this comment thread: Bug reports are quicker to do, and much easier to understand for devs, if you accompany them with screenshots or gifs of the bug behavior. There are a bunch of screen capture tools which can very easily record a gif and upload it to the internet.

For Windows I've recommended ShareX here before; whereas for MacOS I've heard that e.g. CleanShot might be able to do this (though I haven't tried that myself). I'm not aware of any Linux versions, but they'll surely exist, too. Plus nowadays I suspect that even the inbuilt screen capture tools on each OS can record gifs, and even if they can't automatically upload them, you can instead just drag the final gif into the LW comment.

I also frequently make typo comments, and this problem is why I've begun neutral-voting my own typo comments, so they start on 0 karma. If others upvote them, the problem is that the upvote is meant to say "thanks for reporting this problem", but it also means "I think more people should see this". And once the typo is fixed, the comment is suddenly pointless, but still being promoted to others to see.

Alternatively, I think a site norm would be good where post authors are allowed and encouraged to just delete resolved typo comments and threads. I don't know, however, if that would also delete the karma points the user has gained via reporting the typos. And it might feel discouraging for the typo reporters, knowing that their contribution is suddenly "erased" as if it had never happened.

A technical alternative would be an archival feature, where you or a post author can mark a comment as archived to indicate that it's no longer relevant. Once archived, a comment is either moved to some separate comments tab, or auto-collapsed and sorted below all other comments, or something.

You may be right regarding what new users care about (usually one registers on a site to comment on a discussion, for example), but the problem is that from that perspective, LW is definitely about AI, no matter what the New User's Guide or the mods or the long-term users say. After all, AI-related news is the primary reason behind the increased influx of new users to LW, so those users are presumably here for AI content.

One way in which the guide and mod team try to counteract that impression is by showing new users curated stuff from the archives, but it might also be warranted to further deemphasize the feed.

All the typo comments are great, but the resolved typos are mixed in with open feedback. Is it possible to hide those or bundle them together, somehow, so they don't clutter the comments here?

A few points.

  1. This might be conflating "what this site is about" with "what is currently discussed". The way I see it, LW is primarily its humongous and curated archives, and only secondarily or tertiarily its feed. The New User experience includes stuff like the Sequence Highlights, for example. If there's too much AI content for someone's taste (there certainly is for mine), then a simple solution is to a) focus on the enduring archives, rather than the ephemeral feed; and b) further downweight the AI tag (-25 karma is nowhere near enough).
    1. That said, it might be warranted for the LW team to adjust the default tag weights for new users, going forward.
  2. Rationality is closely related to cognition and intelligence, so I don't think it's as far or distinct from AI as would be implied by your comment. AI features prominently in the original Sequences, for example.
  3. You registered in 2020. Back then, a new user might have asked whether the site is supposed to be about rationality, or rather about Covid.

What would a better debate look like? Below is a speculative list of ways to structure a debate so that it finds and explains more truth.

You've noticed why debates suck for truth-seeking, but we seem to use them anyway. So I'm suspicious that the main purpose of debates isn't actually truth-seeking (cf. Robin Hanson's "X is not about Y"), but rather something else, like status contests.

In which case it's all well and good to propose conversation schemas which are better for truth-seeking, but I'd rather call them by some new term, rather than "debates".

Ah. I had been wondering what the actionable implications of this model were supposed to be. After all, it does not seem useful for truth-seeking to adopt a strategy of assuming that in every blank spot in your territory, there's actually an invisible dragon. With this comment, things make more sense to me.

That said, if shouting is fine, then the Dark Forest analogy seems misleading, and another metaphor (like DirectedEvolution's Faerie Forest) might be more apt.

You've made a bunch of great comments in this thread. Have you considered turning them into a top-level post on LW and/or the EA forum? You've already done the laborious part of writing all this stuff up, after all. From my perspective, the only things missing to turn them into a post would be to add a bunch of headings, plus maybe an intro paragraph, and to address a general audience rather than jasoncrawford specifically.

This logic can be taken too far (I don't see the point of feeling constantly anxious), but at least on an intellectual level, I think it does make a certain amount of sense. It's hard to notice the insanity or inadequacy of the world until it affects you personally. Some examples of this:

  • People buy insurance to be safe from <disaster>, but insurance companies often don't want to pay out. So when you buy insurance, you might incorrectly feel safe, but only notice that you weren't if a disaster actually happens.
  • If you've never been ill, then it's easy to believe that if you got ill, you could just go to the doctor and be healed. Sometimes things do work that way. At other times, you might learn that reality is more complicated, and civilization less competent, than previously thought.
  • I think the Covid pandemic, and the (worldwide!) inadequate policy response, should've been at least a bit traumatizing to every person on this planet. Not necessarily on an emotional level, but certainly on an intellectual level. There's a kind of trust one can only have in institutions one knows ~nothing about (related: Gell-Mann amnesia), and the pandemic is the kind of event that should've deservedly broken this kind of trust.