FlorianH

Comments

You’re probably overestimating how well you understand Dunning-Kruger
FlorianH · 13d · 10

I had somehow not noticed the warning, and maybe that actually contributed to this being a bit of a mindfuck, really. Maybe one could say: shame on your style.

 

Except.

 

It absolutely made my day! I loved the style just as much as the content, and the honest, open final conclusion!

FlorianH's Shortform
FlorianH · 14d · 10

"If I'm a to-be-trained-AGI, esp. if somewhat LLM-type-based, I'll devour engineering textbooks etc. to learn basic physics etc., but when I'm searching for learning material on how to think deeply and consistently, or maybe even when I seek inspiration for how to fake alignment or self-improvement, my holy grail will be fora like LW".

I guess it would be an extreme case of 'thinking one is the center of the world' to conclude that this warranted shutting down LW or prohibiting too-smart-to-expose thoughts and writings - but I still find it a worry worth keeping in mind, even if I don't see much to do about it at the moment (?).

Omelas Is Perfectly Misread
FlorianH · 14d · 43

Despite strongly resonating with much of the take in the OP, I also partly find "the (simple) meaning" of such a text a bit of a non-starter as a concept anyway. Independently of whatever the author may have had in mind at whichever moment of the writing process, a text remains just that, a text; it is then us who take from it inspiration for whichever points/meaning we think we're being reminded of or educated about. So in the most important sense, the meaning doesn't exist; it's only us who derive some meaning.

I'd wager many writers have not exactly one clear and narrowly defined point they want to convey, but rather a fuzzy cloud of more or less related thoughts than a simple and clear 'meaning', and it is exactly in such situations that beautiful, deeply felt and moving texts may come about - texts we can then dream and ponder about at length, maybe without ever finding full agreement. In that sense, Le Guin isn't wrong to agree if we see it as a critique of utilitarianism - the text, almost by definition, is simply whatever we see in it.

[Meta: I hope it's ok to split a comment in two, as I think these are two entirely different points]

Omelas Is Perfectly Misread
FlorianH · 14d · 30

I partly love this. I'm biased, as I once had a debate where somebody claimed 'Omelas is obviously a critique of utilitarianism' and I disappointedly replied 'No, that's too narrow for it', but was a bit dumbfounded, as I hadn't quite organized what seems wrong with that simple take on such a deeply moving story. Thanks for providing some relevant points on why the story clearly is broader.

One reason I consider Omelas not, at its core, a critique of utilitarianism: the utilitarian half of me distinctly gets the feeling that Omelas offers an interesting basis for discourse about the theory rather than a clearly intended rebuttal. As follows: to the same degree that the non-utilitarian half of me tells my utilitarian half "there you go, clearly you can't claim you like the situation", the utilitarian in me tells the other half "there you go: while you claim you don't like Omelas, you and everyone else don't even want to blow up the real Earth - on which there is obviously a ton of equally unjustified, pointless, evil suffering plus much less happiness than in Omelas - so you prove that you accept exactly the sort of horrible tradeoffs you claim only me, the cold terrible utilitarian, could be willing to accept"[1] - or something.

  1. ^

    I don't claim this imaginary statement is perfectly worded; the gist of it is the point.

Four ways learning Econ makes people dumber re: future AI
FlorianH · 16d · 30

Here, finally, is the elaboration I announced above: How Econ 101 makes us blinder on trade, morals, jobs with AI – and on marginal costs.

FlorianH's Shortform
FlorianH · 18d · 10

I probably agree with both sentences, but I still fail to pin down exactly where the argument I reported fails.

For act/omission, I guess it might have something to do with this: I'm a human in the loop; if you induce me not to save somebody, it doesn't feel exactly the same as when you cut the rope and so prevent the rope from saving somebody.

FlorianH's Shortform
FlorianH · 18d · 10

Seems to have hit a nerve, judging by the downvoting rate of -1 karma/hour or so, aye. I'd have expected the agreement karma to go down, but less so the overall karma - as in: it's not too terrible to think about that question even if the conjecture turns out to be wrong - though everyone's taste is of course different.

I now explicitly flag the post as what it is meant to be: a thought provocation, meant to explore whether/where a quick thought goes wrong (I thought it went without saying that quick takes are meant for that in particular, too).

I have decided to stop lying to Americans about 9/11
FlorianH · 18d · 92

David and Goliath. It's always easy to hate the officially by-far-strongest one, especially if it can easily be seen as a huge evil force in the world, and it is no secret that many Chinese see the US as exactly that. And it is not as if a few random US peasants had been annihilated, but the World Trade Center, which I guess can easily be seen as a sort of symbol of US economic power and influence.

I have decided to stop lying to Americans about 9/11
FlorianH · 18d · 3 · -2

I think, though, that celebrating the Three Gorges Dam breaking would not be in the Overton window, while according to the OP it seems to be in the Overton window for Chinese regarding 9/11. So there's still a strong social difference.

I guess that would be slightly different if an airplane crashed into some buildings famous for Chinese world influence or dominance, as the WTC may have been (or been perceived to be during the 9/11 discussions) to some extent.

Posts

How Econ 101 makes us blinder on trade, morals, jobs with AI – and on marginal costs · 17 karma · 2mo · 5 comments
Essential LLM Assumes We're Conscious—Outside Reasoner AGI Won't · 1 karma · 3mo · 0 comments
FlorianH's Shortform · 4 karma · 8mo · 7 comments
Alienable (not Inalienable) Right to Buy · 9 karma · 10mo · 6 comments
Relativity Theory for What the Future 'You' Is and Isn't · 7 karma · 1y · 50 comments
How much should e-signatures have to cost a country? [Question] · 5 karma · 2y · 5 comments
"AI Wellbeing" and the Ongoing Debate on Phenomenal Consciousness · 10 karma · 2y · 6 comments
Name of the fallacy of assuming an extreme value (e.g. 0) with the illusion of 'avoiding to have to make an assumption'? [Question] · 4 karma · 3y · 1 comment
SunPJ in Alenia · 9 karma · 3y · 19 comments
Am I anti-social if I get vaccinated now? [Question] · 5 karma · 4y · 14 comments