All of Autolykos's Comments + Replies

It's probably one of the many useful functions of the court jester :)

1Matt Vincent1y
It's useful until the jester gains a reputation as someone whose views shouldn't be taken seriously, at which point the jester's dissent may begin to have the opposite effect.

Even a more sane and more continuously distributed measure could yield that result, depending on how you fit the scale. If you measure the likelihood of making a mistake (so zero would be a perfect driver, and one a rabid lemur), I expect the distribution to be hella skewed. Most people drive in a sane way most of the time. But it's the few reckless idiots you remember - and so does every single one of the thousand other drivers who had the misfortune to encounter them. It would not surprise me if driving mistakes followed more-or-less a Pareto distribution.
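
That skew is easy to sketch. The shape parameter below is arbitrary and none of this is data - it's just an illustration of how a Pareto-ish tail concentrates the mistakes in a few drivers:

```python
import random

random.seed(0)

# Hypothetical illustration: sample "mistakes per year" for 10,000 drivers
# from a Pareto distribution (shape alpha=1.5 is an arbitrary assumption),
# sorted from most to least reckless.
drivers = sorted((random.paretovariate(1.5) for _ in range(10_000)), reverse=True)

total = sum(drivers)
top_1_percent = sum(drivers[:100])  # the 100 most reckless drivers
print(f"share of mistakes from the worst 1%: {top_1_percent / total:.0%}")
```

In a world where everyone drove equally badly, the worst 1% would cause 1% of the mistakes; with a heavy tail they cause a large multiple of that, which matches the "few reckless idiots" intuition.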

There probably was a time when killing Hitler had a significant chance of ending the war by enabling peace talks (allowing some high-ranking German generals/politicians to seize power while plausibly denying having wanted this outcome). The window might have been short, and probably a bit after '42, though. I'd guess any time between the Battle of Stalingrad (where Germany stopped winning) and the Battle of Kursk (which made Soviet victory inevitable) should've worked - everyone involved should rationally prefer white peace to the very real possibility of a bloody stalemate. Before, Germany would not accept. Afterwards, the Soviets wouldn't.

1AlexanderRM8y
It's also worth noting that "I would set off a bomb if it would avert or shorten the Holocaust, even if it would kill a bunch of babies" would still answer the question... or maybe it wouldn't, because the whole point of the question is that you might be wrong about it ending the war. Compare "I would set off a bomb and kill a bunch of innocent Americans if it would end American imperialism", which has a surprising tendency not to end American imperialism, and in fact to make it worse.

Overall I think that if everyone followed a heuristic of "never kill babies", the world would be better on average. However, you could get a problem if only the carefully moral people follow that rule and the less careful don't, and end up winning. For a consequentialist, a good rule would be "any ethical injunction which causes itself to be defeated cannot be used". At the very least, the heuristic of "don't violate Geneva Convention-like agreements restricting war to make it less horrible, as long as the other side has stuck to them" seems reasonable, although it's less clear for cases where a few enemy soldiers individually violate it, or where being the first to violate it gives a major advantage and you're worried the other side might do so.

Yup. Layer 8 issues are a lot harder to prevent than even Layer 1 issues :)

While air gaps are probably the closest thing to actual computer security I can imagine, even that didn't work out so well for the guys at Natanz... And once you have systems on both sides of the air gap infected, you can even use esoteric techniques like ultrasound from the internal speaker to open up a low-bandwidth connection to the outside.
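
The ultrasonic trick can be sketched in a few lines. The frequencies and bit rate below are made-up values, and this only synthesizes the waveform a compromised machine could play over its speaker - it doesn't exfiltrate anything:

```python
import math

# Toy sketch of an acoustic covert channel (not a working exploit):
# encode bits as two near-ultrasonic tones (frequency-shift keying).
RATE = 44_100            # samples per second
BIT_SECONDS = 0.05       # 20 bits/s - covert acoustic channels are slow
F0, F1 = 18_000, 19_000  # Hz; near the top of what cheap speakers can emit

def modulate(bits):
    """Turn a bit sequence into raw audio samples in [-1, 1]."""
    samples = []
    for bit in bits:
        freq = F1 if bit else F0
        for n in range(int(RATE * BIT_SECONDS)):
            samples.append(math.sin(2 * math.pi * freq * n / RATE))
    return samples

signal = modulate([1, 0, 1, 1])
print(len(signal))  # 4 bits * 2205 samples per bit = 8820
```

A receiver on the other side of the gap would run the microphone input through a filter bank at F0 and F1 to recover the bits - slow, but enough to leak a private key overnight.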

5DanArmak8y
Even if you don't have systems on the far side of the air gap infected, you can still e.g. steal private keys from their CPUs by analyzing EM or acoustic leakage. So in addition to an air gap, you need an EM gap (Faraday cage) and an acoustic gap (a room soundproofed for all the right frequencies). In general, any physical channel that leaks information might be exploited.

And some people would like to make it sit down and write "I will not conjure up what I can't control" a thousand times for this. But I, for one, welcome our efficient market overlords!

Where did you get the impression that European countries do this on a large enough scale to matter? There are separate bike roads in some cities, but they tend to end abruptly and lead straight into traffic at places where nobody expects cyclists to appear, or show similar acts of genius in their design. If you photograph just the right sections, they definitely look neat. But integrating car and bike traffic in a crowded city is a non-trivial problem; especially in Europe, where roads tend to follow winding goat paths from the Dark Ages and are way too nar...

-1taryneast8y
Where did you get the impression that by "it's far safer" I meant "it's far safer... than driving"?

I am completely ignoring your anecdotes - they cannot be taken for actual data. I have friends who have been in extremely dangerous car accidents. I have a friend who was killed in a car crash. Anecdotes are a bad idea here. I'd be happy with real data on the actual base rates of this stuff, and yes, perhaps the bike lanes are not sufficient to overcome the danger of riding off the bike lane. But I don't think it's quite as bad as you're making out.

It definitely depends on where you need to get to by bike... but my experience of riding in Perth was that I could ride from the outer suburbs to the city without going through traffic. The same goes for large portions of Sydney (once you hit the main bike routes along the freeways). If you're riding into the CBD but get off your bike before hitting the main CBD streets themselves (i.e. choose your route carefully), then you can get to a goodly portion of the city without hitting the (I agree) utterly ridiculous bike lanes... and that's before even considering Europe.

But yeah, if you have some real data, I'm happy to change my mind.

I know you intended your comment to be a little tongue-in-cheek, but it is actual energy, measured in joules, we're talking about. Exerting willpower drains blood glucose.

I don't know of studies indicating that introverts drain glucose faster than extraverts when socializing, but that seems like a pretty straightforward thing to measure, and I'd look forward to the results. At least, I can tell from personal experience that I need to exert willpower to stay in social situations (especially when there are lots of people close by or when it's lou...

2Richard_Kennaway8y
Some doubt has been cast on that theory (Googling /willpower glucose/ turns up various papers for and against), but besides that, someone reporting sensations is not reporting the physiological causes of those sensations, even if they have a belief about what those causes are.

There's an annual 100-mile bicycle ride in my home town [http://www.edp24.co.uk/news/can_you_spot_yourself_in_our_norwich_100_bike_ride_gallery_1_4094173] that gets above 3000 participants every year. There are 50- and 25-mile options, and perhaps only a minority do the full 100, but it's still a sizable number.

For anything one is serious about wanting to do, one will exert as great an effort as required. "Having to exert willpower" sounds more like not actually wanting to do whatever it is but grinding on with it anyway. It's the activity that's unenjoyed, rather than the effort.

There's another argument I think you might have missed:

Utilitarianism is about being optimal. Instinctive morality is about being failsafe.

Implicit in all decisions is a nonzero possibility that you are wrong. Once you take that into account, having some "hard" rules like not agreeing to torture here (or in other dilemmas), not pushing the fat guy on the tracks in the trolley problem, etc, can save you from making horrible mistakes at the cost of slightly suboptimal decisions. Which is, incidentally, how I would want a friendly AI to decide as well...

Exactly. Stocks are almost always better long-term investments than anything else (if mixed properly; single points of failure are stupid). The point of mixing in "slow" options like bonds or real estate is that it gives you something to take money out of when the stocks are low (and replenish it when the stocks are high). That may look suboptimal, but still beats the alternatives of borrowing money to live from or selling off stocks you expect to rise mid-term. The simulation probably does a poor job of reflecting that.
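
The "spend from the slow bucket when stocks are down" logic looks roughly like this, with made-up numbers and a hypothetical return sequence (nothing here is financial advice or real data):

```python
# Minimal sketch of the bucket idea: in years when stocks fell, cover
# living expenses from the bond bucket instead of selling depressed stocks.
stocks, bonds = 800_000.0, 200_000.0
spend = 40_000.0
stock_returns = [0.10, -0.20, -0.10, 0.25, 0.08]  # hypothetical sequence
BOND_RETURN = 0.02

for r in stock_returns:
    stocks *= 1 + r
    bonds *= 1 + BOND_RETURN
    if r < 0 and bonds >= spend:
        bonds -= spend   # don't sell stocks while they're down
    else:
        stocks -= spend  # good year: spend from stocks
        # (and optionally top the bond bucket back up here)

print(round(stocks + bonds))
```

The bond bucket earns less on average, but it buys you the option of never selling stocks at the bottom - which is exactly the value the simulation would miss if it only compares average returns.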

0PhilGoetz8y
That's no reason to tell someone with hundreds of thousands of dollars to put half of it in bonds. The market isn't going to stay down for 10 years.

Intelligence is basically how quickly you learn from experience, so being smart should allow you to get to the same level with much less time put in (which seems to be what the OP is hinting at). I'd also expect diminishing returns, especially if you always socialize with the same (type of) people. At some point, each social group (or even every single person) becomes a skill of its own. Once your generic social skills are at an acceptable level, pick your specializations carefully. Life is too short to waste it on bad friends.

0[anonymous]8y
I am not at all sure this is true (at the very least it depends on the type of learning; intelligent people often do not learn music or sports movements fast), but to the extent it is true, it would be very useful to send highly intelligent people through the equivalent of an obstacle course where they gain many different types of experience - like a month-long summer camp for gifted students with a different profession or difficult activity tried every day. Because, you see, we tend to have literally the opposite: usually the highly intelligent are shut-in savants who have hardly any experience that does not involve a computer, books or paper.

My thoughts exactly. The first commandment of multiclassing in 3rd is "Thou shalt not lose caster levels". Also, Wizards are easily the most OP base class, if played well. Multiclassing them into anything without wizard spell progression is just a waste.

OTOH, using gestalt rules to make a Wizard//Rogue isn't half bad, even if a little short on HP and proficiencies. I prefer Barbarian or even the much ridiculed Monk in place of the Rogue.

1[anonymous]8y
No offense to you guys, but this is why I don't play RPGs with other people. Instead of playing a role, almost everybody is trying to make "efficient", "overpowered" characters, as if it were some sort of competition you can win. I think entirely the other way around: I would make my character a wizard because and only because this career choice matches his personality, background and so on, and multiclass only when it looks like my char really would - and I would give no heed to efficiency and power. It would be the DM's job to match the difficulty level to our characters, not the other way around.

I will have to invent an RPG where all armor has the same AC and all weapons the same damage, so that players don't try to make overpowered optimization monsters but simply choose whatever matches a character's style, background, culture, or the player's general sense of coolness. Thus, for example, a player would be comfortable with a fighter character who wears no armor and carries only a rapier, because he is a D'Artagnan-type swashbuckler - that is his personality, background and style.

I suppose you already drew the obvious conclusion, but I still think it's worth spelling out:

The key to people liking you is making sure they feel good when you're around. Causality is secondary.

A quick google search found this:

Emma Chapman, Simon Baron-Cohen, Bonnie Auyeung, Rebecca Knickmeyer, Kevin Taylor & Gerald Hackett (2006) Fetal testosterone and empathy: Evidence from the Empathy Quotient (EQ) and the “Reading the Mind in the Eyes” Test, Social Neuroscience, 1:2, 135-148, http://dx.doi.org/10.1080/17470910600992239

I can't find a citation for the whole story right now, but as I remember it, it goes something like this: When the first wave of testosterone hits a male fetus, it kills off well over 80% of the brain cells responsible for e...

Only say things that can be heard. If you can anticipate that you are too many inferential steps away, you should talk about something else. Which means in this case: Be patient and build their knowledge from the bottom, not from the top.

If you have already started and notice the problem too late, yeah, you're kinda screwed. The honest answer seems pretty rude, and not saying anything is worse. I'd probably try to salvage what I still can by saying something along the lines of "I know this is a complicated and confusing issue, and it takes a while to ...

There is also something else going on here, which I realized after learning about personality types, especially Jung's theories and the Myers-Briggs Type Indicator. One dimension separates along the primary mode of seeing the world (Sensing vs iNtuitive), with the former ones collecting individual facts and strictly following isolated rules, and the latter ones always looking for the generalized principle behind the facts and questioning the origin and sense of rules.

These two types have a lot of trouble understanding each others' way of thinking and frequ...

Then he asked the wrong question. Straight up asking "Ougi, why did you decide on a formal dress code when this apparently has no meaning for your teachings?" is a different question from "Does wearing robes make us a cult?", and shows a different understanding of what the robes mean. The answer would still be deliberately confusing and enigmatic, but that's kinda the whole point of a koan.

Danger, wild speculation ahead: I'd assume it has something to do with the saying "Engineers can't lie." Constantly experiencing that doing things in violation of reality leads to failure, while at the same time hearing politicians lie pretty much every time they open their mouths and still get elected again and again (or fail to fail in some other way), could make quite a few of them seriously fed up with the current government in particular and humanity in general. Some less stable personalities might just want to watch the world burn at that point - which should make them recruitable as terrorists, given the right sales pitch.