rwallace

Comments

Slowing Moore's Law: Why You Might Want To and How You Would Do It

This used to be an interesting site for discussing rationality. It was bad enough when certain parties started spamming the discussion channel with woo-woo about the machine Rapture, but now we have a post openly advocating terrorism, and instead of being downvoted to oblivion, it becomes one of the most highly upvoted discussion posts, with a string of approving comments?

I think I'll stick to hanging out on sites where the standard of rationality is a little better. Ciao, folks.

Open Thread, February 15-29, 2012

Example: Most people would save a young child instead of an old person if forced to choose, and it is not just because the baby has more years left; part of the reason is that it seems unfair for the young child to die sooner than the old person.

As far as I'm concerned it is just because the baby has more years left. If I had to choose between a healthy old person with several expected years of happy and productive life left, versus a child who was terminally ill and going to die in a year regardless, I'd save the old person. It is unfair that an innocent person should ever have to die, and unfairness is not diminished merely by afflicting everyone equally.

Diseased disciplines: the strange case of the inverted chart

That would be cheap and simple, but wouldn't give a meaningful answer for high-cost bugs, which don't manifest in such small projects. Furthermore, with only eight people total, individual ability differences would overwhelmingly dominate all the other factors.
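As a rough illustration of why eight people is far too few, here is a Monte Carlo sketch. The effect size and ability spread are made-up numbers chosen only to show the shape of the problem: even when one methodology really is better, between-person variance leaves the experiment barely distinguishable from a coin flip.

```python
import random
import statistics

def simulate(n_per_group=4, method_effect=0.1, ability_sd=0.5, trials=10_000):
    """Estimate how often the group using the genuinely better methodology
    actually finishes faster, when individual ability varies far more than
    the methodology helps. All numbers are illustrative assumptions."""
    wins = 0
    for _ in range(trials):
        # Completion time = 1.0 baseline + personal ability noise,
        # minus a small benefit for group A's better methodology.
        group_a = [1.0 - method_effect + random.gauss(0, ability_sd)
                   for _ in range(n_per_group)]
        group_b = [1.0 + random.gauss(0, ability_sd)
                   for _ in range(n_per_group)]
        if statistics.mean(group_a) < statistics.mean(group_b):
            wins += 1
    return wins / trials

if __name__ == "__main__":
    # With only four people per group, the better method "wins" only
    # slightly more often than chance, so the experiment tells us little.
    print(f"better methodology looks better in {simulate():.0%} of simulated runs")
```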

Scott Sumner on Utility vs Happiness [Link]

Sorry, I have long forgotten the relevant links.

Diseased disciplines: the strange case of the inverted chart

We know that late detection is sometimes much more expensive, simply because, depending on the domain, some bugs can do harm (letting bad data into the database, making your customers' credit card numbers accessible to the Russian Mafia, delivering a satellite to the bottom of the Atlantic instead of into orbit) that costs far more than fixing the code itself. So it's clear that, on average, cost does increase with time of detection. But are those high-profile disasters part of a smooth curve, or is it a step function, where the cost of fixing the code typically doesn't increase very much, but once bugs slip past final QA all the way into production, there is suddenly the opportunity for expensive harm to be done?

In my experience, the truth is closer to the latter than the former, so that instead of constantly pushing for everything to be done as early as possible, we would be better off focusing our efforts on e.g. better automatic verification to make sure potentially costly bugs are caught no later than final QA.

But obviously there is no easy way to measure this, particularly since the profile varies greatly across domains.
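To make "better automatic verification" slightly more concrete, here is a minimal sketch of the kind of cheap invariant-checking gate that could run in CI, i.e. no later than final QA. The normalize_card_number routine and the invariants checked against it are hypothetical stand-ins, not anything from the original discussion.

```python
import random
import string

def normalize_card_number(raw: str) -> str:
    """Hypothetical production routine: strip separators from a card number."""
    return "".join(ch for ch in raw if ch.isdigit())

def check_invariants(raw: str) -> None:
    """Invariants that, if violated, would otherwise surface only as expensive
    production incidents (bad data already in the database)."""
    normalized = normalize_card_number(raw)
    assert normalized == "" or normalized.isdigit(), "non-digits leaked through"
    assert len(normalized) <= len(raw), "normalization must not invent digits"

def fuzz(iterations: int = 10_000) -> None:
    """Cheap randomized check, intended to run automatically before release."""
    alphabet = string.digits + " -" + string.ascii_letters
    for _ in range(iterations):
        raw = "".join(random.choice(alphabet)
                      for _ in range(random.randint(0, 24)))
        check_invariants(raw)

if __name__ == "__main__":
    fuzz()
    print("invariants held on all generated inputs")
```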

Diseased disciplines: the strange case of the inverted chart

Because you couldn't. In the ancestral environment, there weren't any scientific journals where you could look up the original research. The only sources of knowledge were what you personally saw and what somebody told you. In the latter case, the informant could be bullshitting, but saying so might make enemies, so the optimal strategy would be to profess belief in what people told you unless they were already declared enemies, but to base your actions primarily on your own experience, which is roughly what people actually do.

Open Thread, February 1-14, 2012

That's not many worlds, that's quantum immortality. It's true that the latter depends on the former (or would if there weren't other big-world theories, cf. Tegmark), but one can subscribe to the former and still think the latter is just a form of confusion.

I've had it with those dark rumours about our culture rigorously suppressing opinions

True. The usual reply to that is "we need to reward the creators of information the same way we reward the creators of physical objects," and that was the position I had accepted until recently, when I realized that we certainly do need to reward the creators of information, but not the same way - not by the same kind of mechanism - that we reward the creators of physical objects. (Probably not by coincidence, I grew up during the time of shrink-wrapped software, and only re-examined my position on this matter after that time had passed.)

I've had it with those dark rumours about our culture rigorously suppressing opinions

To take my own field as an example, as one author remarked, "software is a service industry under the persistent delusion that it is a manufacturing industry." In truth, most software has always been paid for by people who had reasons other than the projected sale of licenses to want it to exist, but this was obscured for a couple of decades by shrinkwrap software, shipped on floppy disks or CDs, being the only part of the industry visible to the typical nonspecialist. But the age of shrinkwrap software is passing - outside entertainment, how often does the typical customer buy a program these days? - yet the industry is doing fine. We just don't need copyright law the way we thought we did.

I've had it with those dark rumours about our culture rigorously suppressing opinions

We can't. We can only sensibly define them in the physical universe, which is based on matter, with its limitations of "only in one place at a time" and "wears out with use" that make exclusive ownership necessary in the first place. If we ever find a way to transcend the limits of matter, we can happily discard the notion of property altogether.
