Harry seems to have neglected the possibility that the Philosopher's Stone is a general-purpose transmutation device, thus explaining why it would be able to produce both gold and the elixir of life.

And since Fullmetal Alchemist was plagiarized from wizard lore, you'd think this would be a reasonably common hypothesis.

My answer to question 3: The introduction of driverless cars should be accelerated as much as possible. I think most people don't realize just how helpful these things will be. For starters, only a small fraction of those of us who used to own non-self-driving cars will need to own a self-driving car. Those who do own them will probably sign them up with Uber or something similar, renting them out as driverless taxis when they don't need them personally. This means that taking a driverless taxi everywhere will be much cheaper than owning a car.

Today, when you call a car on Uber, you pay for the driver's labor, a fraction of the car, a fraction of the maintenance, the fuel you use, and a small cut for Uber itself (which would shrink further in a hypothetical future with perfect competition). This adds up to quite a bit. What do you pay for by owning your own car? As much fuel and maintenance as you would use with Uber, but the entire cost of the car. Because of the cost of the driver, ownership winds up being cheaper at present. Take the driver out of the equation, and a fraction of the car plus all the fuel and maintenance winds up being a hell of a lot cheaper. Driverless Uber would be a bargain-basement option. And if the car is electric, with low maintenance and fuel costs but an equal-to-higher cost for the vehicle itself, the ratio of the driverless taxi's cost to the cost of owning your own car drops even lower. If the regulatory hurdles are all cleared, taking an electric driverless taxi in 2025 could be cheaper than just the gasoline you would burn traveling the same distance today.
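The arithmetic above can be made concrete with a back-of-envelope sketch. All of the figures below are purely illustrative assumptions of mine (vehicle costs, per-mile fuel and maintenance, driver wages, fleet utilization), not data from the comment; the point is only the structure of the comparison — a private car's fixed costs spread over few miles, a fleet vehicle's over many:

```python
def cost_per_mile(fixed_annual, miles_per_year, fuel, maintenance,
                  driver_labor=0.0, platform_cut=0.0):
    """Rider's cost per mile: amortized fixed costs plus variable costs.

    fixed_annual  - depreciation, insurance, parking, etc. per year
    miles_per_year - miles over which those fixed costs are spread
    fuel, maintenance, driver_labor, platform_cut - costs per mile
    """
    return fixed_annual / miles_per_year + fuel + maintenance \
        + driver_labor + platform_cut

# Owning your own car: fixed costs spread over a private owner's ~10k miles/yr.
owner = cost_per_mile(4000, 10_000, fuel=0.12, maintenance=0.08)

# Human-driven ride-hail: the vehicle is heavily utilized (~50k miles/yr),
# but you pay the driver's labor on top.
uber = cost_per_mile(4000, 50_000, fuel=0.12, maintenance=0.08,
                     driver_labor=0.90, platform_cut=0.10)

# Driverless taxi: same heavy utilization, no driver.
robotaxi = cost_per_mile(4000, 50_000, fuel=0.12, maintenance=0.08,
                         platform_cut=0.10)

print(f"owner: ${owner:.2f}/mi, uber: ${uber:.2f}/mi, "
      f"robotaxi: ${robotaxi:.2f}/mi")
```

Under these assumptions the robotaxi comes out cheapest and the human-driven taxi most expensive, matching the comment's reasoning: removing the driver flips the ranking between ride-hailing and ownership.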

And think of what the driverless taxi could do for traffic! The decrease in accidents will help some, but the real gains will come from sharing taxis. Imagine that 90% of cars on the road are driverless taxis, and you request a ride to point A. Chances are, a taxi already carrying someone else will soon pass by you on a route that goes near point A, or toward some point B that lies on the way to A. So the app you use to flag down taxis asks whether you want to share the taxi someone else is already using. If you and the other rider both agree, you each get a reduced fare. You can check how other riders have rated this person as a co-passenger, and the fraction of the fare for which you are responsible is inversely related to how highly others rate you. You both accept, and now one car is carrying the passengers that would otherwise have required two or three or four cars. (Or more; the economics of driverless taxis might encourage multi-row limousines to become commonplace.)
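The rating-dependent fare split described above could be implemented in many ways; here is one minimal sketch of the rule as stated — each rider's share weighted by the inverse of their co-passenger rating, so better-rated riders pay less. The function name and the sample ratings are hypothetical:

```python
def split_fare(total_fare, ratings):
    """Split a shared fare so each rider's share is inversely
    proportional to their co-passenger rating (higher rating = smaller share)."""
    weights = [1.0 / r for r in ratings]
    total_weight = sum(weights)
    return [total_fare * w / total_weight for w in weights]

# Two riders share a $12 fare; the first has the better rating.
shares = split_fare(12.0, [4.8, 3.2])
print(shares)  # the 4.8-rated rider pays $4.80, the 3.2-rated rider $7.20
```

Note that the shares always sum to the total fare, so the taxi's revenue is unaffected by how pleasant (or unpleasant) its passengers are.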

Now imagine what this does in rush hour. If surge pricing is allowed, prices track the current demand for transport. People who don't really need to be on the road at that time wait until it's cheaper, and those who do need to be on the road have a stronger incentive to share a taxi with other commuters. This means you might have a quarter as many cars on the road at rush hour carrying the same number of commuters. And the time spent commuting is no longer wasted, either: all that internet-browsing, book-reading, and movie-watching you used to do at home can now be done on the road. This lets us live much farther from city centers than we used to, on much cheaper land where crime is lower and a middle-class person can have 20 acres all to himself. Land in the suburbs and cities will be used more efficiently, since we no longer need all those stupid parking lots. Houses can be built more cheaply, too: we no longer need driveways or garages.

It won't just revolutionize short-distance travel, either. Since you can sleep in the car, a long-distance trip can be taken overnight with very little lost time. It'll be like shaving eight hours off of any trip you take, which means that driving will be more convenient than flying in many cases. Once you factor in the time it takes to get to and from an airport, it probably won't be any faster to fly unless you'd otherwise need to drive for 14 hours or so. (And driverless cars might safely drive at 120 mph or more.) When you do need to take a plane, connecting flights will be a thing of the past: you just have a car drive you overnight to an airport from which you can get a direct flight to your destination.

Parenting will also be made much easier by the driverless taxi. Children over the age of ten or so don't need constant supervision, but they do need a stay-at-home parent to take care of various chores that only an adult can do: buying groceries, ferrying kids to soccer practice, and so on. But the only reason many of these chores need an adult in the first place is that kids can't drive. They could, however, take driverless taxis. They can do the shopping themselves, and they can take a taxi wherever they need to go. This means the stay-at-home spouse can re-enter the workforce full time much earlier than would otherwise be possible, and children gain much more freedom of movement, no longer having to depend on the availability and goodwill of a parent to take them places.

The driverless car is going to deliver major fundamental improvements to civilization itself. Bringing it about as soon as possible should be our single highest political priority at the moment.

Musk knows Peter Thiel from their days at PayPal, and Thiel is MIRI's biggest patron (or was, last I heard)—so it's hardly surprising that Musk is familiar with the notion of X-risk from unfriendly AI.

Lying constantly about what you believe is all well and good if you have Professor Quirrell-like lying skills and your conscience doesn't bother you when you lie to protect yourself from others' hostility to your views. I myself lie effortlessly, and felt not a shred of guilt when, say, hiding my atheism to protect myself from the hostility of my very anti-atheist father (he's not a believer himself; he's just hostile to atheism for reasons that elude me).

Other people, however, are not so lucky. Some are obliged to publicly profess belief of some sort or face serious reprisals, and they also feel terrible when they lie. Defiance may not be feasible, so they must either use Dark Side Epistemology to convince themselves of whatever others demand they believe, or else be cursed with the wrenching pain of a guilty conscience.

If you've never found yourself in such a situation, lucky you. But realize that you have it easy.

If a first-world country suffers a calamity in which half its population dies, it'll lose nine-tenths of its economic output at least.

Too valuable in the current economy to measure in small quantities, sure. But in a postapocalyptic wasteland, the economy will have shrunk drastically while the available quantity of gold stays the same. Hence, gold is the new silver and silver is the new tin.

Should civilization collapse to the point where law enforcement and electronic banking no longer function, I suspect gold in small denominations would be more useful than cash. You should also keep acid handy, both to prove the authenticity of your own gold and to test that of others'.

Do you consider food, shelter, and clothing to be optional? You know those things cost money, right?

I use two spaces after every sentence, and I'm 23. It's not a personal quirk either, it was just normal formatting in the American public schools I attended. (By the way, anyone who points out that this very post uses single spaces after a full stop should know that LessWrong messes with formatting. I typed double spaces; it's just not displaying as written.)

Let me attempt to convince you that your resurrection from cryonic stasis has negative expected value, and that therefore it would be better for you not to have the information necessary to reconstruct your mind persist after the event colloquially known as "death," even if such preservation were absolutely free.

Most likely, your resurrection would require technology developed by AI. Since we're estimating the expected value of your resurrection, let's work on the assumption that AGI will be developed.

Friendly AI is strictly more difficult to develop than AI with values orthogonal to ours, or than outright malevolent AI: every safety constraint is an additional requirement that the unscrupulous can simply skip. Because FAI developers are at such an inherent disadvantage, AGI technology will be used most by those least concerned with its ethical ramifications. Most likely, this will result in the extinction of humanity. But it might not. In the cases where humanity survives but AGI-developed technology continues to be used by those little concerned with its ramifications, it would be best for you not to exist at all. Since those with moral scruples would be the most averse to wantonly duplicating, creating, or modifying minds, we can assume that those doing such things most often will be vicious psychopaths (or fools who might as well be), and that the suffering inflicted on those synthetic minds would greatly outweigh any increased happiness of biological humans. A world where a teenager in the year 2080 can take your brain scan remotely with his iPhone and download an app that lets him torture an em of you for a trillion subjective years every real second is a world in which you'd be best off not existing in any form. Or you could find yourself transformed into a slave em forced to perform menial mental labor until the heat death of the universe.

Likely? No. More likely than FAI taking off first, despite the massive advantage the unscrupulous enjoy in AGI development? I think so. Better to die long before that day comes. For that matter, have yourself cremated rather than decaying naturally, just in case.
