Typically, as long as the expense is deemed prudent by regulators, utilities are permitted to 'rate base' the expense and earn a return on investment. If PG&E thinks it's politically possible to increase expenses by $20-30B because there's a good narrative to offset complaints about rising utility prices, that's the selfish thing to do. The time investor-owned utilities require strict scrutiny is when they jump on the bandwagon of a politically popular spending proposal (wise infrastructure investment comes from experts getting the politicians on board, not politicians getting the experts on board).

Liberals see the free market as a kind of optimizer run amok, a dangerous superintelligence with simple non-human values that must be checked and constrained by the government - the friendly SI. Conservatives just reverse the narrative roles.

I like this analogy. So basically: how do you want to balance the power between your two overlords, one much, much smarter than you but with non-human values, and the other much dumber than you but with (mostly) human values?

Thanks. I'd love to share this material with people but the format makes it hard as many people seem to have an aversion to a collection of blog posts. I look forward to buying the book so I can loan it to people.


Completed the survey. Thanks for doing this, the results are always interesting.


Does anyone know what happened to the version that was supposed to be reviewed/edited down by a professional so it could be publishable length? There's so much good stuff there I'd love to be able to send to friends and family, but 500k words' worth of blog posts is much harder to send someone than a nicely published 200k-word version.


Yes, but as soon as you think of it, it becomes a known known :)


The "known knowns" quote got made fun of a lot, but I think it's really good out of context:

"There are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don't know. But there are also unknown unknowns – there are things we do not know we don't know."

Also, every time I think of that I try to picture the elusive category of "unknown knowns" but I can't ever think of an example.


I deconverted in large part because of Less Wrong. Looking back at it now, I hadn't had a strong belief since I was 18 (by which I mean, if you asked most believers what the p(god) is they'd say 100% whereas I might have said 90%) but that might just be my mind going back and fixing memories so present me thinks better of past me.

I'd be happy to do an AMA (I went from Mormon to Atheist) but a couple of the main things that convinced me were:

  • Seeing that other apologists could make up similar arguments to make just about anything look true (for example, apologists for other religions, homeopathy, anti-vaccine claims, etc.)

  • Seeing the evidence for evolution and, specifically, how new information supports true things. That showed me that for true things, new information doesn't need to be explained away but actually supports the hypothesis. For example, with evolution, discoveries such as carbon dating, the fossil record, and DNA all support it. Those same discoveries have to be explained away via apologetics for religions.

  • Bayesian thinking. I have an econ background, so I kind of did this informally, but the emphasis from Less Wrong that once you see evidence against, you need to actively lower your probability a bit really helped me. Before, I'd done what EY pointed out: take all of your evidence for and stack it against this one piece of evidence against, then when the next piece of evidence against comes along, take all your evidence for and stack it against that one, and so on.

  • The value that I want to believe what is true. I had this before but wasn't as proactive about it.

  • Before, I felt like my belief system was logical and fit the evidence, and if someone didn't believe, it was because they hadn't looked at the evidence and fairly considered it. Seeing people look at the evidence and then cogently explain why they still didn't believe gave me an "I notice I'm confused" moment.

  • etc.
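The odds form of Bayes' theorem makes the Bayesian-thinking bullet concrete: each piece of evidence multiplies the current odds by its likelihood ratio exactly once, so every piece of evidence against must pull the posterior down rather than being weighed against the whole re-counted stack of evidence for. A minimal sketch with made-up numbers (the 90% prior and the likelihood ratios are assumptions for illustration, not from the comment):

```python
def update_odds(prior_prob, likelihood_ratios):
    """Apply a sequence of likelihood ratios to a prior probability.

    Each ratio counts once: lr > 1 favors the hypothesis, lr < 1 counts
    against it, and there is no way to 're-stack' old evidence against
    a new piece of counter-evidence.
    """
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Start at 90% belief; three pieces of evidence against (each LR = 0.5)
# drag the posterior down to about 53%.
posterior = update_odds(0.90, [0.5, 0.5, 0.5])
```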


My takeaway from this is that you need to "shut up and multiply" every single time. Looking at the math skills study, the thought was that people glance at the raw numbers (instead of looking at the ratios) and stop there if the numbers fit their ideological beliefs. If the numbers conflict with their beliefs, though, they spend a little longer and figure out they need to look at the ratio. So if we train ourselves to always "shut up and multiply," hopefully some of this effect will go away. Maybe a follow-up study could check whether people who actually do the math still get it wrong?
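The raw-numbers-versus-ratios trap can be shown with a toy 2x2 table in the spirit of that study's design (these specific counts are an assumption for illustration, not quoted from the study):

```python
# Outcomes for people who did and did not use a hypothetical treatment.
improved_with, worsened_with = 223, 75        # used the treatment
improved_without, worsened_without = 107, 21  # did not use the treatment

# Glancing at raw counts, the treatment looks good: more people improved.
raw_counts_favor_treatment = improved_with > improved_without

# Shutting up and multiplying (well, dividing) tells the opposite story:
rate_with = improved_with / (improved_with + worsened_with)              # ~0.75
rate_without = improved_without / (improved_without + worsened_without)  # ~0.84
treatment_actually_worse = rate_with < rate_without
```

With these numbers both booleans come out True: the group with the bigger raw "improved" count has the lower improvement rate, which is exactly the step people skip when the headline number already fits their beliefs.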


I'm already booked that day but if it's a weekly thing I wouldn't mind stopping by sometime.
