Liam Goddard

I am a person. Who exists. Not sure what else to add to my profile... um, maybe the fact that I ate an orange 6 hours ago... This profile is over.

Liam Goddard's Comments

Multiple Moralities

All of this makes a lot of sense when it comes to rules for society. And I understand that certain forms of government, or certain laws, would be effective for almost any utility function. What I’m questioning isn’t how you achieve your goals, it’s where goals themselves come from, your terminal values.

[Question] When Should Unlikely Events Be Questioned?

The quote comes from HPMOR, when Harry gets his wand and learns that it shares a core with Voldemort's.

Harry Potter and the Methods of Rationality discussion thread, February 2015, chapters 105-107

Due to the magical resonance, Quirrell can't cast spells on Harry, so the test wasn't faked.

A rational unfalsifiable belief

I definitely think that “Adam did not kill him” would be an accurate and rational belief, but there would still technically be some evidence that could convince her otherwise (such as a time machine), so the probability she assigns to that belief should not be quite 100%, though very close. Another important part of unfalsifiability is that she COULD have been convinced otherwise: had there been different evidence, she would have concluded that Adam was the killer. The most important thing here is that beliefs are probabilistic. It is quite possible for a perfect Bayesian to believe something while thinking it possible that they could encounter evidence that persuades them otherwise. Eve should hold a high, but not 100%, probability that Adam is innocent. I don’t see how any of this could apply to theism, though, since theism isn’t founded on much evidence.
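To make that concrete with made-up numbers (the even prior and the 1000:1 likelihood ratio are purely illustrative, not taken from the scenario): suppose Eve starts out unsure and then sees evidence E, such as an airtight alibi, that is a thousand times more likely if Adam is innocent than if he is guilty. Bayes' rule in odds form gives

$$
\frac{P(\text{innocent}\mid E)}{P(\text{guilty}\mid E)}
= \frac{P(\text{innocent})}{P(\text{guilty})} \cdot \frac{P(E\mid \text{innocent})}{P(E\mid \text{guilty})}
= \frac{0.5}{0.5} \cdot \frac{1000}{1} = 1000,
$$

so $P(\text{innocent}\mid E) = 1000/1001 \approx 0.999$: very high, yet still short of certainty, and a sufficiently strong piece of contrary evidence could push it back down.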

Honoring Petrov Day on LessWrong, in 2019

Do you really think he would care enough about a three-hour LessWrong shutdown to write the chapter?

Honoring Petrov Day on LessWrong, in 2019

The button appeared on my screen, but I'm new to Less Wrong, definitely do not have 1000+ karma, and didn't get an email... did everyone see it for some reason?

The AI in a box boxes you

Can't you just pull the plug before it can run any simulations?

Open & Welcome Thread - September 2019

I saw two more posts that I'd already read, even though I have "Unread Only" checked. I think there's some problem with the unread filter.

Simulate and Defer To More Rational Selves

Several years ago, back before I deconverted and learned about Less Wrong, I sometimes used this without trying: I would pray to "God," and "God" would usually make better decisions than my intuitive judgements, not because of a higher power (it would be impossible to simulate a being more intelligent than myself) but because I was really simulating myself, minus several cognitive restraints. After I left religion I stopped, because I basically thought praying was beneath me, although now that I've read this post I will start doing it again. Recently, though, I've been doing something similar: I simulate a being who has never encountered our universe before and explain to it various aspects of ordinary life, finding what does and doesn't make sense. There have been some interesting reactions, such as: "But why would they believe without evidence?" "They insist that they can rely on faith-" "Don't use the f-word!" or "You're telling me that people decide who they are going to spend their entire lives with based on WHO THEY WANT TO HAVE SEX WITH?" It can be pretty helpful.
