Richard_Kennaway

I agree long-term reputation is valuable, but the hard question is how valuable. It isn't priceless, though yes, it's possibly underrated by EAs. But when should we use it? What actions should it stop?

You can't buy reputation, as the OP pointed out, and you can't spend it either, e.g. by lending one's name to dodgy projects, or asking people to take a lie on trust. You use reputation by having it, and the OP described things flowing towards those of good reputation. The actions that maintaining your reputation should stop are those that would damage it. The question is rather: what qualities do EAs want themselves and the EA movement to have a reputation for?

Looking at the first fifteen posts, upvotes are correlated with how non- or anti-religious they are. The only one with negative karma is called "Religion is Good, Actually", and the religion there is Buddhism, the most un-religious religion of all.

I seem to be back to Emotivism when it comes to meta-ethics and I'm wondering if there's a way to be convinced otherwise.

One way — I do not here intend to speak for or against it — is to observe that there is a universal natural law written on our hearts, that it is impossible to not know (although it is possible to hide one's knowledge from oneself).

Here is J. Budziszewski (a Catholic, theologian, and scholar of Aquinas) on the subject:

However rude it may be these days to say so, there are some moral truths that we all really know—truths which a normal human being is unable not to know. They are a universal possession, an emblem of rational mind, an heirloom of the family of man. That doesn't mean that we know them with unfailing, perfect clarity, or that we have reasoned out their remotest implications: we don't, and we haven't. Nor does it mean that we never pretend not to know them even though we do, or that we never lose our nerve when told they aren't true: we do, and we do. It doesn't even mean that we are born knowing them, that we never get mixed up about them, or that we assent to them just as readily whether they are taught to us or not. That can't even be said of "two plus two is four". Yet our common moral knowledge is as real as arithmetic, and probably just as plain. Paradoxically, maddeningly, we appeal to it even to justify wrongdoing; rationalization is the homage paid by sin to guilty knowledge.

...

Interestingly, a part of the common moral sense is that there is a common moral sense. It is not only a recurring theme in philosophy, but a tradition in most cultures and a presupposition of both Jewish and Christian scriptures. Philosophers call this common sense the "natural" law to convey the idea that it is somehow rooted in how things really are. Chinese wisdom traditions call it the Tao; Indian, the dharma or rita. The Talmud says it was given to the "sons" or descendants of Noah, which means all of us. Abraham was so sure of it that he dared to debate with God. Saint Paul said that when Gentiles do by nature what the law requires, they show that its works are "written on their hearts".

C.S. Lewis has written the same, calling the things we can't not know the Tao.

ETA: an old comment of mine going into more detail on Lewis's Tao.

You cannot avoid starting somewhere, but that doesn't mean you can start anywhere and reality will never tell you otherwise.

"Life force", God sustaining his creation, and animism were once common beliefs, but the more we looked at the world, the less work they did, and they faded from the scene. Not an unexplained leap, but an explained and gradual change: wherever the searchlight of inquiry reached, we never saw them. God in the ever-diminishing gaps, "Sire, I had no need of that hypothesis".

The granting of such rights will be decided by people. It will happen when it is in the interests of the people having the power to make those decisions.

The advice I've seen elsewhere is that machines are all very well for specialised physiotherapy needs, but outside of that, always to use free weights or bodyweight exercises. The machines fix all the degrees of freedom but one, so you don't engage all the muscles needed to control the movement, so those muscles and the control they give you get no training. To avoid injury, don't use weights heavier than you can control. Don't expect to go from bench-pressing 60 kg on the machine to a 60 kg barbell, or from that to a pair of 30 kg dumbbells. Thus have I heard, but I am not an expert.

I'm unclear what the thrust of this article is intended to be. Are you predicting that such things will happen, or recommending that readers concerned with AI doom should encourage and fan the flames of such a movement?

The incredible ... seems like it might not be credible

It wouldn't be, would it?

I've seen various accounts (some on LessWrong) of people converting to a religion (usually Christianity) or staying within it despite being long in the rationalsphere. Common to all is that none of these people, including yourself, arrived there by the application of reason to evidence. What led you to "add that this low-level God has thought-like/human-like/emotion-like ideas"? In the article it comes as an unexplained leap.

I’m not sure about your distinction between transferrable and non-transferrable skills, or for that matter how much this comment affects the thrust of your post. But plenty of things can be taught, even though “the rational calculations and algorithms needed to perform it” are not “fully understood and documented”. Physical skills, for example: yoga, driving a car, sports of all sorts, the skills of drawing and painting. These things are learned not only by doing them, but by instruction and feedback from a teacher or coach.

ETA: Now, rationality skills, those seem to be a lot less transferrable than any of those I mentioned.