jacoblyles


Our Phyg Is Not Exclusive Enough

It's true that lots of utilitarianisms have corner cases where they support actions that would normally be considered awful. But most of these involve highly hypothetical scenarios that seldom occur, such as convicting an innocent man to please a mob.

The problem with LW/SIAI is that the moral monstrosities they support are much more actionable. Today, there are dozens of companies working on AI research. LW/SIAI believes that their work will be of infinite negative utility if they are successful before Eliezer invents FAI theory and convinces them that he's not a crackpot. The fate of not just human civilization, but all of galactic civilization, is at stake.

So if any of them looks likely to succeed, for instance by scheduling a press conference to announce a breakthrough, it's straightforward to see what SI/LW thinks you should do about that. Actually, given the utilities involved, a more proactive strategy may be justified, if you know what I mean.

I'm pretty sure this is going to evolve into an evil terrorist organization, and it would have done so already if the population weren't so nerdy and pacifistic to begin with.

And yes, there are occasional cautionary principles voiced on LW. But they are contradicted and overwhelmed by "shut up and calculate," which says to trust your arithmetic utilitarian calculus and not your ugh fields.

Our Phyg Is Not Exclusive Enough

Oh sure, there are plenty of other religions as dangerous as the SIAI. It's just strange to see one growing here among highly intelligent people who spend a ton of time discussing the flaws in human reasoning that lead to exactly this kind of behavior.

However, there are ideologies that don't contain shards of infinite utility, or that contain a precautionary principle that guards against shards of infinite utility that crop up. They'll say things like "don't trust your reasoning if it leads you to do awful things" (again, compare that to "shut up and calculate"). For example, political conservatism is based on a strong precautionary principle. It was developed in response to the horrors wrought by the French Revolution.

One of the big black marks on SIAI/LW is the seldom-discussed justification for murder and terrorism that is a straightforward result of extrapolating the locally accepted morality.

Our Phyg Is Not Exclusive Enough

Never mind the fact that LW actually believes that uFAI has infinitely negative utility and that FAI has infinitely positive utility (see the arguments for why SIAI is the optimal charity). That people conclude that acts most people would consider immoral are justified by this reasoning, well, I don't know where they got that from. Certainly not these pages.

Ordinarily, I would count on people's unwillingness to act on any belief they hold that is too far outside the social norm. But that kind of thinking is irrational, and irrational restraint has a bad rep here ("shut up and calculate!").

LW scares me. It's straightforward to take the reasoning of LW and conclude that terrorism and murder are justified.

Cult impressions of Less Wrong/Singularity Institute

We should try to pick up "moreright.com" from whoever owns it. It's domain-parked at the moment.

Muehlhauser-Wang Dialogue

The principles espoused by the majority on this site can be used to justify some very, very bad actions.

1) The probability of someone inventing AI is high

2) The probability of someone inventing unfriendly AI if they are not associated with SIAI is high

3) The utility of inventing unfriendly AI is negative MAXINT

4) "Shut up and calculate" - trust the math and not your gut if your utility calculations tell you to do something that feels awful.

It's not hard to figure out that Less Wrong's moral code supports some very unsavory actions.
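The arithmetic driving premises 1–4 can be sketched in a few lines (a hypothetical illustration with made-up numbers, not anything taken from the site): once one outcome is assigned an astronomically large negative utility, any nonzero probability of it swamps every ordinary moral cost in the expected-value comparison.

```python
# Hypothetical sketch: why an astronomically negative utility term
# dominates an expected-utility comparison. All numbers are invented.

def expected_utility(outcomes):
    """Sum of probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Option A: do nothing. Small chance of the catastrophic outcome.
do_nothing = [(0.999, 0.0), (0.001, -1e18)]

# Option B: a drastic, normally-immoral action that slightly reduces
# the catastrophe's probability at an ordinary moral cost of -1000.
drastic_action = [(0.9995, -1000.0), (0.0005, -1e18)]

# The -1e18 term swamps the ordinary cost: Option B "wins" anyway.
assert expected_utility(drastic_action) > expected_utility(do_nothing)
```

This is the structural worry: with a large enough stake in the calculation, "shut up and calculate" will endorse the drastic option no matter how large its ordinary moral cost is.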

Who Wants To Start An Important Startup?

Fortunately, the United States has a strong evangelical Christian lobby that fights for and protects home schooling freedom.

Who Wants To Start An Important Startup?

...And you just blew your cover. :)

Nobody of any importance reads Less Wrong :)

What is moral foundation theory good for?

I'm pretty sure they are sourced from census data. I check the footnotes on websites like that.

Who Wants To Start An Important Startup?

Tagline: Coursera for high school

Mission: The economist Eric Hanushek has shown that if the USA could replace the worst 7% of K-12 teachers with merely average teachers, it would have the best education system in the world. What if we instead replaced the bottom 90% of teachers in every country with great instruction?

The Company: Online learning startups like Coursera and Udacity are in the process of showing how technology can scale great teaching to large numbers of university students (I've written about the mechanics of this elsewhere). Let's bring a similar model to high school.

The Company starts in the United States and ties into existing home school regulations with a self-driven web learning program that requires minimal parental involvement and results in a high school diploma. It cloaks itself as merely a tool to aid homeschool parents, similar to existing mail-order tutoring materials, hiding its radical mission to end high school as we know it.

The result is high-quality education for every student. Beyond quality, it gives students the schedule flexibility to pursue interests outside of high school. Many exceptional young people I know dodged traditional schooling early in life. This product gives everyone that opportunity.

By lowering the cost of home schooling, this product will enlarge the home school market and threaten traditional educrats while producing more exceptional minds.

With direct access to millions of students, the website will be able to monetize through one-on-one tutoring markets, college prep services, and other means.

Course material can be bootstrapped by constructing a curriculum out of free videos provided through sources like the Khan Academy. The value-add of the Company will be to tailor the curriculum to the home-school requirements of the particular state of the student.

My background: I co-founded a company that's had reasonable success. I'm not much of a Less Wrong fan; I find the community to be an intellectual monoculture, dogmatic and full of blind spots to flaws in the philosophy it preaches. BUT this is an idea that needs to happen, as it will provide much value to the world. Contact me at firstname lastname gmail if you have lots of money or can hack. Or hell, steal the idea and do it yourself. Just make it happen.
