LESSWRONG
JStewart

Comments
Some recent evidence against the Big Bang
JStewart · 11y · 180

Is this not kosher? The minimum karma requirement seems like an anti-spam and anti-troll measure, with the unfortunate collateral damage of temporarily gating out some potentially good content. This post seems to me clearly to be good content, and my suggestion to MazeHatter in the open thread that it deserved its own thread was upvoted.

If that doesn't justify skirting the rule, I can remove the post.

Open thread Jan. 5-11, 2015
JStewart · 11y · -10

I've posted it here.

Open thread Jan. 5-11, 2015
JStewart · 11y · 90

I think you should post this as its own thread in Discussion.

Open thread, Nov. 24 - Nov. 30, 2014
JStewart · 11y · 90

This has been proposed before, and on LW is usually referred to as "Oracle AI". There's an entry for it on the LessWrong wiki, including some interesting links to various discussions of the idea. Eliezer has addressed it as well.

See also Tool AI, from the discussions between Holden Karnofsky and LW.

2014 Less Wrong Census/Survey
JStewart · 11y · 360

Count me surveyed.

Rationality Quotes May 2013
JStewart · 12y · 00

Interesting. I wonder to what extent this corrects for people's risk-aversion. Success is evidence against the riskiness of the action.

Circular Preferences Don't Lead To Getting Money Pumped
JStewart · 13y · 10

Having circular preferences is incoherent, and being vulnerable to a money pump is a consequence of that.

"I knew that if I had 0.95Y I would trade it for (0.95^2)Z, which I would trade for (0.95^3)X, then actually I'd be trading 1X for (0.95^3)X, which I'm obviously not going to do."

This means that you won't, in fact, trade your X for .95Y. That in turn means that you do not actually value X at .9Y, and so the initially stated exchange rates are meaningless (or rather, they don't reflect your true preferences).

Your strategy requires you to refuse all trades at exchange rates below the money-pumpable threshold, and you'll end up only making trades at exchange rates that are non-circular.
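The round-trip arithmetic here can be checked with a few lines of code. This is a minimal sketch: the rates and the `round_trip_factor` helper are illustrative, not part of the original discussion.

```python
from math import prod

def round_trip_factor(rates):
    """Multiply exchange rates around a cycle of trades.

    rates: per-trade exchange rates, e.g. [0.95, 0.95, 0.95] means
    1 X buys 0.95 Y, 1 Y buys 0.95 Z, 1 Z buys 0.95 X.

    A product above 1 means the cycle is exploitable (a money pump);
    a product below 1 means completing the round trip is a strict
    loss, so a coherent agent simply refuses to finish it.
    """
    return prod(rates)

# The quoted example: three trades at 0.95 each turn 1 X into
# 0.95^3 X, a strict loss, so the final trade is refused.
loss_cycle = round_trip_factor([0.95, 0.95, 0.95])

# A hypothetical pumpable cycle: rates whose product exceeds 1.
pump_cycle = round_trip_factor([1.1, 1.1, 1.1])
```

A cycle of rates is money-pumpable exactly when the product around the loop exceeds 1; refusing trades below that threshold is what makes the surviving exchange rates non-circular.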

The noncentral fallacy - the worst argument in the world?
JStewart · 13y · 100

Judging from the comments this is receiving on Hacker News, this post is a mindkiller. HN is an audience more friendly to LW ideas than most, so this is a bad sign. I liked it, but unfortunately it's probably unsuitable for general consumption.

I know we've debated the "no politics" norm on LW many times, but I think a distinction should be made based on the target audience of a post. In posts meant to contribute to "raising the sanity waterline", we're shooting ourselves in the foot by invoking politics.

A Primer On Risks From AI
JStewart · 13y · 40

I like the combination of conciseness and thoroughness you've achieved with this.

There are a couple of specific parts I'll quibble about:

"Therefore the next logical step is to use science to figure out how to replace humans by a better version of themselves, artificial general intelligence."

"The Automation of Science" section seems weaker to me than the others, perhaps even superfluous. I think the line I've quoted is the crux of the problem; I highly doubt that the development of AGI will be driven by any such motivations.

"Will we be able to build an artificial general intelligence? Yes, sooner or later."

I assign a high probability to the proposition that we will be able to build AGI, but I think a straight "yes" is too strong here.

A Primer On Risks From AI
JStewart · 13y · 20

Out of curiosity, what are your current thoughts on the arguments you've laid out here?
