Agree - talking things out, making everything as common knowledge as possible, and having people who strongly value the harder path (and who have resources) commit some of those resources to fencing off the worst failure cases all seem to be necessary prerequisites to stag hunting.

I have definitely gained from having a set bedtime, even while I was recovering from another attempt at polyphasic sleep. Having a time (01:00) when I am *in bed*, every day, makes life easier to plan and the sleep itself better. (I'm fairly sure I could support this with studies.) It's been a good policy to adopt.

I've definitely benefited from having control over how much friction something has; adding a cost to visiting reddit made it less likely I'd waste time there, all out of proportion to how large the cost actually was.

What do you think of the idea of an RPG-type game where the quests are designed to trigger biases in people, and where winning requires clear thinking? I'd be a big fan of a game that made you read quests and think about them, and moved away from the 'follow the arrow, kill everything en route' model that many have today. Of course, it still needs to be fun to entice people to play it. Functional edutainment seems to be a difficult balance to strike.

Can I vote Discordianism? Knowing how silly it all is is a property of the text itself. Isn't that helpful?

Well, I'm not personally capable of building AIs, and I'm not as deeply versed as I'm sure many people here are. But I see an implementation of Bayes' theorem as a tool for finding truth in the mind of a human, an AI, or whatever sort of person you care to conceive of, whereas the mind behind it is an agent with a quality we might call directedness, or intentionality, or simply an interest in going out and poking the universe with a stick where it doesn't make sense. Bayes is in itself already math, easy to put into code, but we don't yet understand internally directed behavior well enough to model it.

I think even a perfect implementation of Bayes would not, in and of itself, be an AI. By itself, the math has nothing to work on and no direction to work in. Agency is hard to build, I think.

As always, of course, I could be wrong.

Perhaps I missed yours? Rationality certainly requires the ability to challenge social pressure. Are you questioning whether this procedure distinguishes rationalists from non-rationalists? If so: on its own, I don't argue that it would, just that it would probably be one member of a larger set of tests.

Nobody else seems to have added this response, so I will. We don't know that this moment, in the ritual room, is the only test they undergo. Perhaps one's ability to pass a written exam is part of the public procedures. Perhaps a great open exam, running near continuously, where anyone who wants to can sit it, is the first stage - and Brennan has had months in a cloister-like environment in the public face of the conspiracy, among the people who can study the sciences but not generate new, true science.