"God of the EA community"? The majority of my city's EA community doesn't even know who Yudkowsky is, and of the few who do most have ambivalent opinions of him.
The primary goal of this document is to articulate my personal moral philosophy. I use the Mohism branding because it has strong parallels with said moral philosophy, but otherwise I am reinventing it from scratch.

I do think that a lot of the core tenets are widely (if subconsciously) held. As for the ones that aren't, I personally think they should be. But, like any good Neo-Mohist, I'm willing to be convinced otherwise. ;)

The phrasing of this as a philosophy for others to adopt is mostly an aesthetic decision, a reframing to help me look at it more critically.
Thanks for the feedback! Basing it on Mohism is more of an aesthetic decision than anything; if classical Mohism has an issue, then Neo-Mohism should set out to solve it. :)

I think there's a difference between "no fixed standards" and "the ability to update standards in light of new evidence". Neo-Mohism is definitely a "strong opinions, weakly held" kind of thing. The standards it sets forth are only to be overturned by failing a test, and until then should be treated as the best answer so far.
If you would like to attend a Guild mixer to meet the Council and some of the students, come join us Saturday! We expect to do this on a monthly or quarterly basis.

https://bit.ly/3lOco5O
Never mind, I found your Calendly. Got us scheduled for Friday. :)
I would like to get set up! :)
Please note that we have added a Google Form for registration, to make sure we have enough food.
No, the best way to convince me is to show me data. Evidence I can actually update on, instead of self-reporting on results that may be poisoned by motivated reasoning, or any number of other biases. Data I can show to people who know what they are talking about, that they will take seriously.
I see Bayesian Rationality as a methodology as much as it is a calculation.
It's being aware of our own prior beliefs and the confidence intervals around those beliefs; keeping those priors as close to the base rates as possible; being cognizant of how our biases can influence our perception of all this; trying to mitigate the effects of those biases; and updating based on the strength of evidence.
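The updating step in that methodology can be sketched as a plain Bayes-rule calculation. This is a minimal illustration with made-up numbers, not a claim about any particular belief:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H|E) via Bayes' rule."""
    # Total probability of seeing the evidence at all.
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start from a base rate as the prior, then update on one piece of evidence.
prior = 0.01  # hypothetical base rate for hypothesis H
posterior = bayes_update(prior, p_e_given_h=0.9, p_e_given_not_h=0.1)
print(round(posterior, 4))  # 0.0833
```

Note that even fairly strong evidence (9:1 likelihood ratio) moves a 1% prior to only about 8%, which is why starting from the base rate matters.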
I'm trying to get better at math so I can do better calculations. It's a major flaw in my technique, one I acknowledge and am trying to correct.
But as you noted earlier, non
A strong correlation between adopting the virtues and established methods of rationality and an increased quality of life. But yeah, it's more handwavey.
I don't even know what calculations could be made. That's sorta why I'm here.
Yes, but they could all be explained by the fact I just sat down and bothered to think about the problem, which wouldn't exactly be an amazing endorsement of rationality as a whole.
I also don't look at rationality as merely a set of tools; it's an entire worldview that emphasizes curiosity and a desire to know the truth. If it does improve lives, it might very well simply be making our thinking more robust and streamlined. If so, I wouldn't know how to falsify or quantify that.
I ask the question this way to hopefully avoid stepping on toes. I'm fully open to the idea that the answer is "we have none".
Also, I am primarily addressing the people who are making a claim. I am not necessarily making a claim myself.
I'm convinced mostly due to its effects on my own life, as stated in the opening paragraph. But I'm unsure of how to test and demonstrate that claim. My question is for my benefit as well as others.
I just realized that I work tomorrow, so we are not doing the hike. Instead, we are doing our usual 6pm meetup at the Johnson County Central Resource Library. We will do our hike next week (June 25th, 9am, Shawnee Lake Dog Park).
If I find that it does have actual impact on the podcast's effectiveness, then I absolutely will seriously consider changing it. Your criticism has updated me marginally in that direction, but it's not quite enough for me to act on it, particularly since you're the only person to mention it.
Thank you for your feedback!
I'm sure that there are Street Epistemologists who are guilty of this, but that's literally the opposite of what I encourage or practice.
At its core, SE is merely coaching people in asking the Fundamental Question of Rationality.
As an SE-er, it's my way of Raising the Sanity Waterline.
It's excellent at circumventing the Backfire Effect.
There are as many motivations for SE as there are practitioners.
The Bayesian Conspiracy needs to be updated with Jess as a new host. :)
I didn't even know about this resource. Thanks!
I will be sure to include a transcript in all future episode descriptions/show notes.
Done! Link to the transcript has been posted in the description, and also here: https://docs.google.com/document/d/1MjTM4revF1upDvO00y0v8jF8G6HUbcABtFDxVYiLyPc/edit?usp=sharing
Is there a different venue/format for the notes you had in mind?
I'll do that tonight!
Due to the nature of assembling a bug list, there might not be a whole lot to discuss and do. If we finish significantly early, we might simply move on to Day 2, 'Yoda Timers'.
The Kansas City Rationalists are putting together a dojo, for the purpose of improving our cognitive abilities, happiness, and efficiency.
For content, we will be using the 'Hammertime' sequence. Attendees are expected to read the introduction ('Hammers and Nails') and Day 1 ('Bug Hunt'), as well as put together their bug list. The meeting will consist of meta-discussion about the content, and discussion about our experience putting together our bug lists. Bonus points if you are willing to share the bugs you found!
We will be meeting weekly, at the same time and location.
It works perfectly. Thanks again!
That looks perfect! I'll test it out later.
My IRL rationality group is preparing to test that sequence. It looks promising, although we do have some quibbles with it. If we successfully finish testing, we'll publish the details.
This may seem stupid, but I didn't even think about doing odds calibration on such a small scale. That's a great idea. Making a pinned Google Keep note now.
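Small-scale odds calibration can be as simple as logging predictions with stated probabilities and scoring them once the outcomes are known. A hypothetical sketch, with invented entries standing in for a real log:

```python
# Hypothetical log of (stated probability, actual outcome) pairs.
predictions = [(0.9, True), (0.7, False), (0.6, True), (0.8, True)]

# Brier score: mean squared error between stated probability and outcome.
# Lower is better; always guessing 50% scores 0.25.
brier = sum((p - int(o)) ** 2 for p, o in predictions) / len(predictions)
print(round(brier, 3))  # 0.175
```

Even a handful of logged predictions like this is enough to start noticing systematic over- or underconfidence.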
I was thinking "Tsuyoku Naritai". 😊
If I understand you correctly, I wholeheartedly agree that "Less Wrong" is not just referring to a dichotomy of "what is true" and "what is false" (although that is part of it, a la 'Map and Territory').
There's a reason rationality is often called "systematized winning"; while the goals you are pursuing are entirely subjective, rationality can help you decide what is optimal in your pursuit of goal Y.
Thank you for the encouragement! I hope I didn't sound like I was fishing for that. I just wanted to emphasize my desire to seek any other more optimal paths.
Good idea! Get comfortable with my own journey before using someone else's. Makes sense.
Yes! The Facebook event page links to our Facebook group.
Just listened to that about 10 minutes ago. Good sequence.
When you say "assumption", I hear "a thing that is accepted to be true".
What changes in cognition when something is accepted as true?
Please have a non-leather hardcover option for us vegans. :)
Please no real leather for us vegans. :)
No hardcover option?
Thanks for the feedback!
The most indiscreet example I have in mind would be to include it in the channel name; a clunky example would be 'Less Wrong Street Epistemology'.
I had no intention of using the official logo, merely to make direct references.
Hi, Motasaurus. I certainly hope you stick around! Don't let our disagreements drive you off.
However, on that note, I'm afraid I would have to disagree. While I think you can have "better than average" epistemology and still be a Christian, perhaps even be in the top 25%, I don't believe you can aspire to be a perfect Bayesian and still be a Christian.
I would respectfully point out that the Apostle John is hardly a neutral spectator in determining whether one can be both Christian and Rational. Additionally, he certainly didn't have access to an
Also, how would one go about acquiring these CFAR techniques? Is attending a workshop mandatory? I don't quite have the discretionary funds for that. :P
One of the hallmarks of a typical dojo is that the Sensei will demonstrate techniques, and show how they are supposed to look once you have mastered them.
Is it possible that this is an optional feature, if only for a rationality dojo?
That's very kind of you, thank you. It means a lot.
Yes, that is the correct sequence of events. I was raised in a Christian household, but the first belief I truly held for myself was atheism.
I am no longer a Christian or theist in any way.