All of iarwain1's Comments + Replies

I really like Sean Carroll's The Big Picture as an intro to rationality and naturalism for the general public. It covers pretty much all the topics in RfAItZ, along with several others (esp. physics stuff). It's shorter and a lot less technical than RfAItZ, but it's readable, and I think it does a good job of laying out the basic perspectives.

Try 80,000 Hours' guide, especially here.

In our world, classical mechanics (Newton + Maxwell and their logical implications) holds for most everyday experiences at slow speeds (relative to the speed of light) and at scales larger than the atomic realm.*

Question: Is this necessarily true for every possible world that matches our macroscopic physical observations? Is it possible to construct an alternative set of physical laws such that the world would function exactly as our world does on a macroscopic, everyday level, but that would violate Newton's laws or Maxwell's laws or thermodynamics or the... (read more)

I may not understand the question's point, because as I read it the answer is a very obvious "Yes." We determined Newton's laws and Maxwell's equations from observations of our world. So the planets in orbit around the sun, the moon around the earth, and an apple falling to the ground all lead to gravitation. The attraction between wires carrying current in the same direction (magnetic), the functioning of transformers (change in magnetic field produces electric field) and radio and light all fit together to give Maxwell's equations. So yes, a world with the macroscopic physical observations as ours does not violate Newton's or Maxwell's laws because our world with those observations doesn't violate those laws. If Newton's or Maxwell's equations were different, the world you saw would necessarily be different. What am I missing here?

Check out 80,000 Hours. For finances in particular see their career review for trading in quantitative hedge funds.

Took survey. Didn't answer all the questions because I suspend judgment on a lot of issues and there was no "I have no idea" option. Some questions did have an "I don't have a strong opinion" option, but I felt a lot more of them should also have that option.

I'm more interested in epistemic rationality concepts than in practical life advice, although good practical advice is always useful.

I'm an undergrad going for a major in statistics and minors in computer science and philosophy. I also read a lot of philosophy and cognitive science on the side. I don't have the patience to read through all of the LW sequences. Which LW sequences / articles do you think are important for me to read that I won't get from school or philosophy reading?

One of the chief benefits of reading through the sequences is being able to notice, label, and communicate many different things. Instead of having a vague sense that something is wrong and having to invent an explanation of why on the spot, I can say "oh, there's too much inferential distance here" or "hmm, this argument violates conservation of expected evidence" or "but that's the Fallacy of Gray." But in order to have that ability, I need to have crystallized each of those things individually, so that I can call on it when necessary. But if you're only going to read one thing, A Human's Guide to Words (start here) is probably going to be the most useful, especially going into philosophy classes.
The Quantum Mechanics sequence - you won't get that in school.
Check out the Rationality: A to Z contents page, click on things that look interesting, and it'll mostly work out. A Human's Guide to Words is a really good exposition of philosophy. The subsequence on thinking about morality that I can point at with the post Fake Fake Utility Functions is good too. Or if you just want to learn what this rationality stuff is about, read the early posts about biases and read Knowing About Biases Can Hurt People. That one's important - the point of knowing about biases is to see them in yourself. I just don't know what suits you, is all.
How about the Grad Student Advice Repository?

So probability of either Trump or Cruz is 100%?

No, ~83%

It's open source. Right now I only know very basic Python, but I'm taking a CS course this coming semester and I'm going for a minor in CS. How hard do you think it would be to add in other distributions, bounded values, etc.?

As a matter of programming it would be very easy. The difficult part is designing the user interface so that the availability of the options doesn't make the overall product worse.
Author is on the effective altruism forum; he said his next planned feature is more distributions, and that he specifically architected it to be easy to add new distributions.
How hard it will be to add features depends on the way it's architected, but the real issue is complexity. After you add other distributions, bounds, etc., the user has to figure out which choices are right for his specific situation, and that's a set of non-trivial decisions. Besides, one of the reasons people like normal distributions is that they are nicely tractable. If you want to, say, add two, it's easy to do. But once you go to even slightly complicated things like truncated normals, a lot of operations do not have analytical solutions and you need to do stuff numerically, and that becomes... complex and slow.
This is awesome. Awesome awesome awesome. I have been trying to code something like this for a long time but I've never got the hang of UI design.
Moderately. On the plus side it's forcing people to acknowledge the uncertainty involved in many numbers they use. On the minus side it's treating everything as a normal (Gaussian) distribution. That's a common default assumption, but it's not necessarily a good assumption. To start with an obvious problem, a lot of real-world values are bounded, but the normal distribution is not.
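The tractability point made above can be sketched concretely. This is a generic illustration in plain Python, not Guesstimate's actual implementation, and the function names are mine: adding two independent normals has a closed form, while adding two truncated normals falls back on Monte Carlo sampling.

```python
import random
import statistics

# Sum of two independent normals is analytic:
# N(m1, s1^2) + N(m2, s2^2) = N(m1 + m2, s1^2 + s2^2).
def sum_normal_params(m1, s1, m2, s2):
    return m1 + m2, (s1 ** 2 + s2 ** 2) ** 0.5

# A truncated normal has no such closed form under addition, so the
# fallback is sampling each summand (here via simple rejection sampling).
def sample_truncated_normal(mu, sigma, lo, hi, rng):
    while True:
        x = rng.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

rng = random.Random(0)
sums = [
    sample_truncated_normal(10, 5, 0, float("inf"), rng)
    + sample_truncated_normal(4, 3, 0, float("inf"), rng)
    for _ in range(20_000)
]
# Truncating at 0 pushes both means up, so the empirical mean of the
# sum lands somewhat above 10 + 4 = 14.
print(statistics.mean(sums))
```

The rejection loop is the "complex and slow" part: it works for mild truncation but degrades badly when the bounds cut off most of the distribution's mass.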
Everydayfeminism vibe, but for rationalists.

So it sounds like you're only disagreeing with the OP in degree. You agree with the OP that a lot of scientists should be learning more about cognitive biases, better statistics, epistemology, etc., just as we are trying to do on LW. You're just pointing out (I think) that the "informed laymen" of LW should have some humility because (a) in many cases (esp. for top scientists?) the scientists have indeed learned lots of rationality-relevant subject matter, perhaps more than most of us on LW, (b) domain expertise is usually more important than generic rationality, and (c) top scientists are very well educated and very smart.

Is that correct?

Yup!

Edit: Although I should say LW "trying to learn better statistics" is too generous. There is a lot more "arguing on the internet" and a lot less "reading" happening.

In many cases I'd agree it's pretty crazy, especially if you're trying to go up against top scientists.

On the other hand, I've seen plenty of scientists and philosophers claim that their peers (or they themselves) could benefit from learning more about things like cognitive biases, statistics fallacies, philosophy of science, etc. I've even seen experts claim that a lot of their peers make elementary mistakes in these areas. So it's not that crazy to think that by studying these subjects you can have some advantages over some scientists, at least in some r... (read more)

Absolutely agree it is important for scientists to know about cognitive biases. Francis Bacon, the father of the empirical method, explicitly used cognitive biases (he called them "idols," and even classified them) as a justification for why the method was needed. I always said that Francis Bacon should be LW's patron saint.

I still haven't figured out what you have against Bayesian epistemology. It's not like this is some sort of LW invention - it's pretty standard in a lot of philosophical and scientific circles, and I've seen plenty of philosophers and scientists who call themselves Bayesians.

Solomonoff induction is one of those ideas that keeps circulating here, for reasons that escape me.

My understanding is that Solomonoff induction is usually appealed to as one of the more promising candidates for a formalization of Bayesian epistemology that uses objective and speci... (read more)

I don't have any problem with Bayesian epistemology at all. You can have whatever epistemology you want. What I do have a problem with is this "LW myopia" where people here think they have something important to tell to people like Ed Witten about how people like Ed Witten should be doing their business. This is basically insane, to me. This is strong evidence that the type of culture that gets produced here isn't particularly sanity producing.

Solomonoff induction is useless to know about for anyone who has real work to do (let's say with actual data, like physicists). What would people do with it?

and the funding

A Kickstarter, perhaps?

If it comes to that. I'm still figuring out what changes need to be made and what they'd take to make; one of them, for example, looks like it might just be flipping a config flag. Now, $1 to change and $99 to know where to change, but I know at least one person who's volunteered to donate some of that knowledge. (If you are interested in doing development work, either for pay or as a volunteer, please do reach out.)

Not sure what you mean by this. I actually posted the meeting for the Baltimore area myself.

The Baltimore and Washington DC meetups do show up if I click on "Nearest Meetups", just that they appear in the 5th and 8th spots. That list appears to be sorted first by date and then alphabetically. The San Antonio meetup appears at the #4 slot, and the Durham meetup does not appear at all.

Basically the "nearest" part of nearest meetups seems to be completely broken.

I'm from Baltimore, MD. We have a Baltimore meetup coming up Jan 3 and a Washington DC meetup this Sun Dec 13. So why do the two meetups listed in my "Nearest Meetups" sidebar include only a meetup in San Antonio for Dec 13 and a meetup in Durham NC for Sep 17 2026 (!)?

Whoever is running the meetup needs to make Meetup Posts for each meeting before they show up on the sidebar. IIRC regular meetups are often not posted there if the creator forgets about it. You can ask the person who runs the meetups to post them on LW more often or ask them if you can post them in their stead. I run the San Antonio meetup and you are very welcome to attend here if it's the nearest one to you!
For me, at the moment it doesn't show the Berlin Meetup, which is in the city where I live and which I have put into my LW profile.
Man, that Durham date sure disconfirms the idea that your meetup isn't soon enough :) And hmm, just having one far-future meetup post is a clever way to just keep your meetup in the list permanently, like how groups have permanent pages, with the actual meetup schedule being a part of that group page.

See this article (full article available from sidebar), which argues that although conventional wisdom gives religion the advantage here, the reality may not be so clear-cut.

I'm an atheist. I imagine you are too. So, did you find that article because you, or whoever linked it to you, discovered it - or because you, or they, went looking for something which proved what you or they wanted to believe? ETA: I've long held that atheists should form social support groups, and take the other best-of aspects of religion. The article is arguing that the things which make religion beneficial have nothing to do with religion. This is untrue; religion is inherently a social activity. It is, by its nature, pro-social. If we're only grudgingly going to admit religion does anything right, that impairs our ability to figure out what it does right, and take those things for ourselves.

The point isn't that you don't do either.

Sorry, don't know what you mean to say here. Could you rephrase?

Your post is mainly talking about world in a non-probabilistic way.

Could you elaborate on what you mean?

To me it looks like the problem is belief in belief of logical positivism.

Again, could you elaborate? I don't see any reason to associate anything I've said with logical positivism.

The fact that your intuition is that you can't prove that you are not a Boltzmann brain doesn't change that your intuition is that you aren't a Boltzmann brain.

... (read more)
Let's start with: "Specifically, I keep getting the impression that most (all?) of the arguments for the ontology issues boil down to trusting philosophical intuitions and/or the way people use words. Something along the following lines." "I keep getting the impression" almost directly translates to "My intuition tells me." You still base your chain of reasoning on it. Almost none of the reasoning in your post can be expressed in predicate logic and/or probability theory. While we are at it, it's worth noting that the intuition that probability obviously extends logic is doubtful. You don't see how the claim that everything is explainable with logic and probability theory relates to logical positivism? You choose a particular set of starting points that's highly culturally charged. Anna Wierzbicka argues, for example, in "Imprisoned in English" that it makes sense to start with fundamentals that nearly all human cultures agree on, such as there being mothers and fathers, but not, for example, brothers, as some cultures have that concept while others don't. You treat complex concepts like mean, intuition, reason, associate, indicate, issue, and evidence as all being fairly straightforward basics, while Anna Wierzbicka would take none of them as fundamentally basic. All of them are heavily charged with cultural associations that you likely hold unquestioned because you learned them as a child and never questioned them.

So if you are trying to work out which hypothesis is simpler, how do you do that? You use your personal intuition.

I was using Solomonoff Induction as an example of a system that uses Occamian priors. My question was on those who assert that they don't use Occamian priors at all, or for that matter any other type of objective prior. This usually seems to lead either to rejecting Bayesian epistemology in general or to asserting that any arbitrary prior works. I actually have no problem (in theory) rejecting Bayesian epistemology, as long as you still use ... (read more)

I think most people just hold things like faith and emotions higher than logic and probability. Asking, say, "how do you know that murder is wrong?" would, I imagine, freak out some people who aren't philosophers. The whole idea that belief in god is a matter of probability is not held by many people, and more so with moral questions. Most people, including intelligent, educated people, do not seem to think that any justification for political opinions is needed except 'anyone who disagrees with me is evil/stupid'. It's actually worse than this - there are people who are deeply uncomfortable with having a notion of truth at all, because if there is a notion of truth, then some people are right and some people are wrong, and the idea that people might be wrong about something is offensive.
Your choice isn't "Statistically correct prior" versus "Arbitrary prior", your choice in the real world is between arbitrary priors and nothing at all.

Your post has only one instance of naming a probability and that's not 100%.

I meant when philosophers themselves claim they aren't looking at things in a probabilistic way. I actually had this conversation with my philosophy professor. He claimed that although he's comfortable talking about credences and probabilities, he's also comfortable talking about the world in a non-probabilistic way. This was one of those discussions where he didn't understand why I was so confused.

In a similar way you can argue that you don't have any evidence that you aren't

... (read more)
The point isn't that you don't do either. Your post is mainly talking about the world in a non-probabilistic way. Given that's the case, the professor with whom you are talking gets confused. To me it looks like the problem is belief in belief of logical positivism. The fact that your intuition is that you can't prove that you are not a Boltzmann brain doesn't change that your intuition is that you aren't a Boltzmann brain. My intuition is that P!=NP, but at the same time I'm certain that I don't have the mathematical skills to prove P!=NP. The fact that you don't have an intuitive mental distinction between "X is true" and "I can prove X is true" is a problem.

Nothing to do with IQ, but with modes of thinking. According to Nisbett, Eastern thinking is more holistic and concrete vs. the Western formal and abstract approach. He says that Easterners often make fewer thinking mistakes when dealing with other people, where a more holistic approach is needed (for example, Easterners are much less prone to the Fundamental Attribution Error). But at the same time they tend to make more thinking mistakes when it comes to thinking about scientific questions, as that often requires formal, abstract thinking. Nisbett also ... (read more)

I'm going to guess it's based on some of the East-West thinking differences outlined by Richard Nisbett in The Geography of Thought (I very highly recommend that book, BTW). I don't remember everything in the book, but I remember he had some stuff in there about why easterners are often less interested in, and have a harder time with, the sort of logical/scientific thinking that LW advocates.

I second the recommendation of The Geography of Thought.
Which is weird because, if you take seriously the ethnic-IQ correlation (which I don't), Asians show a higher average IQ than Westerners.

Examples of obviously bad ideas: p-zombies, Platonism, Bayesian epistemology (the latter two may require explanation).

Could you provide that explanation?

Sure. If we take Platonism to be the belief that abstract objects (take, for instance, the objects of ZFC set theory) actually exist in a mind-independent way, if not in a particularly well-specified way, then it occurs because people mistake the contents of their mental models of the world for being real objects, simply because those models map the world well and compress sense-data well. In fact, those models often compress most sense-data better than the "more physicalist" truth would: they can be many orders of magnitude smaller (in bits of program devoted to generative or discriminative modelling).

However, just because they're not "real" doesn't mean they don't causally interact with the real world! The point of a map is that it corresponds to the territory, so the point of an abstraction is that it corresponds to regularities in the territory. So naive nominalism isn't true either: the abstractions and what they abstract over are linked, so you really can't just move names around willy-nilly. In fact, some abstractions will do better or worse than others at capturing the regularities in sense-data (and in states of the world, of course), so we end up saying that abstractions can exist on a sliding scale from "more Platonic" (those which appear to capture regularities we've always seen in all our previous data) to "more nominalist" (those which capture spurious correlations).

Now, for "Bayesian epistemology", I'm taking the Jaynesian view, which is considered extreme but stated very clearly and precisely, that reasoning consists in assigning probabilities to propositions. People who oppose Bayesianism will usually then raise the Problem of the Prior, and the problem of limited model classes, and so on and so forth. IMHO, the better criticism is simply: propositions are not first-order, actually-existing objects (see above on Platonism)! Consider a proposition to be a set of states some model can be in or not be in, and we can still use Bayesian statistics,
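The closing idea, treating a proposition as a set of states a model can be in, is easy to make concrete. A toy sketch (the example is mine, not the commenter's): the prior lives on model states, evidence updates the states by Bayes' rule, and a proposition's probability is just the total posterior mass of the states it contains.

```python
from fractions import Fraction

# Hypothetical toy model: a die is either fair or loaded toward sixes.
prior = {"fair": Fraction(1, 2), "loaded": Fraction(1, 2)}
p_six = {"fair": Fraction(1, 6), "loaded": Fraction(1, 2)}  # P(6 | state)

def update(prior, likelihood):
    """Bayes' rule over states: reweight by likelihood, then normalize."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    total = sum(unnorm.values())
    return {s: p / total for s, p in unnorm.items()}

posterior = update(prior, p_six)  # after observing a single six

# The proposition "the die is loaded" is the set of states {"loaded"};
# its probability is the posterior mass of that set.
proposition = {"loaded"}
print(sum(posterior[s] for s in proposition))  # prints 3/4
```

Nothing in the update ever treats "the die is loaded" as a free-standing object; it only ever sums mass over states, which is the point being made.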

What makes a good primary care physician and how do I go about finding one?

Off the top of my head, the most reliable way would be to ask another senior medical professional - senior as they would tend to have been in the same geographic area for a while and know their colleagues, plus have more direct contact with primary care physicians. Also, rather than asking "who should I see as my primary care physician?", you could ask "who would you send your family to see?". This might help prevent them from just recommending a friend/someone with whom they have a financial relationship. I note that this would be relatively hard to do unless you already know a senior medical professional. Another option would be to ask a medical student (if you happen to know any in your area) which primary care physicians teach at their university and whom they would recommend. Through my medical training I have found teaching at a medical school to be weak-to-moderate evidence of being above average. Asking a medical student would help add a filter for avoiding some of the less competent ones, strengthening this evidence. I think lay-people's opinions correlate much more strongly with how approachable and nice their doctor is, as opposed to competence. Doctor rating sites could be used just to select for pleasant ones, if you care about that aspect. (Caveats: opinion-based; my experience is limited to the country I trained in; I am junior in experience.)
I don't have any surefire methods that don't require a very basic working knowledge of medicine, but a general rule of thumb is the physician's opinion of the algorithmic approach to medical decision making. If it is clearly negative, I'd be willing to bet that the physician is bad. Not quite the same as finding a good one, but decent for narrowing your search. Along with this, look for someone who thinks in terms of possibilities rather than certainties in diagnoses. All assuming you're looking for a general practitioner, of course. I wouldn't select surgeons based on this rule of thumb, for instance. If you're looking for someone who simply has a good bedside manner, then reviews and word of mouth do work.
Ask everyone you know; ask for their recommendations, and ask why they make those recommendations. Most of the answers you get will not be worth much, but look for the good answers; you only need one. The trick here is that while it is nearly impossible to find the perfect doctor through any method, you are only looking for a good doctor. Any reasonable recommendation followed by a quick Google search (Google allows reviews on doctors, and most established doctors in larger cities will have at least one or two) to weed out the bad apples will do. This is one of those situations where the perfect is the enemy of productivity.
First of all, competence and skill. Just like everyone else, doctors vary in how good they are. Unfortunately, there is a popular meme (actively promulgated by the doctors guild) that all doctors are sufficiently competent so that any will do. That's... not true. Given this, it shouldn't be surprising that finding out a particular doctor's competency ex ante is hard to impossible (unless s/he screwed up so hard, s/he ran into trouble with the law or the medical board). Typically you'll have to rely on proxies (e.g. the reputation of the medical school s/he went to). Beyond that, things start to depend on what you need a doctor for. If you have a condition to be treated, you probably want a specialist in that (even primary care physicians have specializations). If you want to run a lot of tests on yourself, you want a doctor who's amenable to ordering whatever tests you ask him for. Etc., etc.
This is a great question, and I'm glad that you asked, since I am interested in hearing what people think about this as well. I suppose that word of mouth is generally superior to, say, just searching for a primary care doctor through your insurance provider's website, but I don't have any more specific ideas than that. Personally, I can, and often have, put off going to the doctor due to akrasia, so I put a bit of extra weight on how nice the doctor is-- having a nice doctor lowers the willpower-activation-energy needed for me to make an appointment. I also think that willingness to spend time with patients is important, but I'd be more likely to think this than the average person-- I'm pretty shy, so I'll often tell my doctors that I don't have any more questions (when I actually do) if they seem like they're in a hurry, so as to not bother them.

Is it still somewhat controversial? Meaning, are there respected physicists who think that conscious observers do magically cause things to happen?

Roger Penrose is very respected.

Thanks! Ok, so now a more detailed question:

As I said, I'd like to do formal epistemology. I'm an undergrad right now, and I need to decide on my major. If that's about all the formal stuff I'll need then there are a bunch of different majors that include that, and the question becomes which additional courses could help with formal epistemology or related disciplines.

Here's what I've come up with so far:

  • Choice 1: Applied Statistics. This allows several electives in other subjects, so I could do e.g. a minor in CS with only one or two extra course requir
... (read more)
"Math sophistication" is good, as is familiarity with basic stats and ML. In computer science depts., ML is often taught at the grad level, though. Specific major not so important. I found reading and doing proofs paid a lot of dividends.
I suppose modal logics of belief.

What areas of mathematics do I need to learn if I want to specialize in formal epistemology?

Linear algebra, function optimization, probability theory.


I'm having a hard time understanding the following article, from Ben Levenstein at FHI on the epistemology of disagreement. I know it's a bit long but it seems pretty important and I want to make sure I understand it correctly. It's just that I'm having a hard time following the math and formal notation. Can someone summarize it for me? Thanks.

Why do you say Carnegie Mellon? I'm assuming it's because they have the Center for Formal Epistemology and a very nice-looking degree program in Logic, Computation and Methodology. But don't some other universities have comparable programs?

Do you have direct experience with the Carnegie Mellon program? At one point I was seriously considering going there because of the logic & computation degree, and I might still consider it at some point in the future.

I mentioned CMU for the reasons you've stated and because Lukeprog endorsed their program once (no idea what evidence he had that I don't). I have also spoken to Katja Grace about it, and there is evidently a bit of interest in LW themes among the students there. I'm unaware of other programs of a similar caliber, though there are bound to be some. If anyone knows of any, by all means list them, that was the point of my original comment.
Confirmed, re: CMU phil. Email me for details (ilyas at). I know a few people there. I think Katja Grace went there at one point (?)

Ha, somehow missed that comment at the end. On the other hand, Bostrom only says EY named the problem. Did EY also come up with it?

The footnote implies that intermediate forms were not written down. Probably everyone involved has now forgotten who contributed what. Is it meaningful to ask about the problem? In what sense is Pascal's mugging different than Pascal's wager? That it only uses finite numbers? That it uses super-exponential growth and draws attention to Kolmogorov complexity? That the mugger's number comes second, allowing the threat/promise to depend on the probability? And, of course, the iconic difference, the one in the name, is that it is a single person making the claim, not a society, yet this is not a technical difference, but a purely psychological difference, and thus quite ambiguous and difficult to trace. At some level of granularity, every account is the same; at another, every account is different. Commenting on Yudkowsky's post, Bostrom cites his paper "Infinite Ethics" (section 4.3?). Presumably that is as close as Bostrom got in writing before. Whether you consider it "the same" or an admission of not having prior art depends on what you care about. Moreover, that paper has a bibliography, unlike the mugging dialogue.

Who came up with Pascal's Mugging? Both EY and Nick Bostrom (pdf) present it as seemingly their own idea.

No, Bostrom explicitly attributes it to Yudkowsky.

Possibly the most enthusiastic / impressive endorsement I've ever seen for a rationality-type book:

Every country should scrap a year or two of math education and require all citizens to read this book instead.

Jonathan Haidt praising Mindware: Tools for Smart Thinking by Richard Nisbett

Anybody read the book? Do you agree with Haidt?

It doesn't pass my first test for self-help books-- none of the amazon reviewers really said that following the advice made their life better. (One or two of the reviews might be interpreted that way, but they were marginal.) Admittedly, it's only been out for a month, so it should probably be given some more time.

Very interesting paper: Eric Schwitzgebel, 1% Skepticism. What's the probability that some form of radical skepticism is correct? And can that have any practical ramifications?

What's the best way to get free calibration training?

Regularly make predictions and record the results on
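If you do keep such a prediction log, scoring it takes only a few lines of Python. A minimal sketch with made-up data: the Brier score is the standard quadratic scoring rule (lower is better), and a per-probability table shows calibration directly.

```python
from collections import defaultdict

# A prediction log is just (stated probability, did it happen?) pairs.
# These entries are invented for illustration.
log = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.6, True), (0.6, False), (0.6, True),
    (0.2, False), (0.2, False), (0.2, True),
]

# Brier score: mean squared gap between forecast and outcome.
brier = sum((p - o) ** 2 for p, o in log) / len(log)

# Calibration table: observed frequency at each stated probability.
buckets = defaultdict(list)
for p, o in log:
    buckets[p].append(o)
for p in sorted(buckets):
    outcomes = buckets[p]
    print(f"said {p:.0%}: happened {sum(outcomes)}/{len(outcomes)} of the time")
print(f"Brier score: {brier:.3f}")
```

Well-calibrated forecasts make the "said X%" and "happened" columns line up; the Brier score additionally rewards being confident when you are right.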

I find that the Orthodox Jewish system seems to work quite well, at least for most religious people I know. I grew up and married in that system, and I've never "dated" in the normal Western sense, so I have no idea how the system compares or might be applicable in the "normal" world.

[Note: There isn't really one Orthodox Judaism system. Different communities have very different systems, ranging from basically arranged marriages in many Hassidic communities, to almost-normal Western dating in Modern Orthodox communities. I grew up in wh... (read more)

Thanks to everybody who responded. I read all the comments and did some more thinking. I also found this PDF (Hebrew) of a speech he gave on the subject. Here's my summary of what I think he means, more or less:

Scientific statements are models of physical reality, but they're the map and not the territory. Religious statements are also models, but they're primarily maps of one's personal version of an aesthetic / emotional / moral system of "reality" rather than physical reality. If to experience the beauty of Judaism that means using a model tha... (read more)

Does he more or less say that religion only exists in a virtual reality, which he "believes" only because doing so is aesthetically pleasant? So, if I choose my "religion" to be e.g. Tolkien's canon, it is equally valid as Judaism, and any objections can only be made on the aesthetic level (such as "I find circumcision to be more cute than elves and orcs")?

So can you please explain what he means? I really don't understand in what sense it can be said that "the world is 15 billion years old" and "the world was created by God in six days" can both be literally true. And it doesn't sound like he means the Omphalos argument that the world was created looking old. Rather, it sounds like he's saying that in one sense of "truth" or in one "model of the world" it really is 15 billion years old, and in another sense / model it really is young, and those two truths / models are somehow not contradictory. I just can't seem to wrap my head around how that might make any sense.

The sentence "Frodo carried the One Ring to Mount Doom" is not literally true, but it is true within the fictional narrative of the Lord of the Rings. You can simultaneously believe it and not believe it, in a certain sense, by applying the so-called "suspension of disbelief", a mental mechanism which probably evolved to allow us to consider hypothetical counterfactual beliefs for decision making and which we then started using to make fiction. I think that theists like Robert Aumann who support the non-overlapping magisteria position are doing something similar: they accept "the world is 15 billion years old" as an epistemic "Bayesian" belief which they use when considering expectations over observations, and they apply suspension of disbelief in order to believe "the world was created by God in six days" in the counterfactual context of religion.
The world is created in 6 days, with evidence indicating 15 billion years of whatever. For that matter, the earth was created one second ago, your memories included.
Where did you get young-earth creationism from the above? Where did you get "6-day creation is literally true given earth days"? If this is how you are parsing Aumann, why are you even talking about this?
He's explaining the process of compartmentalization. I suspect that if he had to bet on it as the background of a scientific fact, he would choose option A, but if he were discussing it with a Rabbi, he would choose option B... he's really just choosing which compartment of belief to draw from.

On the subject of prosociality / wellbeing and religion, a recent article challenges the conventional wisdom by claiming that, depending on the particular situation, atheism might be just as good or even better for prosociality / wellbeing than religion is.

She means that you're biased towards the way you were taught vs. alternatives, regardless of the evidence. The example she gives (from G.A. Cohen) is that most Oxford grads tend to accept the analytic / synthetic distinction while most Harvard grads reject it.

Yes, I got that from reading the paper. However, the wording of the abstract seems quite sloppy; taken at face value it suggests that a person's education, K through postdoc (not to mention informal education), should have no influence on the person's philosophy.

Moreover, the paper's point (illustrated by the Cohen example) is not really surprising; one's views on unanswered questions are apt to be influenced by the school of thought in which one was educated. Were this not the case, the choice of which university to attend and which professor to study under would be somewhat arbitrary.

I also don't think she made the case that philosophers are ignoring the evidence, only that a philosopher's educational background continues to exert an influence throughout his or her career. From a Bayesian standpoint this makes sense: loosely speaking, when philosophers leave graduate school, their education and life experience to that point constitute their priors, which they update as new evidence becomes available. While those priors are altered by evidence, they are not necessarily eliminated by it. This is not problematic unless overwhelming evidence one way or the other is available and ignored. The fact that whether to accept the analytic / synthetic distinction is still an open question suggests that no such overwhelming evidence exists, so I am not seeing a problem with the fact that Oxford grads and Harvard grads tend (on average) to disagree on this issue.

There's a new article on potential biases amongst philosophers of religion: Irrelevant influences and philosophical practice: a qualitative study.


To what extent do factors such as upbringing and education shape our philosophical views? And if they do, does this cast doubt on the philosophical results we have obtained? This paper investigates irrelevant influences in philosophy through a qualitative survey on the personal beliefs and attitudes of philosophers of religion. In the light of these findings, I address two questions: an e

... (read more)
I would expect a person's education to shape his/her philosophical views; if one's philosophy is not shaped by one's education, then one has had a fairly superficial education.

The question is, how do I tell (without reading all the literature on the topic) if my argument is naive and the counterarguments that I haven't thought of are successful, or if my argument is valid and the counterarguments are just obfuscating the truth in increasingly complicated ways?

You either ask an expert, or become an expert. Although I'd be wary of philosophy experts, as there's not really a tight feedback loop in philosophy.

I happen to greatly enjoy Rosemary Sutcliff's historical novels. I'm not an expert on Roman or Anglo-Saxon cultures (that's where most of her novels are set), but as far as I can tell they're pretty accurate, and they give a good feel for what it must actually have been like to live back then.

Thank you. I think I saw her books in shops here.

I've wondered for a while now if we could do a Kickstarter and use the money to hire someone to upgrade the site or to implement some of the suggestions that people have been making.

I'm trying to figure out what percentage of a balanced investment portfolio should go towards rental real estate, but I'm having a hard time finding reliable sources of advice on this question.

I have a friend who invests in rental real estate, and he says he can give me a guaranteed 10% ROI if I invest $10,000+ with him, or 15% if I invest $100,000+. From looking around online this does indeed appear reasonable - rental real estate often gives much higher returns than this, so it sounds reasonable that he can guarantee a lower rate and then either pocket t... (read more)

Theoretically, the market portfolio, which is the efficient portfolio according to Modern Portfolio Theory, should replicate the world's assets weighted by value. For America, household (and non-profit) net worth is ~$85T and the value of real estate holdings is ~$14T (value less mortgages) (source), so about 16% is pretty justifiable. This is all pretty back-of-the-envelope, though.
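The back-of-the-envelope weight above is just the ratio of the two figures quoted in the comment; a one-line sketch (using those ~$85T and ~$14T figures, which are rough assumptions, not precise data):

```python
# Market-portfolio weight for real estate, per the rough figures above.
household_net_worth = 85e12   # ~$85T US household + non-profit net worth (assumed)
real_estate_equity = 14e12    # ~$14T real estate value less mortgages (assumed)

weight = real_estate_equity / household_net_worth
print(f"{weight:.1%}")  # -> 16.5%
```

That's where the "about 16%" comes from; the answer is only as good as the two inputs.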
I don't believe it. If he could guarantee it, he would instead borrow from a bank at 3-5% as much as he can (potentially using his house as a collateral) and invest that. Besides, at that rate of return, other investors would flock in, including fund managers, and saturate the market.

he can give me a guaranteed 10% ROI

Heh. Ask him to actually guarantee it -- that is, structure the transaction as a loan yielding 10% (or 15%) with him fully liable for the principal and the interest. See if he agrees :-/ Don't forget to check that the counterparty (the borrower) has assets to pay you back.

There are financial securities called REITs (Real Estate Investment Trusts) which invest in property (sometimes commercial, sometimes rental; read the prospectus) and return the income to you less a haircut. As a sanity check, you can take a look at what returns they provide.
