I have a chronic health condition which results in significant brain fog, usually lasting a few hours most days. When I don’t have brain fog I can work on high-level technical work with relative ease, but when I’m experiencing symptoms, even simple tasks become a challenge (and manual labor is out of the question). As a result, most activities I’d like to do in the Effective Altruism space are out of reach for a good portion of my day. What are some ways to put that time to optimal use?

Beyond myself, it would be nice to have a list of low-cognitive-workload altruistic tasks to refer people to if they don’t have the intellectual capability to directly work on the alignment problem or whatever, but still want to help make the world a better place.


The lowest-hanging fruit here, which you may already have picked, is to spend your low-capacity time on tasks which improve the quality or quantity of your high-capacity time. It's hard to guess what tasks would even qualify as low-capacity for you -- you on a bad day might have an easy time with some particular thing that I would struggle with on a good day, and vice versa. For instance, it might be high-capacity to plan a menu and order groceries, but low-capacity to wash and chop all the vegetables you'll need for a week of nutritious meals. It might be high-capacity to triage your closet and decide where to store each type of item, but low-capacity to fold a load of clean laundry and place each item in the spot with the corresponding label.

Budget some of your high-capacity time toward reducing the capacity required for inevitable tasks, so that you can offload them to lower-capacity times and free up more high-capacity time for the tasks you find most valuable. For instance, I labeled stuff around the house (big plates go here, small plates go there, bowls go there, this switch is for the outside lights and that switch next to it is for the inside ones) to benefit guests, but I've been amazed by how it reduces cognitive load for me as well. Store metadata about your life outside your head -- I find it helpful to try to treat my future self with at least the courtesy I would show to strangers. I try to code as if an intern will be attempting to understand it; I try to manage my physical workspace as neatly as I would want a public makerspace to be organized.

When you ask for "low-cognitive-workload altruistic tasks", how are you differentiating "altruistic tasks" from others? Is it altruistic work to feed, house, clothe, and generally facilitate the focus and productivity of a person who does "high-cognitive-workload altruistic tasks"? I would argue that any task without which the obviously altruistic work couldn't happen is itself altruistic. You are your own support system; give yourself credit for all the roles you occupy that make your "more important" work possible.

This is some really top-tier advice, thanks for responding! It’s hard to get myself to view self-serving tasks as being altruistic, but they really are, and that’s a personal bias I should work against.

Thanks, I'm glad you found it helpful! I remain quite curious how you define "altruism". Oxford offers two options: "the belief in or practice of disinterested and selfless concern for the well-being of others", or zoologically, "behavior of an animal that benefits another at its own expense". IMO, both these definitions gesture toward a style of behavior which does not actually optimize well-being in a utilitarian sense. They suggest that when presented with a choice between action A, which improves the wellbeing of all parties, and action B, which improves the wellbeing of everyone but you, action B is somehow "more altruistic" -- even if both actions create the same total increase in wellbeing as a first-order effect.

Realistically, in humans, the second-order effects of action A are likely to be pretty much all positive, whereas the second-order effects of action B involve everyone around you being exposed to and affected by your lower wellbeing. Someone who isn't taking care of themself will tend to treat those close to them less well; those who care about them will see this and tend to worry, which decreases those loved ones' wellbeing; and those who feel some responsibility for the person, or share things with them, will often inconvenience themselves to compensate for the person's failure to thrive. I'm sure you can think of plenty of examples of times when self-neglect by others near you affected you negatively.

This is not even considering that choosing an action which benefits everyone except the actor tempts the actor to resent those who benefited. Either their behavior worsens due to that resentment, or they carry increased cognitive/emotional load in suppressing the behavior changes -- energy that could have been put toward more positive purposes rather than expended on damage control.

I suspect that the bias you mention could also be fed by cultural habits of portraying "women's work" as less valuable.
100% agree with your take on altruism here. My intuitive definition of the term is something along the lines of "actions taken with the intention of increasing good and/or decreasing bad in the world," regardless of outcome. For instance, someone who donates to charity, but the charity is secretly killing babies or something, is still performing an altruistic action, albeit one with a tragic outcome. In short, what matters is intentionality, imo.

If I do something purely good for myself, that is an altruistic act if and only if I am doing it with the intention of making the world a better place. Since my subjective happiness is part of the world, one might argue that any act of self-pleasure should be considered altruistic. That doesn't seem like the typical understanding of altruism, however, as I imagine most would only consider a "selfish" act to be altruistic if done with the intention that the benefit I receive will ultimately improve the lives of others, external to me. Ultimately the former view does seem more appealing to me, since it seems silly to arbitrarily define oneself as not being part of the world. [EDIT: note that I may have written the above comment after taking sleeping pills, and as such, it may or may not be comprehensible. Sorry about that!]
I think it makes sense! It follows easily from your definition that actions which increase your capacity to do good, without doing harm themselves, align with your value system. To bring it full circle, I'll bet your high-capacity and low-capacity selves can find all kinds of ways that the latter can make the former even more powerful.

One option is writing or editing questions and answers for Stampy, the AI alignment FAQ Rob Miles's community is building.

We're building the reader UI (a very early prototype, still missing search, automatic loading of new questions as you click, and general polish) and a bot interface.

You may also want to consider opportunities on the EA Volunteer Job Board. Some of them involve similarly low-effort wiki building.


Regarding that Airtable: I don't know who manages it, but each post needs a date-posted field, because you can't tell what's new and what's old (and these things take time to apply to, which would be wasted if the listings are all years old). Better yet would be both date-posted and date-closed fields, so people know whether they can still apply.

I don't have anything shovel-ready for this, but I propose data gathering as a general pattern. By this I don't mean doing research, but the actual yeoman's work of finding the data, collating it, and putting it in a suitable format for easy access.

An example of a case where this led to something interesting, when the researcher did it themselves, is Mandelbrot's Fractal Markets Hypothesis, where Mandelbrot personally collated huge quantities of market data in order to notice the pattern. There are also a few examples in 12 Things I Learned Studying Nature's Laws.
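As a toy sketch of what "collating data into a suitable format" can look like in practice (the sources, column names, and price figures here are all made up for illustration): take records scattered across files with inconsistent column orders, normalize them into one canonical schema, and sort for easy access.

```python
import csv
import io

# Imagine each string is a file from a different source, each with its
# own column ordering. (These records are invented for illustration.)
source_a = "date,price\n2024-01-02,101.5\n2024-01-03,99.8\n"
source_b = "price,date\n103.2,2024-01-04\n"

def collate(sources):
    """Read heterogeneous CSV sources and return rows in one canonical schema."""
    rows = []
    for text in sources:
        # DictReader keys rows by header name, so column order stops mattering.
        for record in csv.DictReader(io.StringIO(text)):
            rows.append({"date": record["date"], "price": float(record["price"])})
    # A canonical sort order makes the collated dataset easy to scan and diff.
    rows.sort(key=lambda r: r["date"])
    return rows

collated = collate([source_a, source_b])
```

The point of the pattern is exactly this kind of unglamorous normalization: once everything lives in one consistent, sorted table, noticing patterns (as Mandelbrot did) becomes far easier.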

Maybe doing labelling for Redwood would be a good example of this? 

That is a great low-cognitive-energy task! One thing I've done in the past is simply add images and short descriptions to Wikipedia pages missing them, which hopefully helps improve reader retention.
