Completely agreed. That's why some subs only do +, no -. I cannot defend the current system. ;-)
I think you underrate the existential risks that come along with substantial genetic or neurological enhancements.
It's true, I absolutely do. It irritates me. I guess this is because the ethics seem obvious to me: of course we should prevent people from developing a "supervirus" or whatever, just as we try to prevent people from developing nuclear arms or chemical weapons. But steering towards a possibly better humanity (or other sentient species) just seems worth the risk to me when the alternative is remaining the violent apes we are. (I know we're hominids, not apes; it's just a figure of speech.)
When it comes to running out of fossil fuels we seem to be doing quite well. The cost of solar energy halves every 7 years.
That's certainly a reassuring statistic, but a less reassuring one is that solar power currently supplies less than one percent of global energy usage! Changing that (and especially changing it quickly) will be an ENORMOUS undertaking, and there are many disheartening roadblocks in the way (utility companies, lack of government will, etc.). The fact that solar itself is getting less expensive is great, but unfortunately the changeover from fossil fuels to solar (e.g. phasing out old power plants and building brand-new ones) is still incredibly expensive.
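For what it's worth, the two claims above can be put side by side with a quick back-of-envelope calculation. This is only a sketch: the 7-year cost-halving period comes from the comment above, while the 20% annual growth rate and the 1%-to-50% share targets are purely hypothetical numbers chosen for illustration.

```python
import math

def solar_cost_fraction(years, halving_period=7):
    """Fraction of today's cost remaining after `years`,
    given a fixed cost-halving period (7 years, per the claim above)."""
    return 0.5 ** (years / halving_period)

def years_to_reach_share(start_share, target_share, annual_growth):
    """Years of compound growth needed to go from start_share
    to target_share at a fixed annual growth rate."""
    return math.log(target_share / start_share) / math.log(1 + annual_growth)

# After 21 years of halving every 7 years, costs fall to 1/8 of today's:
print(solar_cost_fraction(21))  # 0.125

# Even at a (hypothetical) 20% annual growth rate, going from a 1% share
# of global energy to a 50% share takes roughly two decades:
print(years_to_reach_share(0.01, 0.50, 0.20))
```

So cheap panels and a large energy share are separate questions: costs can keep falling on schedule while the share of global supply still takes decades to become dominant.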
The core question is: "What kind of impact do you expect to make if you work on either issue?"
Do you think there is work to be done in the space of solar power development that people other than yourself aren't effectively doing? Do you think there is work to be done in terms of better judgment and decision-making that other people aren't already doing?
I'm familiar with questions like these (specifically, from 80,000 Hours), and I think it's fair to say that I probably wouldn't make a substantive contribution to any field, those included. Given that likelihood, I'm really just trying to determine what I feel is most important so I can feel like I'm working on something important, even if I only end up taking a job over someone else who could have done it equally well.
That said, I would hope to locate a "gap" where something was not being done that should be, and then try to fill that gap, such as volunteering my time for something. But there's no basis for me to surmise at this point which issue I would be able to contribute more to (for instance, I'm not a solar engineer).
To me it seems much more effective to focus on more cognitive issues when you want to improve human judgment. Developing training to help people calibrate themselves against uncertainty seems to have a much higher return than trying to do fMRI studies or brain implants.
At the moment, yes, but it seems like it has limited potential. I think of it a bit like bootstrapping: a judgment-impaired person (or an entire society) will likely make errors in determining how to improve their judgment, and the improvement seems slight and temporary compared to more fundamental, permanent changes in neurochemistry. I also think of it a bit like people's attempts to lose weight and stay fit. Yes, there are a lot of cognitive and behavioral changes people can make to facilitate that, but for many (most?) people, it remains a constant struggle -- one that many people are losing. But if we could hack things like that, "temptation" or "slipping" wouldn't be an issue.
The problem with coal isn't that it's going to run out but that it kills hundreds of thousands of people via pollution and that it drives climate change.
From what I've gathered from my reading, the jury is kind of out on how disastrous climate change is going to be. Estimates seem to range from catastrophic to even slightly beneficial. You seem to think it will definitely be catastrophic. What have you come across that is certain about this?
Since people were pretty encouraging about the quest to do one's part to help humanity, I have a follow-up question. (Hope it's okay to post twice on the same open thread...)
Perhaps this is a false dichotomy. If so, just let me know. I'm basically wondering if it's more worthwhile to work on transitioning to alternative/renewable energy sources (i.e. we need to develop solar power or whatever else before all the oil and coal run out, and to avoid any potential disastrous climate change effects) or to work on changing human nature itself to better address the aforementioned energy problem in terms of better judgment and decision-making. Basically, it seems like humanity may destroy itself (if not via climate change, then something else) if it doesn't first address its deficiencies.
However, since energy/climate issues seem pretty pressing and changing human judgment is almost purely speculative (I know CFAR is working on that sort of thing, but I'm talking about more genetic or neurological changes), civilization may become too unstable before it can take advantage of any gains from cognitive enhancement and such. On the other hand, climate change/energy issues may not end up being that big of a deal, so it's better to just focus on improving humanity to address other horrible issues as well, like inequality, psychopathic behavior, etc.
Of course, society as a whole should (and does) work on both of these things. But one individual can really only pick one to make a sizable impact -- or at the very least, one at a time. Which do you guys think may be more effective to work on?
[NOTE: I'm perfectly willing to admit that I may be completely wrong about climate change and energy issues, and that collective human judgment is in fact as good as it needs to be, and so I'm worrying about nothing and can rest easy donating to malaria charities or whatever.]
Yeah, that accurately describes their effect on me.
I used to be on Bupropion, but it had unpleasant physical effects on me (i.e. heart racing/pounding, which makes sense, given that it's stimulant-like) without any noticeable mood effects. I was quite disappointed, since a friend of mine said he practically had a manic episode on it. However, I took it in conjunction with an SNRI, so maybe that wouldn't have happened if I'd just taken it on its own.... Idk.
I'm actually surprised my psychiatrist hasn't recommended an MAOI to me in that case, since she freaks the hell out when I say I'm suicidal, and I've done so twice. I'll put MAOIs at the bottom of my aforementioned new to-do list. :)
Huh, interesting. Up-managing one's doctor seems frowned upon in our society -- since it usually comes in the form of asking one's doctor for medications mentioned in commercials -- but obviously your approach seems much more valid. Kind of irritating, though, that doctors don't appear to really be doing their job. :P
The exchange here has made me realize that I've actually been skipping my meds too often. Heh.... :\ So if I simply tighten that up, I will effectively increase my dosage. But if that doesn't prove to be enough, I'll go the route you've suggested. Thanks! :)
EA? (Sorry to ask, but it's not in the Less Wrong jargon glossary and I haven't been here in a while.)
Parasite removal refers to removing literal parasites from people in the third world.
Oh. Yes. I think that's important too, and it actually pulls on my heart strings much more than existential risks that are potentially far in the future, but I would like to try to avoid hyperbolic discounting and try to focus on the most important issue facing humanity sans cognitive bias. But since human motivation isn't flawless, I may end up focusing on something more immediate. Not sure yet.
It's a start, and potentially fewer side effects is always good, but think of it this way: who's going to gravitate towards rationality training? I would bet people who are already more rational than not (because it's irrational not to want to be more rational). Since participants are self-selected, a massive part of the population isn't going to bother with that stuff. There are similar issues with genetic and neurological modifications (e.g. they'll be expensive, at least initially, and therefore restricted to a small pool of wealthy people), but given the advantages over things like CFAR I've already mentioned, it seems like it'd be worth it...
I have another issue with CFAR in particular that I'm reluctant to mention here for fear of causing a shit-storm, but since it's buried in this thread, hopefully it'll be okay. Admittedly, I only looked at their website rather than actually attending a workshop, but it seems kind of creepy and culty -- rather reminiscent of Landmark, for reasons not the least of which is the fact that it's ludicrously, prohibitively expensive (yes, I know they have "fellowships," but surely not that many. And you have to use and pay for their lodgings? wtf?). It's suggestive of mind control in the brainwashing sense rather than rationality. Frankly, I find that this forum can get that way too, complete with shaming thought-stopping techniques (e.g. "That's irrational!"). Do you (or anyone else) have any evidence to the contrary? (I know this is a little off-topic from my question -- I could potentially create a workshop that I don't find culty -- but since CFAR is currently what's out there, I figure it's relevant enough.)
You could be right, but I think that's rather optimistic. This blog post speaks to the problems behind this argument pretty well, I think. Its basic gist is that the amount of energy it will take to build sufficient renewable energy systems demands sacrificing a portion of the economy as is, to a point that no politician (let alone the free market) is going to support.
This brings me to your next point about addressing politics instead of neurology. Have you ever tried to get anything changed politically...? I've been involved in a couple of movements, and my god is it discouraging. You may as well try to knock a brick wall down with a feather. It basically seems that humanity is just going to be the way it is until it is changed on a fundamental level. Yes, I know society has changed in many ways already, but there are many undesirable traits that seem pretty constant, particularly war and inequality.
As for solar as opposed to other technologies, I am a bit torn as to whether it might be better to work on developing technologies rather than whatever seems most practical now. Fusion, for instance, if it's actually possible, would be incredible. I guess I feel that working on whatever's practical now is better for me, personally, to expend energy on since everything else is so speculative. Sort of like triage.