Completely agreed. That's why some subs only do +, no -. I cannot defend the current system. ;-)
I think you underrate the existential risks that come along with substantial genetic or neurological enhancements.
It's true, I absolutely do. It irritates me. I guess this is because the ethics seem obvious to me: of course we should prevent people from developing a "supervirus" or whatever, just as we try to prevent people from developing nuclear arms or chemical weapons. But steering towards a possibly better humanity (or other sentient species) just seems worth the risk to me when the alternative is remaining the violent apes we are. (I kn...
The core question is: "What kind of impact do you expect to make if you work on either issue?"
Do you think there is work to be done in the space of solar power development that people other than yourself aren't effectively doing? Do you think there is work to be done in terms of better judgment and decision-making that other people aren't already doing?
I'm familiar with questions like these (specifically, from 80000 hours), and I think it's fair to say that I probably wouldn't make a substantive contribution to any field, those included. Given that ...
I agree. Getting downvoted feels bad man, no matter the reason.
Since people were pretty encouraging about the quest to do one's part to help humanity, I have a follow-up question. (Hope it's okay to post twice on the same open thread...)
Perhaps this is a false dichotomy. If so, just let me know. I'm basically wondering if it's more worthwhile to work on transitioning to alternative/renewable energy sources (i.e. we need to develop solar power or whatever else before all the oil and coal run out, and to avoid any potential disastrous climate change effects) or to work on changing human nature itself to better address ...
Yeah, that accurately describes their effect on me.
I used to be on Bupropion, but it had unpleasant physical effects on me (i.e. heart racing/pounding, which makes sense, given that it's stimulant-like) without any noticeable mood effects. I was quite disappointed, since a friend of mine said he practically had a manic episode on it. However, I took it in conjunction with an SNRI, so maybe that wouldn't have happened if I'd just taken it on its own.... Idk.
I'm actually surprised my psychiatrist hasn't recommended an MAOI to me in that case, since she freaks the hell out when I say I'm suicidal, and I've done so twice. I'll put MAOIs at the bottom of my aforementioned new to-do list. :)
Huh, interesting. Up-managing one's doctor seems frowned upon in our society -- since it usually comes in the form of asking one's doctor for medications mentioned in commercials -- but obviously your approach seems much more valid. Kind of irritating, though, that doctors don't appear to really be doing their job. :P
The exchange here has made me realize that I've actually been skipping my meds too often. Heh.... :\ So if I simply tighten that up, I will effectively increase my dosage. But if that doesn't prove to be enough, I'll go the route you've suggested. Thanks! :)
Ah, thanks. :)
EA? (Sorry to ask, but it's not in the Less Wrong jargon glossary and I haven't been here in a while.)
Parasite removal refers to removing literal parasites from people in the third world.
Oh. Yes. I think that's important too, and it actually pulls on my heart strings much more than existential risks that are potentially far in the future, but I would like to try to avoid hyperbolic discounting and try to focus on the most important issue facing humanity sans cognitive bias. But since human motivation isn't flawless, I may end up focusing on something more immediate. Not sure yet.
How cool, I've never heard of CFAR before. It looks awesome. I don't think I'm capable of making a lot of money, but I'll certainly look into CFAR.
Edit: I just realized that CFAR's logo is at the top of the site. Just never looked into it. I am not a smart man.
people basing morality on fiction.
Yes, and that seems truly damaging. I get the need to create conflict in fiction, but it seems to come always at the expense of technological progress, in a way I've never really understood. When I read Brave New World, I genuinely thought it truly was a "brave new world." So what if some guy was conceived naturally?? Why is that inherently superior? Sounds like status quo bias, if you ask me. Buncha Luddite propaganda.
I've actually been working on a pro-technology, anti-Luddite text-based game. Maybe working on it is in fact a good idea towards balancing out the propaganda and changing public opinion...
True, true. But it's still hard for me (and most people?) to circumvent that effect, even while I'm aware of it. I know Mother Teresa actually had a technique for it (to just think of one child rather than the millions in need). I guess I can try that. Any other suggestions?
Also, would you still want to save a drowning dog even if it might bite you out of fear and misunderstanding? (let's say it is a small dog and a bite would not be drastically injurious)
I'll pretend it's a cat since I don't ...
Well, true. All things shall pass.
Serious, non-rhetorical question: what's the basis of your preference? Anything more than just affinity for your species?
I'm not 100% sure what you mean by parasite removal... I guess you're referring to bad decision-makers, or bad decision-making processes? If so, I think existential risks are interlinked with parasite removal: the latter causes or at least hastens the former. Therefore, to truly address existential risks, you need to address parasite removal.
You're of course correct. I'm tempted to question the use of "better" (i.e. it's a matter of values and opinion as to whether it's "better" if humanity wipes itself out or not), but I think it's pretty fair to assume (as I believe utilitarians do) that less suffering is better, and theoretically less suffering would result from better decision-making and possibly from less climate change.
Thanks for this.
I assume you're talking about the facepalm-inducing decision-making? If so, that's a pretty morbid fascination. ;-)
Oh yeah, I'm not saying Spivak's Calculus doesn't provide good training in proofs. I really didn't even get far enough to tell whether it did or not, in which case, feel free to disregard my comment as uninformed. But to be more specific about my "not liking", I just found the part I did read to be more opaque than engaging or intriguing, as I've found other texts (like Strang's Linear Algebra, for instance).
Edit: Also, I'm specifically responding to statements that I thought were referring to liking the book in the enjoyment sense (expressed on this...
lol yeah, I know what you're talking about.
Okay okay, fine. ;-)
Yes. :)
But...
;-)
You should check out my response to one of the other comments--I think it's even more "yes, but"! I kind of see what you mean, but it sounds to me like just a way of saying "believe x or else" instead of giving an actual argument.
However, the ultimate conclusion is, I guess, just getting back on the horse and doing whatever I can to treat the dysthymia. I'm just like... ugh. :P But that's not very rational.
Thanks for the feedback.
It's a start, and potentially fewer side effects is always good, but think of it this way: who's going to gravitate towards rationality training? I would bet people who are already more rational than not (because it's irrational not to want to be more rational). Since participants are self-selected, a massive part of the population isn't going to bother with that stuff. There are similar issues ...