I am a transhumanist and a futurist, but I've been feeling depressed lately while thinking about the far future. This rarely happens. I found myself scared of getting smarter through a Singularity-like event, and also scared by the arbitrariness of our goals and values. Simply put, I don't fit into the present. I'm theorizing about intelligence, reading scientific papers, and participating very modestly in the brony fandom. I've made it my life's goal to make major steps toward safe AGI. Looking past that point, I see only aimlessness. Aside from my one creative technological skill, I am mainly a consumer. That leads to my concern about getting smarter.

I mostly read stories, take in stories, participate in stories. Stories are my life, and I want to still be able to appreciate the stories we have now in the future. I'm concerned that upgrading to transfuturist levels of intelligence will, for many good reasons, make the kinds of stories we have now incredibly banal and obvious: predictable, boring, and worthless.

This isn't really a question, but I'd appreciate any other perspectives. Please?

At least you probably won't feel too bad about present-day stories no longer feeling compelling at the point where they no longer feel compelling. Compare the horror a three-year-old you would have felt at the idea of no longer liking their favorite picture book with your current feelings about no longer finding that same picture book very interesting. That's not quite enough, though: people also probably won't feel bad about being wireheaded once they have been wireheaded, and they don't feel bad about being in a coma while they're in a coma...

Luke_A_Somers · 7y · 2 points

As bronies, we already enjoy things that are pretty danged obvious. I mean, how many episodes could you not call the ending of at the 14-minute mark? That doesn't mean there aren't good things about them. Obvious does not immediately mean banal. If you're paying attention, you can predict nearly note for note the last quarter or so of many sonatas (hint: repeats used to be really popular). That doesn't make them banal, any more than listening to them a second time does, or listening to them after you've already familiarized yourself with them.

Another question: are you reading Friendship is Optimal and its derived works? That's not Eutopia. CelestAI is a cosmic screwup.
Pentashagon · 7y · 0 points

Most characters in stories suffer from the same problems current humans do. Why not write far better endings (and interludes, for that matter) for all the stories? I think you are correct that in 100 years we will find the tragic tale of Romeo and Juliet nothing but appalling ("They died??!?"). So we'll probably fix the stories in ways that are quite satisfying to transhumans, just as we plan to fix the rest of reality. And there will almost certainly be new, better, amazing stories.

Have you read the Fun Theory sequence [http://lesswrong.com/lw/xy/the_fun_theory_sequence/]? It deals directly with the question of what to do with potentially boundless time, space, and intelligence.

More "Stupid" Questions

by NancyLebovitz · 1 min read · 31st Jul 2013 · 498 comments

This is a thread where people can ask questions that they would ordinarily feel embarrassed for not knowing the answer to. The previous "stupid" questions thread went to over 800 comments in two and a half weeks, so I think it's time for a new one.