All of nazgulnarsil3's Comments + Replies

the likely result is that pundits would start taking more care to make their predictions untestable.

this is already the norm 1) make qualitative prediction 2) reject criticism with "no true scotsman" fallacy (x wasn't really an example of y because z)

the general public being unaware of the fact that stock prices are an equilibrium of beliefs about whether the stock will rise or fall is not a major cause for concern.

AA is willing to pay in order to achieve a more egalitarian outcome. in other words: AA is willing to pay money in order to force others to be more like him.

a desire to change the payoff matrix itself is my point: one monkey gets the banana and the other monkey cries justice. justice is formalized fairness. I can easily envision that AA would also pay in order to alter the payoff matrix.

So let's set up another trial of this with an added meta dilemma: in each case the disadvantaged member of the trial can forfeit another 5 points in order to alter the pa...

god damn communists. always on about income inequality instead of trying to maximize the amount everyone gets. I always refer to Mind the Gap by Paul Graham in these cases.

Richard that can be described as near/far. also I'm not sure the cynic/idealist is the correct dichotomy, as cynicism seems a form of idealism. idealist/realist optimist/cynic ?

I regretted posting the original comment immediately but felt like your comment "maybe this is why africa stays poor" was kind of a pandora's box for this sort of thing.

all discussions lead inexorably towards ever more fundamental issues until eventually you're talking about axiomatic beliefs. This seems to fall in line with the idea that either you have different priors or one of you has made a mistake. Since this is a community of intelligent commenters it follows that most disagreements are probably due to different core values/assumptions.

Bu...

it's disingenuous to blame NASA, as if we couldn't afford both!

the point here is that the money that the government spends is 100% wasted on these things, not that we should find ways to pay for more stuff. I don't support government spending at all. when I talk about environmentalism I'm talking about the government whipping people into a frenzy in order to justify ridiculous schemes that private enterprise would never support. If there was less taxation and people were rational about picking charities to reduce overall suffering micronutrient and clea...

Do you feel the same indignation toward spending on, say, NASA?

of course. environmentalism is just the latest in a long string of justifications for government subsidy. NASA is another great example of breathtaking levels of waste.

A big part of the reason Africa stays poor is because nutrition and education is so poor that sub-Saharan IQs average about 70. Environmentalism pisses me off because for a fraction of what we are spending on the public hysteria we could be providing micro nutrients that would lead to huge decreases in overall suffering. Ditto with providing clean water.

What the hell is green tech? Is it just more efficient tech? Or does it have less to do with the technology and more to do with economic agents acknowledging externalities, consciously choosing to internalize some of that cost?

I'll take that as an analogy for what it means to be a moral person. (It's another way of talking about Kant's Categorical Imperative.)

terrifying freedom

I believe this is one of the prime motivators for religion, conspiracy theories, and all other manner of hidden organization schemes. the thought that this is literally IT and no one will judge the wicked, no one is guiding the leviathan, no one will care if you make a stupid mistake and it costs you your life.

"The cold, suffocating dark goes on forever and we are alone. Live our lives, lacking anything better to do. Devise reason later. Born from oblivion; bear children, hell-bound as ourselves, go into oblivion. There is nothing e...

Huh, I was unaware that the whole concept of spandrels had originated with Gould. Point taken; perhaps one can reinterpret seemingly random noise as being itself an adaptation that overcomes simple hill climbing. Mutations themselves are a random walk, but selection is not random. Environment acts as a hill, organisms as hill climbing algorithms, with the top of the hill being maximally efficient use of resources for reproduction. Is this correct?
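The picture in that comment (random mutation plus non-random selection on a fitness "hill") can be sketched as a toy hill climber; the single-peaked fitness function, step size, and iteration count here are all made up for illustration:

```python
import random

random.seed(0)

def fitness(x):
    # a made-up single-peaked "hill"; the peak sits at x = 3.0
    return -(x - 3.0) ** 2

x = 0.0  # starting phenotype
for _ in range(2000):
    mutant = x + random.gauss(0, 0.1)  # mutation: a small random step
    if fitness(mutant) >= fitness(x):  # selection: keep only non-worse variants
        x = mutant

print(x)  # ends up close to the peak at 3.0
```

The mutations are pure random walk; only the acceptance step (selection) is directional, which is the distinction the comment is drawing.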

we have X because it increased inclusive genetic fitness, full stop.

if evolutionary psychologists actually believe this it is a good example of why they aren't taken very seriously. what about spandrels?

yes, the easiest way to spot scientism is to look for value statements being conflated with factual statements. This is done unintentionally in many cases, the persuaders can't help it because they can't distinguish between the two. 1) you falsify the data that someone thought was factual that they used to support their values. They take this as an attack on said values. 2) you point out errors in the train of logic between factual statements and values, and/or point out that there is no valid logic train between their values and facts. 3) you make a fac...

three worlds collide would make a decent movie...just have to make the reasoning of the characters more explicit for people unfamiliar with concepts involved.

scientists fight over the division of money that has been block-allocated by governments and foundations. I should write about this later.

yes you should. this is a very serious issue. in art the artist caters to his patron. the more I see of the world of research in the U.S. the more I am disturbed by the common source of the vast majority of funding. science is being tailored and politicized.

if the SHs find humans via another colony world blowing up earth is still an option. I don't believe the SHs could have been bargained with. They showed no inclination towards compromise in any other sense than whichever one they have calculated as optimal based on their understanding of humans and babyeaters. Because the SHs don't seem to value the freedom to make sub-optimal choices (free will) they may also worry much less about making incorrect choices based on imperfect information (this is the only rational reason I can come up with for them wantin...

keeping the signal to noise ratio high in a community is easy. Just make sure to write long detailed posts about obtuse subjects (we have that covered) and don't respond to trolls. Any commoner that stumbles upon it will get bored and leave. This seems to have worked with Hacker News so far.

with regards to the Steve Jobs Quote: Democracy is the theory that the common people know what they want and deserve to get it good and hard. - H.L. Mencken

but an eden with a reversible escape option is surely better than an eden with a non-reversible escape option yes?

Most religions believe that the escape option is reversible - otherwise there wouldn't be much point.

ZM: I'm not saying that the outcome wouldn't be bad from the perspective of current values, I'm saying that it would serve to lessen the blow of sudden transition. The knowledge that they can get back together again in a couple decades seems like it would placate most. And I disagree that people would cease wanting to see each other. They might prefer their new environment, but they would still want to visit each other. Even if Food A tastes better in every dimension to Food B I'll probably want to eat Food B every once in awhile.

James: Considering the...

am I missing something here? What is bad about this scenario? the genie himself said it will only be a few decades before women and men can be reunited if they choose. what's a few decades?

A few decades with superstimulus-women around for the men, and superstimulus-men for the women? I don't expect that reunification to happen.

Although that doesn't in any way say that there's anything bad about this scenario. cough

EDIT: it would be bad if they didn't manage to get rid of the genie; then humanity would be stuck in this optimised-but-not-optimal state forever. As it is, it's a step forward if only because people won't age any more.

This story would be more disturbing if the 90% threshold was in fact never reached, as more and more people chang...

rw: methods of short circuiting the sex drive fall into two categories. the first would be controlling sensory input (holodecks/virtual reality and/or cyborgs). the second is bypassing the senses and directly messing with the brain itself via implants or genetic manipulation.

the second type is more prone to unintended consequences than the first.

Our drive to do better than our neighbor is a deeply ingrained metric of how we judge ourselves. In essence we recognize that our own assessment is biased and look for cues from others. Eliminating this seems like eliminating part of the foundation of a social species.

I think you're being remarkably binary about this. I think it more realistic that non-sentient sexdroids will enable healthier relationships. When people get the urge to procreate with fitter partners they can just spend an afternoon in the holodeck. I see what you're saying as advocating keeping people a little hungry so that they appreciate food more.

I thought a big part of the appeal of the super villain fantasy wasn't your standard of living but in comparative standard of living. It's boring if everyone has a volcano lair. People want a doomsday weapon so that they are feared and respected.

I don't know. I think that a major appeal of Minecraft is the pressure toward making volcano lairs. (in a game where many people can have them). It may have some aspects of 'I have a cooler house than you', but I don't think it's JUST about that. For example, note how hard it can be to be creative with your living space unless you have a ton of money, and how much McMansions are loathed by people who would prefer one to their own house on strictly hedonic issues.

an investment earning 2% annual interest for 12,000 years adds up to a googol (10^100) times as much wealth.

no it adds up to a googol of economic units. in all likelihood the actual wealth that the investment represents will stay roughly the same or grow and shrink within fairly small margins.
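For what it's worth, the arithmetic in the quoted claim holds up regardless of what the units end up representing: at 2% compounded annually for 12,000 years the growth factor does exceed a googol. A quick check (rate and horizon taken from the quote):

```python
from math import log10

rate, years = 0.02, 12_000
log10_factor = years * log10(1 + rate)  # log10 of 1.02 ** 12000
print(round(log10_factor, 1))  # 103.2, i.e. a factor of ~10^103 > 10^100
```

The dispute in the reply above is about whether those units track real wealth, not about the exponent.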

it seems you conclude with an either/or on subjective experience improvement and brain tinkering. I think it more likely that we will improve our subjective experience up to a certain point of feasibility and then start with the brain tinkering. Some will clock-out...

so would you be for or against an AI that inserted us into an experience machine programmed to provide a life of maximum self expression without our knowledge?

the value of this is most easily demonstrated in daydream scenarios. I'm guessing that other people, like me, find themselves going through some of the same fantasies time and time again, whether they be about wealth, sex, prestige or whatever else. A few days ago I banished all these familiar fantasies and spent some time thinking up new ones. Not only was it a wonderfully fun exercise, it seemed to increase my creativity when doing other activities throughout the day.

the difference between reality and this hypothetical scenario is where control resides. I take no issue with the decentralized future roulette we are playing when we have this or that kid with this or that person. all my study of economics and natural selection indicates that such decentralized methods are self-correcting. in this scenario we approach the point where the future cone could have this or that bit snuffed by the decision of a singleton (or a functional equivalent), advocating that this sort of thing be slowed down so that we can weigh the decisions carefully seems prudent. isn't this sort of the main thrust of the friendly AI debate?

what effect would it have on the point

if rewinding is morally unacceptable (erasing could-have-been sentients) and you have unlimited power to direct the future, does this mean that all the could-have-beens from futures you didn't select are on your shoulders? This is directly related to another recent post. If I choose a future with fewer sentients who have a higher standard of living, am I responsible for the sentients that would have existed in a future where I chose to let a higher number of them be created? If you're a utilitarian this is the delicate...

No, the theft problem is much easier than the aggregate problem. If the only thing in our power to change is the one man's behavior, we probably would allow the man to steal. It's worse to let his family die. But if we start trying to let everyone steal whenever they can't afford things, this would collapse our economy and soon mean there weren't enough goods to even steal. So if it's within our power to change the whole system, we wouldn't let the man steal---instead we would eliminate poverty so that no one ever has to steal. This is obviously the optimal long-run large-scale decision, and the trick is really getting there from here (the goal is essentially undisputed). The aggregate problem is a whole lot harder, because the goals themselves are in dispute. Which world is better, a world of 1,000 ultimately happy people, or a world of 1 billion people whose lives are just barely worth living?

Actually it sounds pretty unlikely to me, considering the laws of thermodynamics as far as I know them.

you can make entropy run in reverse in one area as long as a compensating amount of entropy is generated somewhere within the system. what do you think a refrigerator is? what if the extra entropy that needs to be generated in order to rewind is shunted off to some distant corner of the universe that doesn't affect the area you are worried about? I'm not talking about literally making time go in reverse. You can achieve what is functionally the same thing by reversing all the atomic reactions within a volume and shunting the entropy generated by the energy you used to do this to some other area.

I think it's worth noting that truly unlimited power means being able to undo anything. But is it wrong to rewind when things go south? if you rewind far enough you'll be erasing lives and conjuring up new different ones. Is rewinding back to before an AI explodes into a zillion copies morally equivalent to destroying them in this direction of time? unlimited power is unlimited ability to direct the future. Are the lives on every path you don't choose "on your shoulders" so to speak?

I'm pretty sure that "rewinding" is different to choosing now not to create lives.
I often think about a rewound reality, where the only difference is the data in my brain... and the biggest problem I have with this is all the people that are born after the time I'd go back to that I don't want to unmake. Of course, my attention span is terrible, so I never follow one of these long enough or thorough enough to simulate how I'd try to avert such issues... then chaos theory would screw it up in spite of all that. The point is that I concur.
A superintelligent AI doesn't have truly unlimited power. It can't even violate the laws of physics, let alone morality. If your moral system says that death is inherently bad, then undoing the creation of a child is bad.
It does seem intuitively right to say that killing something already existing is worse than not creating it in the first place. (Though, formalizing this intuition is murder. Literally.)

Or should we be content to have the galaxy be 0.1% eudaimonia and 99.9% cheesecake?

given that the vast majority of possible futures are significantly worse than this, I would be pretty happy with this outcome. but what happens when we've filled the universe? much like the board game risk, your attitude towards your so called allies will abruptly change once the two of you are the only ones left.

If the universe is open, we won't ever run out of space! The infinite future and infinite space raise plenty of other problems of their own, but I think it's interesting that they actually do solve this one.

Peter: if your change of utility functions is of domain rather than degree you can't calculate the negative utility. the difference in utility between making 25 paperclips a day and 500 a day is a calculable difference for a paperclip maximizing optimization process.

however, if the paperclip optimizer self-modifies and inadvertently changes his utility function to maximizing staples....well you can't calculate paperclips in terms of staples. This outcome is of infinite negative utility from the perspective of the paperclip maximizer. And vice-versa. On...

I should have specified a domain change. a modification that varies your utility function by degree has a calculable negative utility.

I think that an empirical approach to self-modification would quickly become prominent. alter one variable and test it, with a self imposed timeout clause. the problem is that this does not apply to one sort of change: a change in utility function. an inadvertent change of utility function is extremely dangerous, because changing your utility function is of infinite negative utility by the standards of your current utility, and vice-versa.

frelkins: in that vein what if we could flip the switch in the brain that usually only flips when you are sleeping with a new partner? isn't this half of humanity's sex problems gone in one shot? it seems to me that the realm of sex is the one in which it is most obvious that desires shaped by natural selection are not in line with actual happiness and fulfillment.

caledonian: I agree. if we develop some sort of virtual reality that can provide any desire, we'll just be selecting for people who don't go in and never come out. If so the future will be populated by people who refuse such self gratification.

what's more fun? a holodeck that you have complete control over? or a holodeck with built in constraints?

playing god might be fun for a while, but I think everyone would eventually switch over to programs with built in constraints to challenge themselves. the profession of highest prestige will probably be writing really, really good holodeck programs.

yeah, I did. Only because I see political machinations as far more dangerous than the problems happiness studies solve.

as a preference utilitarian I dislike happiness studies. they're much too easy to use as justification for social engineering schemes.

Shulman: hmm, true. alright. fission reactor with enough uranium to power everything for several lifetimes (whatever my lifetime is at that point) and accelerate the asteroid up to relativistic speeds. aim the ship out of the galactic plane. the energy required to catch up with me will make it unprofitable to do so.

Carl Shulman: that is why I will create a solar powered holodeck with built in replicator, and launch myself into deep space attached to an asteroid with enough elements for the replicator.

rest of humanity can go to hell.

maximized freedom with the constraint of zero violence. violence will always exist as long as there is scarcity, so holodecks + replicators will save humanity.

the most important adaptation an ideology can make to improve its inclusive fitness for consumption by the human brain is to

  1. refrain from making falsifiable claims
  2. convince its followers to aggressively expand

1 is accomplished by making the ideology rest on a priori claims. everything that rests on top of that claim can be perfectly logical given the premise. since most people don't examine their beliefs axiomatically, few will question the premise as long as they are provided the bare minimum of comfort. 2 is accomplished by activating the "mor...

I like to think of life as a Pe^(rt) equation: P = you and your skills; r = your investment ability/luck; t = time (invest early, take advantage of tax laws).

I hope I live to see a world where synchronous computing is considered a quaint artifact of the dawn of computers. cognitive bias has prevented us from seeing the full extent of what can be done with this computing thing. a limit on feasible computability (limited by our own brain capacity) that has existed for millions of years, shaping the way we assume we can solve problems in our world, is suddenly gone. we've made remarkable progress in a short time, I can't wait to see what happens next.

in the course of natural selection, conformity to social values took on a much higher priority than the truth. especially for women who are vulnerable and must adapt to please whichever males are in charge at the time. confronting the average person with the truth is a waste of time. they place a higher priority on social status. If you live in a primarily Christian community don't expect anyone to listen, they would lose status by seriously considering your doubts.

I tend to think aliens shaped by natural selection will exhibit many of the same neurological adaptations that we do.

how much will you be charging for bar mitzvahs?

If humanity was forced to choose a simple optimization process to submit itself to I think capitalism would be our best bet.
