X4vier

Comments
X4vier

I hear you that teenagers spending hours computing hundreds of analytic derivatives or writing a bunch of essays is a pretty sad time waste... But if incentives shifted so that this time instead got spent perfecting a StarCraft build order against the AI a hundred times, or grinding for years to pull off carpetless star in SM64, this might be one of the few ways to make that time spent even more pointless... (And for most people the latter is no more fun than the former)

X4vier

You might be right that the concept only applies to specific subcultures (in my case, educated relatively well-off Australians).

Maybe another test could be - can you think of someone you've met in the past who a critic might describe as "rude/loud/obnoxious" but despite this, they seem to draw in lots of friends and you have a lot of fun whenever you hang out with them?

X4vier

Maybe an analogy which seems closer to the "real world" situation - let's say you and someone like Sam Altman both tried to start new companies. How much more time and starting capital do you think you'd need to have a better shot at success than him?

X4vier

Out of interest - if you had total control over OpenAI - what would you want them to do?

X4vier

I think OP is correct about cultural learning being the most important factor in explaining the large difference in intelligence between Homo sapiens and other animals.

In the early chapters of The Secret of Our Success, the book examines studies comparing the performance of young humans and young chimps on various cognitive tasks. It argues that across a broad array of cognitive tests, 4-year-old humans do not perform significantly better than 4-year-old chimps on average, except in cases where the task can be solved by imitating others (human children crushed the chimps when this was the case).

The book makes a very compelling argument that our species is uniquely prone to imitating others (even in the absence of causal models about why the behaviour we're imitating is useful), and that even very young humans have innate instincts for picking up on signals of prestige/competence in others and preferentially imitating those high-prestige people. Imo the arguments put forward in this book make cultural learning look like a much stronger theory than the Machiavellian intelligence hypothesis (although what actually happened at a lower level of abstraction probably includes aspects of both).

X4vier

If we expect there will be lots of intermediate steps - does this really change the analysis much?

How will we know once we've reached the point where there aren't many intermediate steps left before crossing a critical threshold? How do you expect everyone's behaviour to change once we do get close?

X4vier

It doesn't make sense to use the particular consumer's preferences to estimate the cruelty cost. If that's how we define the cruelty cost, then the buyer should already be taking it into account when making their purchasing decision, so it's not an externality.

The externality comes from the animals themselves having interests which the consumers aren't considering.
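One minimal way to formalise this (the notation here is mine, purely for illustration): suppose the consumer picks a quantity $q$ to maximise their own payoff, which already includes any personal distaste for cruelty, while the harm to the animals enters social welfare but not the consumer's objective:

$$\text{Consumer: } \max_q \; v(q) - p\,q - c_{\text{own}}(q), \qquad \text{Social: } W(q) = v(q) - c_{\text{prod}}(q) - c_{\text{own}}(q) - c_{\text{animal}}(q).$$

The $c_{\text{own}}$ term is already internalised in the purchasing decision; the uninternalised term is $c_{\text{animal}}$, so that's the cost a corrective price would need to reflect.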

Answer by X4vier

(This is addressed both to concerned_dad and to "John" who I hope will read this comment).

Hey John, I'm Xavier - hope you don't mind me giving this unsolicited advice, but I'd love to share my take on your situation, along with some personal anecdotes about myself and my friends (who I suspect had a lot in common with you when we were your age) which I think you might find useful. I bet you're often frustrated by adults thinking they know better than you, especially when most of the time they're clearly not as sharp as you are and don't seem to be thinking as deeply about things - I'm 30, and my IQ/raw cognitive horsepower is probably a little below that of the average user of this site - so I'll do my best not to fall into that same archetype.

First off John, I think it's really fucking cool that you're so interested in EA/LW, especially at such a young age - and this makes me think you're probably really smart, ambitious, overflowing with potential, and have a huge amount of love and concern for your fellow sentient beings. Concerned dad: on balance, the fact that your son has a passionate interest in rationality and effective altruism is probably a really great thing - he's really lucky to have a father who loves him and cares about him as much as you clearly do. Your son's interest in rationality+effective altruism suggests you've helped produce a young man who's extremely intelligent and has excellent values (even though right now they're being expressed in a concerning way which makes you understandably worried). I'm sure you care deeply about John leading a happy life in which he enjoys wonderful relationships, an illustrious career, and makes great contributions to society. In the long run, engaging with this community can offer huge support in service of that goal.

Before I say anything about stimulants/hallucinogens on the object level, John, there's a point I want to make on the meta level - something which I failed to properly appreciate until a full 9 years after I read the Sequences:

Meta Level

Imagine two people, Sam and Eric, a pair of friends who are both pretty smart 15-year-olds. Sam and Eric are about equally clever and share similar interests and opinions on most things, with the only major personality difference being that Sam has a bit more of an iconoclastic leaning, and is very willing to act on surprising conclusions when provided strong rational arguments in favor of them, while Eric is a bit more reluctant to do things which his parents/culture deem strange and has a higher respect for Chesterton fences.

One night Sam and Eric go to a party together filled with a bunch of slightly older university undergraduates who all seem really cool and smart, and they end up in a fascinating philosophical conversation, which could be about a great many different things.

Perhaps Sam and Eric ended up chatting with a group of socialists. People in the circle made a lot of really compelling points about the issues with wealth inequality - Sam and Eric both learn some really shocking facts about the level of wealth inequality in our society, and they hear contrasting anecdotes from different people who came from families of wildly different levels of wealth, who tell stories which make it clear that the disparity of advantage and opportunity and dignity they all experienced growing up was deeply unfair. Now I’m sure you, John, are already way too smart to fall for this (as may be the case for every example I’m about to give), but let’s imagine that Eric and Sam have never read something like I, Pencil, so don’t yet have a good level of industrial literacy or a deep grasp of how impossible it is to coordinate a modern society without the help of price signals. Thus, the pair walk away from this conversation knowing many compelling arguments in favor of communism - and the only reason they have so far to doubt becoming communists is a vague sense of “this seems extreme and responsible adults usually say it’s a really bad idea…”.

After this conversation, Eric updates a little more in favor of wealth redistribution, and Sam becomes a full-on Marxist. Ten years from now, which of the pair do you think will be doing better in life?

But maybe the conversation wasn’t about communism! Maybe it was all about how, even though most people think the purpose of school is to educate children, actually there’s very little evidence that getting kids to study more has much of an effect on lifetime income, and the real purpose of school is basically publicly funded babysitting. Upon realizing this, Eric updates a little bit towards not stressing too much about his English grades, but still gets out of bed and heads to school with the rest of his peers every day - while Sam totally gives up on caring about school at all and starts sleeping in, skipping class almost every day, and playing a shitload of DOTA2 in his room alone instead. If this was all you knew about Sam and Eric, ten years from now, which of the two would you expect to be doing better in their career/relationships?

Or perhaps the conversation is all about climate change - Sam and Eric are both exposed to a group of people who are very knowledgeable about ecology and climate systems, and who are all extremely worried about global warming. Eric updates slightly towards taking this issue seriously and supporting continued investment in clean energy technology, while Sam makes a massive update towards believing that the world in 30 years is likely to be nearly inhospitable, and resolves to never bother tying his money up in a retirement savings account and commits to never having children due to their CO2 footprint. Again, 10 years from now, I think Eric's reluctance to make powerful updates away from what’s “normal” will leave him in a better position than Sam.

Or maybe the conversation was about traditional norms surrounding marriage/monogamy. Sam and Eric are both in great relationships, but now, for the first time, are exposed to a new and exciting perspective which asks questions like 

  • “Why should one person have the right to tell another person who she/he can and can’t sleep with?”
  • “If I love my girlfriend, why shouldn’t I feel happy for her when she’s enjoying another partner rather than jealous?”
  • “Think about the beautiful feelings we experience being with our current partners, imagine how amazing it would be to multiply that feeling by many similar concurrent romantic relationships!”

Eric hears all this, finds it pretty interesting/compelling, but decides that the whole polyamory thing still feels a bit unusual, and marries his wonderful childhood sweetheart anyway, and they buy a beautiful house together. Sam, on the other hand, makes a strong update in favor of polyamory - convinces his girlfriend that they ought to try an open relationship, and then ends up experiencing a horrific amount of jealousy/rage when his girlfriend starts a new relationship with another of his friends, eventually leading to immense suffering and the loss of many previously great relationships.

Maybe they chatted about decentralized finance, and while Eric still kept 80% of his money in a diversified index fund, Sam got really into liquidity pooling+yield farming inflationary crypto tokens while hedging against price fluctuations using perpetual futures on FTX.

Maybe it was a chat about having an attractive physique - Eric starts exercising a little extra and eating a bit less junk food, whilst Sam completely stops eating his parents’ cooking, orders a shitload of pre-workout formula from overseas with a possibly dishonest ingredients list, starts hitting the gym 5 times a week, obsessively measures his arms with a tape measure, feels ashamed not to be as big as Chris Hemsworth, and sets alarms for 3am so that he’s able to force more blended chicken breast down his throat in the middle of the night.
 

Maybe it’s a chat about how group living actually makes a lot of sense and enables lots of economies of scale/gains from trade. Eric resolves to try out a 4-person group house when he moves out of his parents’ place, whilst Sam convinces a heap of friends to move out and start a 12-person house next month (which is predictably filthy, overrun with interpersonal drama, and leads to the share house eventually dissolving and everyone leaving on less-than-friendly terms).

Maybe they thought deeply about whether money really makes you happy beyond a certain level or not, and then upon reflection, Eric did a Google summer internship anyway while Sam didn’t bother to apply.

Or maybe the conversation was about one of countless other topics where thinking too much for yourself can be extremely dangerous! Especially for a sixteen-year-old.
 

I know it’s unfair for me to only write stories where Eric wins and Sam loses - and there are definitely some occasions where that’s not true! Sometimes Eric does waste time studying for a test that doesn’t matter, maybe Eric would have got better results in the gym if he’d started on creatine sooner, maybe he should have taken Sam’s advice to bet more money on Biden winning the 2024 US election - but when Eric messes up by following the cultural wisdom too closely, it’s never a total disaster. In the worst case, Eric still ends up moderately happy and moderately successful, but when Sam makes a mistake in the opposite direction, the downsides can be catastrophic.
 

Every single one of those anecdotes maps directly onto a real thing that’s actually happened to me or my partner or one of our LW-adjacent friends between the ages of 15 and 30.
 

John, just because you are smarter and better able to argue than the vast majority of people living within a culture, that doesn’t mean you’re smarter than the aggregated package of norms, taboos and cultural wisdom which has evolved around you (even if most of the time nobody can clearly justify them). If you haven’t read The Secret of Our Success yet, you should definitely check it out! It makes this point in ruthlessly convincing fashion.

The midwit meme format is popular for a reason - the world is filled with intellectual traps for smart people to fall into when they're not wise enough to give the appropriate credit to "common sense" over their own reasoning on every single question.

Object Level

When faced with a situation similar to yours, what do we think Sam/Eric might each do?

Eric would perhaps start taking 100-300mg of caffeine each day (setting strict upper limits on usage), or even start cautiously experimenting with chewing a couple milligrams worth of nicotine gum on days when he has heaps of study to do.

Sam, on the other hand, might google the diagnostic criteria for ADHD and lie to a psychiatrist in order to obtain an illegitimate Adderall prescription.

I know this is only anecdotal, but I've witnessed this exact situation play out multiple times among my close friends, and each time dexamphetamine use has come just a little before disastrous outcomes (which I can't prove are linked to drug abuse, but it's very plausible).

Once you're 18 years old your dad has no right to control your behaviour, but nonetheless, the support he's able to offer could still be hugely valuable to you for decades to come, so I'm sure there is a massive space of mutually beneficial agreements you could come to involving you promising not to start using illegal/prescription drugs.

John and Concerned Dad, I'd love to chat more about this with either of you (and offer an un-anonymised version of literally all these anecdotes) - please feel free to send me a private message.

X4vier

For the final bet (or the induction base for a finite sequence), one cannot pick an amount without knowing the zero-point on the utility curve.

I'm a little confused about what you mean, sorry -

What's wrong with this example?: 

It's time for the final bet, I have $100 and my utility is $u(w) = \log(w)$.

I have the opportunity to bet on a coin which lands heads with probability $\frac{3}{4}$, at $1:1$ odds.

If I bet $x$ on heads, then my expected utility is $\frac{3}{4}\log(100+x) + \frac{1}{4}\log(100-x)$, which is maximized when $x = 50$.

So I decide to bet 50 dollars.
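Spelling that maximization out, the first-order condition is

$$\frac{d}{dx}\left[\tfrac{3}{4}\log(100+x)+\tfrac{1}{4}\log(100-x)\right]=\frac{3}{4(100+x)}-\frac{1}{4(100-x)}=0\;\Rightarrow\;3(100-x)=100+x\;\Rightarrow\;x=50,$$

i.e. betting half of my current $100 bankroll.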

What am I missing here?

X4vier

As far as I can tell, the fact that you only ever control a very small proportion of the total wealth in the universe isn't something we need to consider here.

No matter what your wealth is, someone with log utility will treat the prospect of doubling their money as exactly as good as having their wealth cut in half would be bad, right?
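Concretely, for $u(w) = \log(w)$:

$$u(2w) - u(w) = \log 2 = u(w) - u(w/2),$$

so the utility gained by doubling equals the utility lost by halving, whatever the current wealth $w$ is.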
