Recent interviews with Eliezer:
The bug patches / epiphanies / tortoises / wizardry square from Small, Consistent Effort: Uncharted Waters In the Art of Rationality
The nanobots, from the bloodstream, in the parlor, Professor Plum.
You could have written Colonel Mustard!
Figure out why it's important to you that your romantic partner agree with you on this. Does your relationship require agreement on all factual questions? Are you contemplating any big life changes because of x-risk that she won't be on board with?
Would you be happy if your partner fully understood your worries but didn't share them? If so, maybe focus on sharing your thoughts, feelings, and uncertainties around x-risk in addition to your reasoning.
I tried a couple other debates with GPT-4, and they both ended up at "A, nevertheless B" vs. "B, nevertheless A".
I like your upper bound. The way I'd put it is: If you buy $1 of Microsoft stock, the most impact that can have is if Microsoft sells it to you, in which case Microsoft gets one more dollar to invest in AI today.
And Microsoft won't spend the whole dollar on AI, though they'd plausibly spend most of a marginal dollar on AI even if they don't spend most of the average dollar on it.
I'm not sure what to make of the fact that Microsoft is buying back stock. I'd guess it doesn't make a difference either way? Perhaps if they were going to buy back $X worth of shares but then you offer to buy $1 of shares from them at market price, they'd buy back $X and sell you $1 for a net buyback of $(X-1), and you still have an impact of $1.
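The net-buyback arithmetic can be sketched like this (the $1000 buyback figure is an arbitrary illustration, not a real Microsoft number):

```python
# Illustrative sketch of the net-buyback arithmetic; all numbers are made up.
planned_buyback = 1000.0  # $X the company planned to buy back
your_purchase = 1.0       # you buy $1 of shares at market price

# They buy back $X and sell you $1, so the net buyback is $(X - 1)...
net_buyback = planned_buyback - your_purchase

# ...and your $1 still ends up in the company's hands.
dollars_to_company = your_purchase
```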
I like the idea that buying stock only has a temporary effect on price. If the stock price is determined by institutional investors that take positions on the price, then maybe when you buy $1 of stock, these investors correct the price immediately, and the overall effect is to give those investors $1, which is ethically neutral? James_Miller makes this point here. But I'd like to have a better understanding of where the boundary lies between tiny investors who have zero impact and big investors who have all the impact.
Or maybe the effect of buying $1 of stock is giving $1 to early Microsoft investors and employees? The ethics of that are debatable since the early investors didn't know they were funding an AGI lab.
What's more, even selfish agents with de dicto identical utility functions can trade: If I have two right shoes and you have two left shoes, we'd trade one shoe for another because of decreasing marginal utility.
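A minimal sketch of that trade, assuming a shoe is only useful as half of a pair (so each agent's utility is their number of complete pairs, min(left, right), which makes the second shoe on the same foot worth nothing at the margin):

```python
# Sketch of the shoe-trade example: utility = number of complete pairs.
def utility(left: int, right: int) -> int:
    return min(left, right)

# Before the trade: I hold two right shoes, you hold two left shoes.
mine_before = utility(left=0, right=2)   # 0 pairs
yours_before = utility(left=2, right=0)  # 0 pairs

# Trade one of my right shoes for one of your left shoes.
mine_after = utility(left=1, right=1)    # 1 pair
yours_after = utility(left=1, right=1)   # 1 pair

# Both selfish agents are strictly better off.
assert mine_after > mine_before and yours_after > yours_before
```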