Neel Nanda


The Point of Trade

My guess for missing things:

Economies of scale - it's probably cheaper, per kilogram of steel produced, to make a lot of steel from a lot of iron than to make a little steel from a little iron. So you want raw materials to be concentrated.

Diminishing marginal returns - so this pushes towards a uniform distribution of everything
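The economies-of-scale point can be made concrete with a toy model (the numbers and cost function here are entirely made up for illustration): if production has any fixed setup cost on top of a per-kg cost, the average cost per kg of steel falls as the batch grows, which rewards concentrating iron in one place.

```python
def avg_cost_per_kg(kg, fixed_cost=1000.0, marginal_cost=0.5):
    """Average production cost per kg for a batch of `kg` kilograms.

    Hypothetical cost model: one fixed setup cost per batch plus a
    constant marginal cost per kg.
    """
    return (fixed_cost + marginal_cost * kg) / kg

# A small batch pays the fixed cost over few kilograms...
small = avg_cost_per_kg(100)      # (1000 + 50) / 100 = 10.5 per kg
# ...while a large batch amortizes it almost to the marginal cost.
large = avg_cost_per_kg(10_000)   # (1000 + 5000) / 10000 = 0.6 per kg
```

Diminishing marginal returns pull the other way: the more widely a good is spread, the more value each marginal unit adds at the places that have little of it.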

We need a standard set of community advice for how to financially prepare for AGI
But the global chip shortage means semiconductor foundries like Taiwan Semiconductor Manufacturing Co. are already scrambling to fill other orders. They are also cautious about adding new capacity given how finicky crypto demand has proven to be. Bernstein estimates that crypto will only contribute about 1% of TSMC’s revenue this year, versus around 10% in the first half of 2018 during the last crypto boom.

Looking at the WSJ source, it's actually arguing that Bitcoin mining wasn't a big cause of the global chip shortage, and that the 1% figure is a low point, down from around 10% during the last crypto boom.

Still less than I'd expected, but 10% seems plausibly enough to significantly boost profits?

We need a standard set of community advice for how to financially prepare for AGI
I think Vicarious AI is doing more AGI-relevant work than anyone

Interesting, can you say more about this/point me to any good resources on their work? I never hear about Vicarious in AI discussions

We need a standard set of community advice for how to financially prepare for AGI

One approach that feels a bit more direct is investing in semiconductor stocks. If we expect AGI to be a big deal and massively economically relevant, it seems likely that this will involve vast amounts of compute, and thus need a lot of computer chips. I believe ASML (Netherlands based, which makes the lithography machines used to fabricate chips) and TSMC (Taiwan based, the largest chip manufacturer) are two of the largest publicly traded companies in the semiconductor supply chain, though I'm unsure which countries let you easily invest in them.

Problems with this:

  • A bunch of their current business comes from crypto-mining, so this also carries some crypto exposure. The stocks have done well over the last few years, and I believe this is more from the crypto boom than the AI boom
  • TSMC is based in Taiwan, and thus is exposed to Taiwan-China problems
  • This assumes AGI will require a lot of compute (which I personally believe, but YMMV)
  • It's unclear how much of the value of AGI will be captured by semiconductor manufacturers
The Alignment Forum should have more transparent membership standards

A similar bug - when I go to the AF, the top right says Log In, then has a Sign Up option, and leads me through the standard sign-up process. Given that it's invite-only, it seems like it should tell people this and redirect them to make a LW account?

The Alignment Forum should have more transparent membership standards
I agree that most people don't read the manual, but I think that if you're confused about something and then don't read the manual, it's on you.

I think responsibility is the wrong framing here? There are empirical questions of 'what proportion of users will try engaging with the software?', 'how many users will feel confused?', 'how many users will be frustrated and quit/leave with a bad impression?'. I think the Alignment Forum should be (in part) designed with these questions in mind. If there's a post on the front page that people 'could' think to read, but in practice don't, then I think this matters.

I also don't think they could make it much more obvious than being always on the front page.

I disagree. I think the right way to do user interfaces is to present the relevant information to the user at the appropriate time. Eg, when they try to sign up, give a pop-up explaining how that process works (or linking to the relevant part of the FAQ). Ditto when they try making a comment, or making a post. I expect this would expose many more users to the right information at the right time, rather than needing them to think to look at the stickied post and filter through for the information they want.

The Alignment Forum should have more transparent membership standards

I think most people just don't read the manual? And I think good user interfaces don't assume they do

Speaking personally, I'm an alignment forum member, read a bunch of posts on there, but never even noticed that post existed

The Alignment Forum should have more transparent membership standards

Hmm, fair point. I feel concerned about how illegible that is, though, especially to an academic outsider who wants to engage but lacks context on LW. Eg, I've been using AF for a while and wasn't aware that comments were regularly promoted from LW. And if we're talking about perception of the field, I think surface-level impressions like this are super important

The Alignment Forum should have more transparent membership standards

And the field overall also has vastly more of its discussion public than almost any academic field I can think of and can easily be responded to by researchers from a broad variety of fields

What do you mean by this? I imagine the default experience of a researcher who wants to respond to some research, but has minimal prior exposure to the community, is to be linked to the Alignment Forum, try to comment, and not be able to. I expect commenting on LessWrong to be non-obvious as a thing to do, and to feel low-status/not like having a real academic discussion

What is the Risk of Long Covid after Vaccination?

The risk of death from covid after vaccination is near zero and this seems to be the case despite the variants

This seems to be true, but this doesn't obviously imply the risk of long COVID is significantly decreased. As far as I'm aware, no one has really studied this. On priors I'd guess that vaccines help a bunch, but I don't understand what's going on here very well.

And I think this is an important question: long COVID seems to represent a lot of the harm of COVID to young people. If case rates in your area aren't that low, this definitely seems like a valid question to ask
