There is a lot of attention being given to AI software progress, potential regulation, and so on. However, articles such as this one, celebrating potential progress toward a step change in AI capability, seem to get little attention and a free pass in the press as uncontroversial.

As someone who believes a slow takeoff is likely (though not with enough certainty to be comfortable), I would like to see more emphasis placed on the potential disruption from hardware advancements. I contend that with our current production capacity of roughly 50 million GPUs per year, and given their current specifications, it is unlikely that they could pose an existential risk, regardless of the level of software progress. If this is the case, then slowing or halting the development of more advanced chip factories could be a straightforward way to mitigate such a threat.

From this perspective, potential progress in new hardware could be regarded more akin to advancements in biological, chemical, or sophisticated weaponry technologies, rather than just another feel-good tech story. Even commenting on such articles could have a measurable impact if done consistently and effectively.

Ilio:

I like the idea of putting GPU factories under the same kind of scrutiny as nuclear fuel, but I’m not sure the best move toward this is to link it with ideas of slow takeoff, or even with the rationalist community at all.

Two small points I don’t get: why a ban on research rather than a ban on who’s certified to buy and run GPUs and the like? And why do you mention biological and chemical weapons (fields that seem less properly secured than what the IAEA does) rather than nuclear? I suspect in both cases these are just second-rank concerns that didn’t get your attention, but I’m checking in case there’s more to these points.

OK, I linked the GPU factories with slow takeoff because if the current amount of compute in the wild already poses an X-risk under fast takeoff, then restricting hardware won't necessarily help.

I suggest slowing down research because it's the most upstream effective technique. If the research isn't done, then it's less likely a secret factory gets built. No factory means no way for certification to fail, and no political interference in certification.

I mentioned bio/chemical weapons just as examples of tech advances that are not cheered on, nothing else significant.

Ilio:

OK, now the link is clear to me, thanks! And OK on the « just as examples » point as well. I'm still not sure research is the most efficient target for your purpose, but I don't have a good model for this, so that's not saying much.