Moore's Law, AI, and the pace of progress

Nice article.  Good that you spotted the DRAM problem; many people don't realize DRAM hit a scaling wall nearly 10 years ago.  It comes down to the amount of charge needed to produce a detectable signal at the end of the bitlines.  As wires scale smaller their RC time constant gets worse and competes with any other factors that might improve, which forces the capacitors to stay in the same range of total charge.  Meanwhile the diameter of the capacitors is hard to shrink: the minimum diameter is set by the material constants of the dielectric and by voltage breakdown, and we found the best dielectrics a while ago.  The only way to pack the capacitors closer is to reduce the difference between the widest part of the cylinder and that minimum, which means perfecting the aspect ratio and minimizing fluctuations.  Slow, slow progress, and when you reach perfection there remains that minimum diameter, rather like hitting the speed limit on transistors.
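The charge argument above can be sketched with back-of-envelope numbers.  This is a hedged illustration, not real device data: the capacitance and voltage values are order-of-magnitude placeholders, and the charge-sharing formula is the textbook ideal (lossless) case.  The point it shows is that shrinking the cell capacitance directly shrinks the voltage swing the sense amplifier has to detect.

```python
# Toy DRAM read: the cell's charge is shared with the bitline capacitance,
# and the sense amplifier must detect the resulting voltage swing.

def bitline_swing(v_cell, c_cell, c_bitline):
    """Ideal (lossless) voltage change on the bitline after charge sharing."""
    return v_cell * c_cell / (c_cell + c_bitline)

# Illustrative numbers only: ~25 fF cell, ~50 fF bitline, 1.1 V cell voltage.
c_cell = 25e-15      # farads (assumed, order of magnitude)
c_bitline = 50e-15   # farads (assumed, order of magnitude)
v_cell = 1.1         # volts (assumed)

dv = bitline_swing(v_cell, c_cell, c_bitline)
print(f"bitline swing = {dv * 1000:.0f} mV")

# Halve the cell capacitance and the detectable swing shrinks with it:
print(f"half-size cell = {bitline_swing(v_cell, c_cell / 2, c_bitline) * 1000:.0f} mV")
```

This is why the total charge per cell can't keep falling with the rest of the process: the bitline capacitance doesn't improve in step, so the swing would vanish into the noise.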

If you estimate the cost of a Graviton 2 core it comes out to about $5, but the 4 GB of memory assigned to it costs about $12.  You can do similar calculations for the Apple M1 series.  DRAM is already the cost limit, because it has for so long been the scaling laggard.
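Spelling out that arithmetic, using only the figures from the comment ($5 per core, $12 for the 4 GB paired with it):

```python
# Back-of-envelope cost split per Graviton 2 core, using the comment's estimates.
core_cost = 5.0      # USD per core (estimate from the comment)
dram_per_gb = 3.0    # USD per GB, implied by $12 for 4 GB
gb_per_core = 4

dram_cost = dram_per_gb * gb_per_core   # 12.0
total = core_cost + dram_cost           # 17.0
print(f"DRAM share of per-core cost: {dram_cost / total:.0%}")  # 71%
```

Roughly 70% of the cost of a deployed core is its memory, which is the sense in which DRAM is already the cost limit.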

We will need new types of memory far more urgently than we will need further logic scaling.


Irreversible computing is just normal computing: each operation makes a state change that does not allow you to go backwards.  Reversible computing is a lab curiosity at very small scale, using circuits that slide between states without dissipating energy and can slide the other way too.  As Maxim says, it is far-out speculation whether we can really build computers that way.
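The logical distinction can be shown with a toy example (this is only the information-theoretic side of reversibility, not the physics): an ordinary AND gate maps several inputs to the same output, so the input can't be recovered, while a Toffoli (CCNOT) gate is a bijection on its three bits and undoes itself when applied twice.

```python
# Irreversible: AND destroys information.
def and_gate(a, b):
    # (0,1) and (1,0) both map to 0, so the input cannot be recovered.
    return a & b

# Reversible: Toffoli flips c only when a and b are both 1.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

state = (1, 1, 0)
forward = toffoli(*state)
assert toffoli(*forward) == state  # applying the gate again restores the state
print(forward)  # (1, 1, 1)
```

Because no information is discarded, a reversible gate has no Landauer-mandated minimum dissipation per operation; whether hardware can actually exploit that at scale is the speculative part.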