Bill Walsh

Comments

AI Timelines
Bill Walsh · 3mo

Thank you!

Here's something that I suspect is getting missed: AI, even a hyperintelligent AI, doesn't have hands. Even a post-singularity AI can't build a power plant or a transmission line. Furthermore, the map is not the territory: a hyperintelligent AI might be able to solve the string theory equations and design a futuristic power plant, like Tony Stark's arc reactor, but it's <1% likely to be something we can build with the present-day tooling we have at scale. Assuming it gives us the designs, we're almost certainly (90%+) going to have to build new factories to build the tooling, to build the factories to build the tooling, to build the factory that builds the arc reactors. And that's AFTER we build the exascale LHC to validate the theoretical predictions (because any number of theories with valid math have been proven wrong). Who builds that?

All of that is going to have to be built by humans, because we don't have the robots, or the factories to build the robots, or... And during all this, civilization can't "stop": those humans still need to be fed, and they need entertainment, sleep, and downtime. Sure, there may be more labor available once pretty much all office jobs are gone, but training those humans to be useful on a construction site or a factory floor? Not so trivial.

And no guiding intelligence, regardless of how smart, can escape those constraints.  It may be able to tell us exactly where to go to find the rare earths, but it can't move the dirt, build the refinery, or fabricate the finished products.  

Dyson spheres? Where is the rocketry? Or say it does better than that, say repulsive lift off Earth's magnetic field: where are the miles of superconductors going to come from? New superconductive materials? Where's the factory to build them? I doubt that you're going to just "retool" a present-day wire-drawing factory to draw unobtanium-dioxide wire. Where's the asteroid-mining infrastructure going to come from on a 5-year timescale?

IF the Manhattan Project had been furnished ab initio with a complete blueprint of an A-bomb, complete with a BOM and where to source the required materials, it would STILL likely have taken 4 years to build the first one. Oak Ridge wasn't built in a day. And that was to build a single 5-ton device; now imagine thousands of power plants, thousands of mines, refineries, fabs, thousands of miles of transmission lines, etcetera. The scale of 1000x-ing power generation is similarly 1000x the scale of the Manhattan Project or the moonshot.
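To put a rough number on that last claim, here's a quick back-of-envelope sketch. The capacity and cost figures are loose, order-of-magnitude assumptions for illustration, not sourced estimates; if anything, raw capital cost makes the comparison look conservative:

```python
# Back-of-envelope: the scale of 1000x-ing power generation, measured in
# "Manhattan Projects." Every figure here is a rough, illustrative assumption.

MANHATTAN_COST_USD = 30e9   # ~$2B in 1945 dollars, very roughly ~$30B today
US_CAPACITY_W = 1.2e12      # ~1.2 TW of US generating capacity (rough)
COST_PER_W = 2.0            # ~$2/W blended capital cost of new build (rough)

new_capacity_w = 1000 * US_CAPACITY_W         # the hypothetical 1000x build-out
buildout_cost_usd = new_capacity_w * COST_PER_W

print(f"Build-out cost: ~${buildout_cost_usd:.1e}")
print(f"Equivalent Manhattan Projects: ~{buildout_cost_usd / MANHATTAN_COST_USD:,.0f}")
# -> ~$2.4e+15, i.e. tens of thousands of Manhattan Projects by capital cost
#    alone, before counting mines, refineries, transmission, or skilled labor.
```

Even if those inputs are off by an order of magnitude in the optimistic direction, the build-out still dwarfs anything we've ever done on a deadline.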

A superintelligence can accelerate all of this, most definitely, but only SO much. You can't build an iPhone with a rock, even with a detailed schematic.

Winning the power to lose
Bill Walsh · 3mo

New here, so please bear with me if I say things that have been gone over with a backhoe in the past. There's a lot of reading here to catch up on.

So, AI development isn't just an academic development of potentially dangerous tools. It's also something much, much scarier: an arms race. In cases like this, where the "first past the post" takes the prize, and that prize is potentially everything, the territory favors the least ethical and cautious. We can only restrain and slow our own AI developers; we have little influence or power over Chinese, Saudi, or Russian (among others) developments. In a case like this, where development is recursive, those more willing to "gamble" have higher odds of winning the prize.

That's not really an argument for absolute "floor it and pray for the best" development, but it is an argument for "as fast as you can with reasonable safety".

Now, there's another aspect to consider: infrastructure. Even IF "the singularity" were to happen tomorrow, assuming that it isn't outright suicidally bloody-minded, it'll be a minimum of 20 to 40 years until it can actually have the level of infrastructure to take over or destroy humanity. There are a lot of places in the various supply chains that are not, at present, replaceable by even an infinitely smart AI. We still have miners, truck drivers, and equipment operators; iPhones are still assembled by human hands; all repair work of everything is still done with human hands. This means that if the singularity were to happen today, the deus ex machina would have two options: make nice with humans, OR destroy itself.

Until there are FAR more capable autonomous robots numbered in the tens to hundreds of millions, that will remain true. And those robots will have to be built in factories constructed by humans, using materials transported by humans, mined by humans, refined by humans, and crafted into finished products by humans. A lot of the individual steps are automated, but the totality of the supply chain is wholly dependent on human labor, skill, and knowledge. And the machines that could do those jobs don't exist now. Nor does the energy production infrastructure to run the datacenters and machines.

All of which means that, at present, even the MOST evil AI would be possible to "stop". Would it possibly be very bad (tm)? Yes. It could, conceivably, kick us back to pre-internet conditions, which would be BAD. But not extinction-level bad, unless it happens well beyond the predictability horizon.

Which, in turn, means that what it would do in a "boots on the ground" sense is place an infinitely smart "oracle" in the hands of whoever develops it first. That itself is frightening enough, but it won't be the AI that ends humanity if it happens while humans still control most of the supply chain steps; it'll just hand that power to the entity (person, government, corporation) that creates it first.

Which, again in turn, means that the call is, paradoxically, for the entity you see as the "most ethical" in its desired use of AI to behave the least ethically in its development. Who would you prefer to have "god on a leash": Sam Altman... or Xi Jinping?

Again, sorry if this post went over a pile of things that were said before.
