Carlos Ramirez

Comments

Pivot!

The Presence of Everything has two posts so far, and both are examples of the sort of panoramic analogical thinking we need to undo the various Gordian Knots bedeviling us.

Pivot!

Disease is down. War is down. Poverty is down. Democracy is up (on the timescale of centuries). Photovoltaics are cheaper than coal. This all seems worthwhile to me. If world peace, health and prosperity aren't worthwhile then what is?

These things are worthwhile, but they miss something critical. In particular, those indicators do not capture what makes us shudder when thinking of dystopias like the Combine: namely, how well developed our spirituality is. You can't pin that down with a number.

I don't think there's a conflict between weird contemplative stuff and making the world better in a measurable way. If the two conflict then you're doing the contemplative stuff wrong.

There is a conflict if the scientistic/materialistic worldview continues its dominance, because that worldview insists it alone is valid, and that the spiritual paths provide, at best, psychotherapeutic copes. That state of affairs is unacceptable.

When evil people see good, they try to undermine it. When good people see good, they celebrate it.

How do you think I undermine science? I just point out that there is a tenebrous principle currently underpinning it. Science can very easily proceed without that. Science can get back its heart, and its sanity, probably with increased effectiveness to boot.

Pivot!

Do you think that there is a non-secular way forward? Did you previously (before your belief update) think there is a non-secular way forward?


Yes, I always did think there was a non-secular way forward for all sorts of problems. It's just that I realized AI X-risk is just one head of an immense hydra: technological X-risks. I'm more interested in slaying that hydra than in coming up with ways to deal with just one of its myriad heads.

those indicators seem pretty meaningful for me. Life expectancy, poverty rates, etc.

Yeah, the indicators are worth something, but they are certainly not everything! Slavish devotion to the indicators renders one blind to critical stuff, such as X-risk, but also to things like humanity becoming something hideous or pathetic in the future.

Why are the standard arguments against religion/magic and for materialism and reductionism not compelling to you anymore?

The hard problem of consciousness, combined with learning the actual tenets of Hinduism (read the Ashtavakra Gita), was the big one for me. Dostoyevsky also did a bang-up job depicting the spiritual poverty of the materialist worldview.

Could you have stopped Chernobyl?

I think I'm the only one who found that confusing.


It makes sense because we don't have good stories that drill into our heads that negligence -> bad stuff, or incompetence -> bad stuff. When those things happen, it's just noise.

We have bad guys -> bad stuff instead. Which is why HBO's Chernobyl is rather important: it is definitely a very well-produced negligence -> bad stuff story.

Could you have stopped Chernobyl?

Eh. We can afford to take things slow. What you describe are barely costs.

Could you have stopped Chernobyl?

Well, INSAG-7 is 148 pages that I will not read in full, as Chernobyl is not my primary interest. But I did find this in it:

5.2.2. Departure from test procedures 

It is not disputed that the test was initiated at a power level (200 MW(th)), well below that prescribed in the test procedures. Some of the recent comments addressed to INSAG boil down to an argument that this was acceptable because nothing in normal procedures forbade it. However, the facts are that: 

— the test procedure was altered on an ad hoc basis;

— the reason for this was the operators' inability to achieve the prescribed test power level;

— this was the case because of reactor conditions that arose owing to the previous operation at half power and the subsequent reduction to very low power levels; 

— as a result, when the test was initiated the disposition of the control rods, the power distribution in the core and the thermal-hydraulic conditions were such as to render the reactor highly unstable. When the reactor power could not be restored to the intended level of 700 MW(th), the operating staff did not stop and think, but on the spot they modified the test conditions to match their view at that moment of the prevailing conditions. Well planned procedures are very important when tests are to take place at a nuclear plant. These procedures should be strictly followed. Where in the process it is found that the initial procedures are defective or they will not work as planned, tests should cease while a carefully preplanned process is followed to evaluate any changes contemplated.

5.2.3. Other deficiencies in safety culture

The foregoing discussion is in many ways an indication of lack of safety culture. Criticism of lack of safety culture was a major component of INSAG-1, and the present review does not diminish that charge. Two examples already mentioned are worthy of emphasis, since they bear on the particular instincts required in reactor operation. The reactor was operated with boiling of the coolant water in the core and at the same time with little or no subcooling at the pump intakes and at the core inlet. Such a mode of operation in itself could have led to a destructive accident of the kind that did ultimately occur, in view of the characteristics of positive reactivity feedback of the RBMK reactor. Failure to recognize the need to avoid such a situation points to the flaws in operating a nuclear power plant without a thorough and searching safety analysis, and with a staff untutored in the findings of such a safety analysis and not steeped in safety culture. This last remark is especially pertinent to the second point, which concerns operation of the reactor with almost all control and safety rods withdrawn to positions where they would be ineffective in achieving a quick reduction in reactivity if shutdown were suddenly needed. Awareness of the necessity of avoiding such a situation should be second nature to any responsible operating staff and to any designers responsible for the elaboration of operating instructions for the plant.

Sounds like HBO's Chernobyl only erred in making it seem like only Dyatlov was negligent that night, as opposed to everyone in the room. But even without that, the series does show that the big takeaway was that the USSR as a whole was negligent.

Could you have stopped Chernobyl?

I highlight later on that Beirut is a much more pertinent situation. No control room there either, just failures of coordination and initiative.

Also, experts are not omnipotent. At this point, I don't think there are arguments that will convince the ones who are deniers (which is not all of them). It is now a matter of reining that field, and others, in.

Could you have stopped Chernobyl?

That's good to know, though the question remains why no one did that in Beirut.

Fauci 

I don't think reflexive circling-the-wagons around the experts happens in every context. Certainly not much of that happens for economists or psychometricians...

Could you have stopped Chernobyl?

I'll be there. Been thinking about what precisely to ask. Probably something about how it seems we don't take AI risk seriously enough. This is assuming the current chip shortage has not, in fact, been deliberately engineered by the Future of Humanity Institute, of course...
