I generally agree with this train of thought. That said, if the end-state equilibrium is that large states have counterforce arsenals and only small states have multi-megaton weapons, then I think that equilibrium is safer in terms of expected deaths, because the odds of nuclear winter are so much lower.
There will be risk adaptation either way. The risk of nuclear use may go up conditional on there being a war, but the risk of war may go down because there are lower odds of keeping a war purely conventional. I think that makes assessing the net risk pretty hard, but I doubt you'd argue for turning every nuke into a civilization ender to improve everyone's incentives: at some point it just isn't credible that you will use the weapons, and this reduces their deterrent effect. There is an equilibrium that minimizes total risk across sources of escalation, accidents, etc., and I'm trying to spark conversation toward figuring out what that equilibrium is. I think that as the technology changes, the best equilibrium is likely to change, and it is unlikely to be the same arms control as decades ago, but I may be wrong about the best direction of change.
Precision isn't cheap. Low-yield accurate weapons will often be harder to make than large-yield inaccurate weapons. A rich country might descend the cost curve in production, but as long as the U.S. stays in an umbrella-deterrence paradigm, that doesn't decrease costs for anyone else, because we don't export nukes.
This also increases the cost for rogue states to defend their arsenals (because they are small, don't have a lot of area to hide stuff, etc.), which may discourage them from acquiring arsenals in the first place.
I meant A. The Beirut explosion was about the yield of a mini-nuke.
I could imagine unilateral action to reduce risk here being good, but not in violation of current arms control agreements. Doing that without breaking any current agreements would mean replacing lots of warheads with lower-yield or dial-yield versions, and probably fielding more conventional long-range precision weapons. Replacing some submarine-launched missiles' warheads with low-yield ones was a step in that direction.
There's a trade-off between holding leverage to negotiate and just directly moving to a better equilibrium, but if you are the U.S., the strategy shift may actually increase your negotiating power, since the weapons are more usable.
The main thing I want to advocate is for people to debate these ideas to see if there is a potentially better equilibrium to aim for, and to chart a path to it. I don't want people to blindly assume I am right.
I think you need legible rules for norms to scale in an adversarial game, so the rules can't be based directly on utility thresholds.
Proportionality is harder to make legible, but when lies are directed at political allies, that's clearly friendly fire or betrayal. Lying to the general public also shouldn't fly; that's indiscriminate.
I really don't think lying and censorship are going to help with climate change. We already have publication bias and hype on one side, and corporate lobbying plus other lies on the other. You probably have to take another approach to gain trust/credibility when joining the fray so late. If there had been greater honesty and accuracy, we'd have invested more in nuclear power a long time ago, but now that other renewable tech has descended the learning curve faster, different options make sense going forward. In the Cold War, anti-nuclear movements generally got a bit hijacked by communists trying to make the U.S. weaker and to shift focus from mutual to unilateral action... there's a lot of bad stuff influenced by lies in the distant past that constrains options in the future. I guess it would be interesting to see which deception campaigns in history are most widely considered good and successful after the fact. I assume most are ones connected to war, such as Allied deception about the D-Day landings.
It's not an antidote, just like a blockade isn't an antidote to war. Blockades might happen to prevent a war or be engineered for good effects, but by default they are distortionary in a negative direction, have collateral damage, and can only be pulled off by the powerful.
While it can depend on the specifics, in general censorship is coercive and one-sided. Just asking someone not to share something isn't censorship; things become more censorial when there is a threat attached.
I don't think it is bad to agree to share a secret with someone only if they agree to keep it. The info wouldn't have been shared in the first place otherwise. If a friend gives you something in confidence and you go public with the info, you are treating that friend as an adversary at least to some degree, so being more demanding in response to such a threat is proportional.
Vouchers could be within the range of competition, but if people prefer basic income to the value they can get via a voucher at the same cost level, then there has to be substantial value that the individual doesn't capture to justify the voucher. School vouchers may be a case of this, since education has broader societal value.
The issue there is that, implicitly, U.S. R&D is subsidizing the rest of the world, since we don't negotiate prices but others do. Seems like an unfortunate trade-off between the present and the future / between here and elsewhere, except when there is a lack of reinvestment of revenue into R&D.
I agree with your point in general. In these cases, I'm specifically focusing on regulations for issues that evaporate with central coordination:
- Government is doing the central coordinating, so overriding zoning shouldn't result in uncoordinated planning: the government will also incur the related infrastructure costs.
- If you relax zoning and room-size minimums everywhere, the minimum cost to live everywhere decreases, so no particular spot becomes disproportionately vulnerable to concentrating the negative externalities of poverty, while you simultaneously decrease housing-cost-based poverty everywhere.