Logically, disjunctive arguments are just conjunctive arguments on the negations. That makes it pretty clear that the same errors can apply to both.
Well, according to De Morgan's laws, "A or B" is indeed equivalent to "not(not-A and not-B)". So if P(not-A and not-B) is low, P(A or B) = 1 - P(not-A and not-B) is high. However, conjunctive arguments usually rely on an assumption of independence, while disjunctive arguments assume mutual exclusivity. I'm not sure whether these properties can be transformed into each other when switching between disjunctive and conjunctive arguments.
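A quick sanity check of the equivalence, plus the probabilistic version with made-up numbers (the 0.3 and 0.4 are purely illustrative, and independence of A and B is assumed so the conjunction factors):

```python
import itertools

# Verify De Morgan's law, A or B == not(not-A and not-B), over all truth assignments.
for a, b in itertools.product([False, True], repeat=2):
    assert (a or b) == (not ((not a) and (not b)))

# Probabilistic version: P(A or B) = 1 - P(not-A and not-B).
# If A and B are independent (so not-A and not-B are too), the conjunction factors:
p_not_a, p_not_b = 0.3, 0.4  # illustrative numbers
p_a_or_b = 1 - p_not_a * p_not_b
print(round(p_a_or_b, 2))  # 0.88
```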
I've usually seen them both conditionalized, implicitly or explicitly.
The conjunctive argument is often presented something like: "You need all of A, B, C, .... A is moderately likely, and even when A is true, B is moderately likely, and so on. But this means that the end result is not likely at all!"
Similarly for the disjunctive: "Any of A, B, C, ... is sufficient. A is moderately likely, and even if A isn't true, B is moderately likely, and so on. So the end result is very likely!"
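The two templates can be put side by side numerically. This is a toy sketch, not from the post: the 0.6 figures are invented "moderately likely" conditional stage probabilities.

```python
# Invented conditional stage probabilities: each entry is the probability of the
# next stage holding, given how the previous stages turned out.
stages = [0.6, 0.6, 0.6, 0.6, 0.6]

# Conjunctive template: all stages must hold, so multiply the successes.
p_conjunctive = 1.0
for p in stages:
    p_conjunctive *= p

# Disjunctive template: any stage suffices, so multiply the failures and flip.
p_all_fail = 1.0
for p in stages:
    p_all_fail *= (1 - p)
p_disjunctive = 1 - p_all_fail

print(round(p_conjunctive, 3))  # 0.078 -- "not likely at all"
print(round(p_disjunctive, 3))  # 0.99  -- "very likely"
```

The same handful of moderate stage probabilities yields a tiny number under one template and a near-certainty under the other.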
A similar thing was pointed out 14 years ago in this post. Anyway, I like the way you phrased it; very precise.
I'd say the general lesson is probably this: arguments of the form "X is highly conjunctive in ways 1, 2, 3, ..., n, therefore it is very unlikely" and "X is highly disjunctive in ways 1, 2, 3, ..., n, therefore it is very likely" are not necessarily wrong, but they are easily misleading, because they trick us into underestimating or overestimating probabilities. The best solution is probably to avoid making these types of arguments and instead present only a few (strong) reasons for or against something.
Suppose we want to make an event seem likely. If we use the above method but slightly over-estimate the sub-event probabilities and use a large number of sub-events, then the resulting final probability will inevitably be very large. Because people tend to find moderate-range probabilities reasonable, this can be a superficially compelling argument even if it results in a massive over-estimation of the final probability. I propose this is a kind of reverse multiple-stage fallacy.
I suggest that this post is an instance of this fallacy. I gestured at the issue in my response but didn't have a name for it. It gives a non-exhaustive list of at least 12 sub-events of "humanity is evil", each of the form "there's some chance that this specific thing is humanity being evil".
Except that I doubt that it is such an instance rather than an instance of 12 sub-events[1] of mankind already[2] making potentially severe mistakes. For example, while most people here are unlikely to buy the problem of secularism, they understand that destruction of nature and/or neglect of future generations is likely a mistake.
Factory farming, wild animal suffering, neglect of foreigners and/or future generations, abortion, mass incarceration, declining birth rates, natural mass fetus death, animal slaughter, secularism leading to many people going to hell, destruction of nature, child-bearing. However, I believe that these sub-events aren't that independent.
For comparison, Daniel Kokotajlo's recent quick take implies that such mistakes will be FAR more abundant once the realm of possibilities expands.
Assume we want to know the probability that two events co-occur (i.e. of their conjunction). If the two events are independent, the probability of the co-occurrence is the product of the probabilities of the individual events, P(A and B) = P(A) * P(B).
In order to estimate the probability of some event, one method is to decompose that event into independent sub-events and apply this rule. For example, if the target event E = A and B and C, then we can estimate P(E) as P(A and B and C) = P(A) * P(B) * P(C).
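For instance, with invented component probabilities:

```python
# Independent decomposition: P(E) = P(A) * P(B) * P(C).
p_a, p_b, p_c = 0.9, 0.8, 0.7  # illustrative numbers only
p_e = p_a * p_b * p_c
print(round(p_e, 3))  # 0.504: three individually likely parts already land near 1/2
```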
Suppose we want to make an event seem unlikely. If we use the above method but slightly under-estimate the sub-event probabilities and use a large number of sub-events, then the resulting final probability will inevitably be very small. Because people tend to find moderate-range probabilities reasonable, this can be a superficially compelling argument even if it results in a massive under-estimation of the final probability. This has been called the multiple-stage fallacy.
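A toy illustration of how small per-stage shavings compound (all numbers invented):

```python
# Each stage is truly 0.9 likely, but the arguer shades every estimate down to
# 0.75, which still sounds "moderately likely" on its own.
n_stages = 10
honest = 0.9 ** n_stages    # ~0.35
shaded = 0.75 ** n_stages   # ~0.06
print(round(honest, 2), round(shaded, 2))
```

A per-stage shading of 0.15 turns a better-than-one-in-three event into an apparently negligible one.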
Assume we want to know the probability that either of two events occurs (i.e. of their disjunction). If the two events are mutually exclusive, the probability of the disjunction is the sum of the probabilities of the individual events, P(A or B) = P(A) + P(B).
In order to estimate the probability of some event, one method is to decompose that event into mutually exclusive sub-events and apply this rule. For example, if the target event E = A or B or C, then we would estimate P(E) as P(A or B or C) = P(A) + P(B) + P(C).
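For instance, with invented, mutually exclusive component probabilities:

```python
# Mutually exclusive decomposition: P(E) = P(A) + P(B) + P(C).
p_a, p_b, p_c = 0.2, 0.15, 0.1  # illustrative; a valid set must sum to <= 1
p_e = p_a + p_b + p_c
print(round(p_e, 2))  # 0.45
```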
Suppose we want to make an event seem likely. If we use the above method but slightly over-estimate the sub-event probabilities and use a large number of sub-events, then the resulting final probability will inevitably be very large. Because people tend to find moderate-range probabilities reasonable, this can be a superficially compelling argument even if it results in a massive over-estimation of the final probability. I propose this is a kind of reverse multiple-stage fallacy. In practice, I rarely see people actually make explicit estimations by this method, which makes sense: usually the disjunction involves so many events as to be impractical. Instead, in the disjunctive case, a person might just say something like "the case for X is disjunctive" and the over-estimation is implicit.
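A toy numerical version of this reverse direction (numbers invented):

```python
# Twelve mutually exclusive sub-events, each truly 0.03 likely, but each padded
# up to 0.08 -- a difference that looks small in isolation.
n = 12
honest = n * 0.03   # 0.36
padded = n * 0.08   # 0.96
print(round(honest, 2), round(padded, 2))  # the padded sum is near-certainty
```

Each individual pad is only five percentage points, yet the summed estimate moves from "probably not" to "almost surely".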
Of course, not all disjunctive arguments are necessarily subject to this critique. Over-estimation of the components (either explicitly or implicitly) is required.