I am trying to derive a system for quantifying the relationship between two events (variables) using Bayes' theorem:
To assess whether events A and B are related:
By Bayes' theorem, P(A|B)/P(A) = P(B|A)/P(B), since both sides equal P(A∩B)/(P(A)P(B)); equivalently, P(A|B)/P(A) - P(B|A)/P(B) = 0.
This ratio captures the direct relationship between A and B. Thus, in a dynamic setting, changes in P(A|B)/P(A) should be the same as changes in P(B|A)/P(B). If they are not the same, then we can say the events are not directly related, or that there may be hidden errors.
If P(A|B)/P(A) = 1 (equivalently, P(B|A)/P(B) = 1), we can say A and B are unrelated, i.e. independent.
And if P(A|B)/P(A) (equivalently P(B|A)/P(B)) is greater than 1, A and B are positively related; if it is less than 1, they are negatively related.
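To make this concrete, here is a minimal sketch in Python (my own illustration, using numpy and made-up probabilities, not part of the argument above) that estimates both ratios from simulated binary samples and checks that they agree and exceed 1 for positively related events:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulate two positively related binary events:
# B occurs with probability 0.3, and A is more likely when B occurs.
b = rng.random(n) < 0.3
a = rng.random(n) < np.where(b, 0.6, 0.2)  # P(A|B) = 0.6, P(A|not B) = 0.2

lift_ab = a[b].mean() / a.mean()  # estimate of P(A|B)/P(A)
lift_ba = b[a].mean() / b.mean()  # estimate of P(B|A)/P(B)

print(lift_ab, lift_ba)  # ~1.87 for both: equal, and > 1 => positively related
```

Bayes' theorem forces the two printed values to agree up to sampling noise; rerunning with independent events (e.g. replacing the np.where line with a fixed probability) drives both toward 1.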
However, in real life it's hard to obtain P(A|B) or P(B|A) directly. What we usually have may only be their changes, over time or over other parameters.
So I propose: the greater the deviation of d[P(A|B)/P(A)]/d[P(B|A)/P(B)] from 1 (provided that P(A|B)/P(A) ≠ 1 and P(B|A)/P(B) ≠ 1), the greater the relationship between A and B, and vice versa.
And we can also propose that [d[P(A|B)/P(A)]/dt] / [d[P(B|A)/P(B)]/dt] should stay proportional, at least in a closed system.
And if d[P(A|B)/P(A)] or d[P(B|A)/P(B)] is 0, it means A and B are unrelated, or at least no longer related. In real life there is the so-called marginal effect, which this can illustrate.
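Since in practice we may only see the ratios through their changes, here is a rough sketch of the dynamic case (again Python/numpy; the drifting probabilities, window size, and helper lifts() are all invented for illustration). It estimates both ratios over sliding windows of a stream in which the relationship weakens over time, so the ratios drift toward 1 and their changes track each other:

```python
import numpy as np

def lifts(a, b):
    """Estimate (P(A|B)/P(A), P(B|A)/P(B)) from boolean samples."""
    return a[b].mean() / a.mean(), b[a].mean() / b.mean()

rng = np.random.default_rng(1)
n, window = 200_000, 20_000

# A relationship that weakens over time: P(A|B) drifts from 0.6 down to
# the baseline P(A|not B) = 0.2, so A and B end up (nearly) independent.
t = np.arange(n) / n
b = rng.random(n) < 0.3
a = rng.random(n) < np.where(b, 0.6 - 0.4 * t, 0.2)

ratios = np.array([lifts(a[i:i + window], b[i:i + window])
                   for i in range(0, n - window + 1, window)])

print(np.round(ratios, 3))             # both columns agree, drifting toward 1
d_ab = np.diff(ratios[:, 0])           # changes in P(A|B)/P(A) per window
d_ba = np.diff(ratios[:, 1])           # changes in P(B|A)/P(B) per window
print(np.round(np.c_[d_ab, d_ba], 3))  # the change series match up to noise
```

Once the drift is complete, both ratios sit near 1 and their window-to-window changes are just noise around 0, matching the idea that a vanishing change signals no (remaining) relationship.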
I have a feeling we can dive further into this kind of thing. Any comments are welcome.
Maybe it can be used to differentiate signal from noise, or to detect confounders?
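On the confounder question, a small simulation suggests one caveat (again my own sketch with invented numbers): a hidden common cause C can push P(A|B)/P(A) above 1 even when A and B have no direct link, and the ratio only returns to 1 once we condition on C:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# A hidden confounder C drives both A and B; A and B have no direct link.
c = rng.random(n) < 0.5
a = rng.random(n) < np.where(c, 0.7, 0.2)
b = rng.random(n) < np.where(c, 0.6, 0.1)

def lift(x, y):
    return x[y].mean() / x.mean()  # estimate of P(X|Y)/P(X)

print(lift(a, b))                            # ~1.4: A and B look related
print(lift(a[c], b[c]), lift(a[~c], b[~c]))  # both ~1: independent given C
```

So the ratio alone can't distinguish a direct relationship from a confounded one; comparing conditional and unconditional ratios, as above, is one way to probe for confounding.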