By causal direction (X causes Y) I mean that in the graph induced by the structural causal model, X is an ancestor of Y. The graph induced by a structural causal model has an edge from X to Y if there is a structural equation Y = 𝐹(𝑈_i, X, …).
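To make this concrete, here is a minimal sketch (illustrative only, not code from the paper; the toy variables and the `parents` encoding are my own assumptions) that reads the induced graph and the ancestor relation off the declared dependencies of the structural equations:

```python
# Minimal sketch: encode, for each endogenous variable, which endogenous
# variables appear in its structural equation, then read off the induced
# graph and the ancestor relation. (Hypothetical toy SCM, for illustration.)

parents = {
    "X": [],        # X = F_X(U_X)
    "Y": ["X"],     # Y = F_Y(U_Y, X)  -> edge X -> Y
    "W": ["Y"],     # W = F_W(U_W, Y)  -> edge Y -> W
}

edges = {(p, v) for v, ps in parents.items() for p in ps}

def is_ancestor(a, b):
    """True iff there is a directed path a -> ... -> b in the induced graph."""
    frontier, seen = [a], set()
    while frontier:
        node = frontier.pop()
        for child in (v for v, ps in parents.items() if node in ps):
            if child == b:
                return True
            if child not in seen:
                seen.add(child)
                frontier.append(child)
    return False

print(sorted(edges))          # [('X', 'Y'), ('Y', 'W')]
print(is_ancestor("X", "W"))  # True: X causes W in this toy model
```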
It refers to Example 9.3 and the associated lemmas.
This example is a general rephrasing of Example 1 from the finite factored sets work in terms of random variables and structural causal models.
Example 9.3
We observe two nonconstant, real-valued random variables X and Y. We assume that they are part of an unobserved true underlying causal model (𝑈, 𝑉, 𝐹, ℙ), where 𝑉 = (𝑉_i)_{i∈𝐼} with {1, 2} ⊆ 𝐼, X = 𝑉_1, and Y = 𝑉_2. We only observe the distribution of 𝑉. Let Z := Y − X. Suppose that X ⊥ Z. Furthermore, we assume that ℙ is in 'general position'. This means that this independence is not due to the specific choice of ℙ. Formally, X ⊥_𝑃 Z for all causal models (𝑈, 𝑉, 𝐹, 𝑃) s.t. 𝑃 ∼ ℙ.
It says that if a structural causal model models the independence of X and Z structurally (i.e. the independence is not a coincidence of the precise numerical values of ℙ), then in this structural causal model X causes Y, i.e. in the graph induced by the causal model through its equations, X is always an ancestor of Y.
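As a quick numerical sanity check of the example, here is a small simulation sketch (my own illustration, not code from the paper; the Gaussian noise and unit coefficients are arbitrary assumptions) comparing the two causal directions:

```python
# Simulation sketch of Example 9.3 with Z := Y - X, the real-valued
# analogue of the XOR construction in Example 1 of finite factored sets.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Model A: X causes Y via Y = X + noise. Then Z = Y - X is the noise
# itself, so X and Z are independent structurally, for any noise law.
x_a = rng.normal(size=n)
y_a = x_a + rng.normal(size=n)
z_a = y_a - x_a

# Model B: Y is exogenous and X = Y + noise. Then Z = Y - X = -noise and
# Cov(X, Z) = -Var(noise) != 0, so X and Z are generically dependent.
y_b = rng.normal(size=n)
x_b = y_b + rng.normal(size=n)
z_b = y_b - x_b

print(np.corrcoef(x_a, z_a)[0, 1])  # ~0: structural independence
print(np.corrcoef(x_b, z_b)[0, 1])  # ~-0.71: dependence in this direction
```

In model B one can force Cov(X, Z) = 0 by fine-tuning the coefficients and variances, but that is exactly the kind of numerical coincidence the 'general position' assumption excludes.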
The linked paper introduces the key concept behind factored space models / finite factored sets, namely structural independence, in a fully general setting using families of random elements. The key contribution is a general definition of the history object, together with a theorem that the history fully characterizes the semantic implications of the assumption that a family of random elements is independent. This is analogous to how d-separation precisely characterizes which nodal variables are independent given other nodal variables, in any probability distribution that satisfies the Markov property on the graph.
Abstract: Structural independence is the (conditional) independence that arises from the structure rather than the precise numerical values of a distribution. We develop this concept and relate it to d-separation and structural causal models.
Formally, let (X_i)_{i∈𝐼} be an independent family of random elements on a probability space (Ω, 𝒜, ℙ). Let X, Y, and Z be arbitrary σ((X_i)_{i∈𝐼})-measurable random elements. We characterize all independences implied by the independence of (X_i)_{i∈𝐼} and call these independences structural. Formally, these are the independences which hold in all probability measures Q that render (X_i)_{i∈𝐼} independent and are absolutely continuous with respect to ℙ, i.e., for all such Q, it must hold that X ⊥_Q Y | Z.
We introduce the history H(X | Z), a combinatorial object that measures the dependence of X on X_i for each i ∈ 𝐼, given Z. The independence of X and Y given Z is implied by the independence of (X_i)_{i∈𝐼} if and only if H(X | Z) ∩ H(Y | Z) = ∅ almost surely with respect to ℙ.
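For intuition, here is a toy sketch of the unconditional version of this criterion on a finite product space (my own illustration; the paper's history handles conditioning on Z and is far more general). A variable is a function on B_1 × B_2, its history is the set of coordinates it actually depends on, and disjoint histories certify structural independence:

```python
# Toy sketch: on a finite product of factors, compute the set of
# coordinates a function genuinely depends on (its "history"), then use
# disjointness of histories as the structural-independence criterion.
from itertools import product

B = [(0, 1), (0, 1)]  # two binary factors B_1, B_2

def history(f):
    """Indices of coordinates that f genuinely depends on."""
    deps = set()
    for i in range(len(B)):
        for point in product(*B):
            for alt in B[i]:
                changed = list(point)
                changed[i] = alt
                if f(tuple(changed)) != f(point):
                    deps.add(i)
    return deps

X = lambda b: b[0]          # first factor
N = lambda b: b[1]          # second factor ("noise")
Y = lambda b: b[0] ^ b[1]   # Y = X XOR N
Z = lambda b: Y(b) ^ X(b)   # Z = Y XOR X, which recovers the noise

print(history(X), history(Z))    # {0} {1}: disjoint, so X and Z are
                                 # structurally independent
print(history(X) <= history(Y))  # True: X is weakly 'before' Y
```

This recovers the binary XOR example: X and Z = Y XOR X have disjoint histories, and history(X) ⊆ history(Y) is the finite-factored-sets sense in which X comes before Y.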
Finally, we apply this d-separation-like criterion in structural causal models to discover a causal direction in a toy setting.