Causal Inference Notes

Causal Inference in Statistics #

Questions #

Advanced Data Analysis from an Elementary Point of View (Ch. 18-23) #

Chapter 18 #

Exercises #

18.2. Proof that every path must go through a collider:

Observe that exogenous variables have no incoming arrows, so any path between two exogenous variables must begin with an arrow pointing away from each endpoint. If we imagine walking the path from one exogenous variable to the other, the arrows start out pointing in our direction of travel and must finish pointing against it, so the direction of the arrows has to switch at some node. At that node two arrowheads meet, which makes it a collider.
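The check above can be made mechanical. A minimal sketch in plain Python (the graph and variable names are illustrative, not from the text): represent the DAG as a set of directed edges, walk a path, and report every node where two arrowheads meet.

```python
# Toy DAG: A and B are exogenous, both point into M, which points to D.
edges = {("A", "M"), ("B", "M"), ("M", "D")}

def colliders_on_path(path, edges):
    """Return the nodes on `path` where two arrowheads meet (-> node <-)."""
    found = []
    for i in range(1, len(path) - 1):
        into_left = (path[i - 1], path[i]) in edges   # arrow points into path[i]
        into_right = (path[i + 1], path[i]) in edges  # arrow points into path[i]
        if into_left and into_right:
            found.append(path[i])
    return found

# The only path between the exogenous A and B runs through the collider M:
print(colliders_on_path(["A", "M", "B"], edges))  # ['M']
```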

The answer doesn’t change if we remove the descendant criterion, since descendants only come into play when the node they descend from is a collider.

Chapter 19 #

I found this quote very helpful:

Now suppose we want to ask what the effect would be, causally, of setting \( X_c \) to a particular value \( x_c \). We represent this by “doing surgery on the graph”: we (i) eliminate any arrows coming in to nodes in \( X_c \), (ii) fix their values to \( x_c \), and (iii) calculate the resulting distribution for \( X_e \) in the new graph. By steps (i) and (ii), we imagine suspending or switching off the mechanisms which ordinarily set \( X_c \). The other mechanisms in the assemblage are left alone, however, and so step (iii) propagates the fixed values of \( X_c \) through them. We are not selecting a sub-population, but producing a new one. If setting \( X_c \) to different values, say \( x_c \) and \( x'_c \), leads to different distributions for \( X_e \), then we say that \( X_c \) has an effect on \( X_e \) — or, slightly redundantly, has a causal effect on \( X_e \). Sometimes “the effect of switching from \( x_c \) to \( x'_c \)” specifically refers to a change in the expected value of \( X_e \), but since profoundly different distributions can have the same mean, this seems needlessly restrictive. If one is interested in average effects of this sort, they are computed by the same procedure.
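The surgery can be sketched in a few lines of simulation. This is a hedged toy example, not from the text: the structural equations, coefficients, and names below are all made up. In the linear SCM \( Z \to X \to Y \) with \( Z \to Y \), intervening on \( X \) deletes the \( Z \to X \) arrow and fixes \( X \); the unchanged mechanism for \( Y \) then propagates the fixed value.

```python
import random

random.seed(0)

def sample(do_x=None):
    """One draw from the SCM; do_x, if given, performs the graph surgery."""
    z = random.gauss(0, 1)
    # Surgery step (i)+(ii): the Z -> X mechanism is switched off and X is fixed.
    x = do_x if do_x is not None else 2.0 * z + random.gauss(0, 1)
    # Step (iii): the Y mechanism is left alone and propagates the fixed x.
    y = 3.0 * x + 1.5 * z + random.gauss(0, 1)
    return y

n = 100_000
m0 = sum(sample(do_x=0.0) for _ in range(n)) / n  # E[Y | do(X = 0)]
m1 = sum(sample(do_x=1.0) for _ in range(n)) / n  # E[Y | do(X = 1)]
print(round(m1 - m0, 1))  # ≈ 3.0, the direct coefficient on X
```

Setting \( X \) to two different values produces two different distributions for \( Y \), so in this toy model \( X \) has a causal effect on \( Y \) in exactly the quoted sense.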

Exercises #

19.1 In the graph shown in the figure, the only path from the asbestos node to the yellow teeth node is blocked by the collider at the cancer node. Thus, intervening on asbestos will have no impact on the value of yellow teeth.
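This is easy to confirm by simulation. A quick sketch with made-up probabilities (the graph structure follows the exercise, the numbers do not): smoking causes both yellow teeth and cancer, asbestos causes cancer, and the only asbestos-to-yellow-teeth path runs through the collider at cancer.

```python
import random

random.seed(3)

def draw(do_asbestos=None):
    """One draw of the yellow-teeth indicator; do_asbestos forces exposure."""
    smoking = random.random() < 0.4
    asbestos = do_asbestos if do_asbestos is not None else random.random() < 0.2
    cancer = random.random() < (0.5 if (smoking or asbestos) else 0.05)
    yellow = random.random() < (0.8 if smoking else 0.1)  # depends only on smoking
    return yellow

n = 200_000
p_base = sum(draw() for _ in range(n)) / n                 # Pr(yellow)
p_do = sum(draw(do_asbestos=True) for _ in range(n)) / n   # Pr(yellow | do(asbestos))
print(abs(p_base - p_do) < 0.01)  # True: intervening on asbestos changes nothing
```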

Chapter 20 #

Another helpful quote:

The reason these are different is that the latter represents taking the original population, as it is, and just filtering it to get the sub-population where \( X = x \). The processes which set \( X \) to that value may also have influenced \( Y \) through other channels, and so this distribution will not, typically, really tell us what would happen if we reached in and manipulated \( X \).
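The distinction can be seen in a small simulation; the coefficients and names below are hypothetical. Here \( Z \) confounds \( X \) and \( Y \), and \( X \) has *no* causal effect on \( Y \) at all, yet filtering to the sub-population with \( X = 1 \) still shifts the mean of \( Y \), because the process that set \( X \) also influenced \( Y \) through \( Z \).

```python
import random

random.seed(1)

def world(do_x=None):
    """One draw of (X, Y); do_x, if given, sets X by intervention."""
    z = random.gauss(0, 1)
    x = (do_x if do_x is not None
         else int(z + random.gauss(0, 1) > 0))   # binary "treatment" driven by Z
    y = 2.0 * z + random.gauss(0, 1)             # Y depends only on Z, never on X
    return x, y

n = 100_000
obs = [world() for _ in range(n)]
cond = [y for x, y in obs if x == 1]
observational = sum(cond) / len(cond)                         # E[Y | X = 1]
interventional = sum(world(do_x=1)[1] for _ in range(n)) / n  # E[Y | do(X = 1)]
print(observational, interventional)  # clearly positive vs. roughly zero
```

Conditioning picks out draws where \( Z \) tended to be large, dragging \( Y \) up with it; reaching in and setting \( X = 1 \) leaves \( Z \), and hence \( Y \), untouched.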

Note: Shalizi’s coverage of IV calculations is apparently wrong and should therefore be ignored.

Exercises #

20.1 The causal graph for a randomized controlled trial includes 4 nodes, \( A \), \( T \), \( O \), and \( C \).

One path is \( A \rightarrow T \rightarrow O \) and the other is \( C \rightarrow O \). Both \( A \perp C \) and \( T \perp C \) hold, since the paths connecting them are blocked by the collider at \( O \). Since there are no backdoor paths in this graph from \( A \) to \( O \), a simple form of the backdoor criterion conditioned on \( \emptyset \) gives us \[ \Pr(O \mid \text{do}(T)) = \Pr(O \mid T). \]
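A toy check of this identity (the probabilities are illustrative, not from the exercise): assignment \( A \) is the only cause of treatment \( T \), so there is no backdoor path from \( T \) to \( O \), and the interventional probability matches the plain conditional one.

```python
import random

random.seed(2)

def trial(do_t=None):
    """One draw of (T, O); do_t, if given, sets treatment by intervention."""
    a = random.random() < 0.5                 # coin-flip assignment
    t = do_t if do_t is not None else a       # perfect compliance: T = A
    c = random.random() < 0.3                 # covariate affecting the outcome
    o = random.random() < (0.7 if t else 0.2) + (0.1 if c else 0.0)
    return t, o

n = 200_000
obs = [trial() for _ in range(n)]
p_cond = sum(o for t, o in obs if t) / sum(1 for t, o in obs if t)  # Pr(O | T=1)
p_do = sum(trial(do_t=True)[1] for _ in range(n)) / n               # Pr(O | do(T=1))
print(abs(p_cond - p_do) < 0.01)  # True: the two probabilities agree
```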

[comment]: <> (\[ \Pr(Y \mid \text{do}(X = x)) = \Pr (Y \mid X = x, \text{Pa}(X) = t) \Pr(\text{Pa}(X) = t). \])
[comment]: <> (First, we group nodes not equal to \( X \) or \( Y \) into two sets, \( V \) and \( T \). \( T \) contains the parents of \( X \), \( \text{Pa}(X) \); \( V \) contains all nodes other than those in \( X \), \( Y \), and \( T \). Observe that)
[comment]: <> (\[)
[comment]: <> (\begin{align})
[comment]: <> (\Pr(Y = y, X = x', T = t, V = v \mid \text{do}(X = x)) = \mathbf{1}[x = x'])
[comment]: <> (\end{align})