expectation is the "expected" (probability-weighted average) value of a random variable:
\begin{equation} \mathbb{E}[X] = \sum_{x} x\, P(X=x) \end{equation}
- Standardize variables to \(z\)-scores by subtracting the mean and dividing by the standard deviation
- The correlation is then simply the mean of their "product": positive products (same-sign \(z\)-scores) and negative products (opposite signs) are averaged together (see the formula below)
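Spelled out (a standard formulation, with \(z_{x,i}\) and \(z_{y,i}\) the standardized values; use \(n-1\) in place of \(n\) if the sample standard deviation was used):
\begin{equation} r = \frac{1}{n} \sum_{i=1}^{n} z_{x,i}\, z_{y,i} \end{equation}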
The expectation is the probability-weighted average of the possible values; for a data set, it corresponds to the plain average of the values you have.
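As a concrete example (a fair six-sided die, not from the notes above):
\begin{equation} \mathbb{E}[X] = \sum_{x=1}^{6} x \cdot \tfrac{1}{6} = \tfrac{21}{6} = 3.5 \end{equation}
Note that 3.5 is not a value the die can actually show; the expectation is a weighted average, not a "typical draw".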
properties of expectation
These hold REGARDLESS of whether the variables involved are independent, IID, etc.
Linearity
expectation has additivity and homogeneity; for constants \(a\) and \(b\):
\begin{equation} \mathbb{E}[aX+b] = a\mathbb{E}[X] + b \end{equation}
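For instance, with the die example above and the (assumed) constants \(a=2\), \(b=1\):
\begin{equation} \mathbb{E}[2X+1] = 2\,\mathbb{E}[X] + 1 = 2(3.5) + 1 = 8 \end{equation}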
Additivity
\begin{equation} \mathbb{E}[X+Y] = \mathbb{E}[X] + \mathbb{E}[Y] \end{equation}
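As a quick numerical check that additivity does not need independence, here is a minimal Python sketch with an assumed toy joint pmf `p` (not from the notes):

```python
# Toy joint pmf over (x, y); X and Y are deliberately dependent
# (they tend to be equal), yet E[X+Y] = E[X] + E[Y] still holds.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

E_X = sum(x * pr for (x, y), pr in p.items())          # E[X]   = 0.5
E_Y = sum(y * pr for (x, y), pr in p.items())          # E[Y]   = 0.5
E_sum = sum((x + y) * pr for (x, y), pr in p.items())  # E[X+Y] = 1.0

print(E_X + E_Y, E_sum)  # prints 1.0 1.0
```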
Law of the unconscious statistician
\begin{equation} \mathbb{E}[g(X)] = \sum_{x} g(x)\, P(X=x) \end{equation}
whereby, if \(g\) is an ordinary (deterministic) function, you can just sum its possible outputs weighted by the probability of each input. This property can be used to show the first results above.
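For instance, with \(g(x) = x^2\) and the die example above:
\begin{equation} \mathbb{E}[X^2] = \sum_{x=1}^{6} x^2 \cdot \tfrac{1}{6} = \tfrac{91}{6} \approx 15.17 \end{equation}
Note that this is not the same as \(g(\mathbb{E}[X]) = 3.5^2 = 12.25\): in general \(\mathbb{E}[g(X)] \neq g(\mathbb{E}[X])\).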
conditional expectation
We can take an expectation with respect to a conditional distribution:
\begin{equation} \mathbb{E}[X|Y=y] = \sum_{x} x \cdot P(X=x|Y=y) \end{equation}
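For instance, with the die example above and \(Y\) the (assumed) indicator that the roll is even:
\begin{equation} \mathbb{E}[X|Y=1] = \sum_{x \in \{2,4,6\}} x \cdot \tfrac{1}{3} = 4, \qquad \mathbb{E}[X|Y=0] = \sum_{x \in \{1,3,5\}} x \cdot \tfrac{1}{3} = 3 \end{equation}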
law of total expectation
\begin{equation} \mathbb{E}[X] = \sum_{y} \mathbb{E}[X|Y=y]\, P(Y=y) \end{equation}
What is the "background variable"? It is the \(Y\) we condition on and then average over: the \(y\) values above.
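Continuing the even/odd die example, averaging the conditional expectations over the background variable \(Y\) recovers the overall expectation:
\begin{equation} \mathbb{E}[X] = \mathbb{E}[X|Y=1]\, P(Y=1) + \mathbb{E}[X|Y=0]\, P(Y=0) = 4 \cdot \tfrac{1}{2} + 3 \cdot \tfrac{1}{2} = 3.5 \end{equation}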