Is it possible to show that a normalised random variable has zero mean and unit variance?
Given some random variable $X_i$, is it possible to compute expectations of the normalised value, like: $$\mathbb{E}\biggl[\frac{X_i - \bar{x}}{s}\biggr],$$ where $\bar{x} = \frac{1}{N}\sum_{j=1}^N X_j$ is the sample mean and $s^2 = \frac{1}{N - 1} \sum_{j=1}^N (X_j - \bar{x})^2$ is the sample variance?
Obviously, I could treat $\bar{x}$ and $s$ as constants and use linearity of expectation to arrive at a simple, but not very interesting, solution. However, I am more interested in the case where $\bar{x}$ and $s$ are also treated as random variables. After all, if we assume the $X_j$ are i.i.d. with $\mathbb{E}[X_j] = \mu$ and $\mathbb{E}[X_j^2] = \sigma^2 + \mu^2$ for all $j$, we have $\mathbb{E}[\bar{x}] = \mu$ and $\mathbb{E}[s^2] = \sigma^2$. As a result, it seems reasonable to expect that $\mathbb{E}\Bigl[\frac{X_i - \bar{x}}{s}\Bigr] = 0$ and $\mathbb{E}\biggl[\Bigl(\frac{X_i - \bar{x}}{s}\Bigr)^2\biggr] = 1$. However, I am not quite sure how this could be proven.
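To get some intuition before attempting a proof, here is a minimal simulation sketch I could run to check the conjectured moments empirically. The choices of distribution (standard normal), sample size `N`, number of trials, and the index `i` are all assumptions of mine for illustration, not part of the question itself:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10           # sample size (assumed; any N > 1 works)
trials = 200_000 # number of Monte Carlo repetitions (assumed)
i = 0            # index of the variable being normalised (assumed)

# Draw `trials` independent samples of size N, here assumed standard normal.
X = rng.standard_normal((trials, N))

x_bar = X.mean(axis=1)        # sample mean of each sample
s = X.std(axis=1, ddof=1)     # sample standard deviation with the 1/(N-1) convention

# Normalised variable (X_i - x_bar) / s for each sample.
Z = (X[:, i] - x_bar) / s

print("empirical E[Z]   :", Z.mean())
print("empirical E[Z^2] :", (Z ** 2).mean())
```

Comparing the printed values with the conjectured $0$ and $1$ should at least indicate whether the claim is plausible before trying to prove it.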
When considering only the centring of the random variable, I managed to get the following result: $$\begin{align} \mathbb{E}[X_i - \bar{x}] &= \mathbb{E}\Bigl[X_i - \frac{1}{N}\sum_j X_j\Bigr] \\ &= \frac{N - 1}{N} \mathbb{E}\bigl[X_i\bigr] - \frac{1}{N} \sum_{j \neq i} \mathbb{E}\bigl[X_j\bigr] \\ &= \frac{N - 1}{N} \mu - \frac{N - 1}{N} \mu = 0. \end{align}$$ Note that I explicitly took a little detour to emphasise the dependence between $X_i$ and $\bar{x}$. Is it possible to make a similar kind of derivation for the normalised variable?
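For comparison, the same detour applied to the normalised variable gives (this is just a restatement of the problem, not a solution): $$\mathbb{E}\biggl[\frac{X_i - \bar{x}}{s}\biggr] = \mathbb{E}\biggl[\frac{1}{s}\Bigl(X_i - \frac{1}{N}\sum_j X_j\Bigr)\biggr] = \frac{N - 1}{N}\,\mathbb{E}\Bigl[\frac{X_i}{s}\Bigr] - \frac{1}{N}\sum_{j \neq i}\mathbb{E}\Bigl[\frac{X_j}{s}\Bigr].$$ Unlike in the centred case, the individual terms $\mathbb{E}[X_j / s]$ do not reduce to anything obvious, because $s$ is itself a function of all the $X_j$, so this is where I get stuck.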
Is there any way to prove that this normalised variable actually has zero mean and unit variance, despite the dependencies? Or is my assumption that the normalised random variable has zero mean and unit variance simply incorrect?