Independent Random Variables - Definition, Properties & Expectation
A random variable is a function that assigns a numerical value to each outcome of an experiment. Two random variables are called independent if knowing the value of one gives no information about the value of the other: the value taken by one variable does not affect the probability of the other taking any particular value. A basic property of independent random variables is that for two such variables $A$ and $B$, the expected value of $AB$ equals the product of the expected value of $A$ and the expected value of $B$.


Two random variables $X$ and $Y$ are said to be independent if and only if

$P(\{X \in A\}\ \cap \{Y \in B\})$ = $P(\{X \in A\})P(\{Y \in B\})$

for any events $\{X \in A\}$ and $\{Y \in B\}$.

Two random variables are independent if one of them does not convey any information about the value of the other one.

An independent variable is not affected by any other variable and can stand on its own. For example, if $A$ and $B$ are two boys, the age of $A$ is independent of the age of $B$.
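The factorisation in the definition can be checked empirically. A minimal sketch, using two simulated fair dice as the independent random variables and the (arbitrarily chosen) events $\{X \le 2\}$ and $\{Y \ge 5\}$:

```python
import random

random.seed(0)
N = 100_000

# Simulate two independent fair dice X and Y.
samples = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(N)]

# Example events: A = {X <= 2}, B = {Y >= 5}.
p_A = sum(x <= 2 for x, _ in samples) / N
p_B = sum(y >= 5 for _, y in samples) / N
p_AB = sum(x <= 2 and y >= 5 for x, y in samples) / N

# For independent variables, P(A and B) should be close to P(A) * P(B).
print(p_AB, p_A * p_B)
```

With a large sample, the two printed numbers agree to within sampling error, illustrating $P(\{X \in A\} \cap \{Y \in B\}) = P(\{X \in A\})P(\{Y \in B\})$.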

Sum of Independent Random Variables

The distribution of a sum of independent random variables is the convolution of their distributions. Let $A$ and $B$ be two independent integer-valued random variables with distributions $m_1(x)$ and $m_2(x)$ respectively. Then the convolution of $m_1$ and $m_2$, written $m_3$ = $m_1 * m_2$, is given by

$m_3(j)$ = $\sum_{k}^{ }\ m_1(k)\cdot m_2(j-k)$

for $j$ = $...,\ -2,\ -1,\ 0,\ 1,\ 2,\ ...$

The function $m_3(j)$ is the distribution function of $C$ = $A\ +\ B$.


The properties of independent random variables are given here:
1) The joint distribution function of two independent variables factors into the product of their marginal distribution functions.

2) Two continuous random variables $X$ and $Y$ are independent if and only if their joint density is the product of their marginal densities.

3) If $X$ and $Y$ are two independent random variables, and $P$ = $g(X)$ and $Q$ = $h(Y)$, then $P$ and $Q$ are also independent of each other.
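Property 3 can also be checked by simulation. A sketch using two independent dice with the (arbitrarily chosen) functions $g(X) = X^2$ and $h(Y) = Y \bmod 2$:

```python
import random

random.seed(1)
N = 100_000

# X and Y are independent fair dice; P = g(X) = X^2, Q = h(Y) = Y mod 2.
pairs = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(N)]
pq = [(x ** 2, y % 2) for x, y in pairs]

# Check the factorisation for example events on P and Q:
p_A = sum(p <= 9 for p, _ in pq) / N   # event {P <= 9}, i.e. {X <= 3}
p_B = sum(q == 1 for _, q in pq) / N   # event {Q = 1},  i.e. {Y odd}
p_AB = sum(p <= 9 and q == 1 for p, q in pq) / N

print(p_AB, p_A * p_B)  # close to each other, as independence predicts
```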


If $X$ and $Y$ are two independent random variables, then the expectation of their product factors into the product of their expectations:

$E(XY)$ = $E(X)E(Y)$

If two random variables are independent, then their covariance and correlation are zero. The converse does not hold: zero covariance or zero correlation does not imply that the two variables are independent.
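Both facts can be illustrated numerically. A minimal sketch estimating $E(XY)$, $E(X)E(Y)$, and the covariance for two independent simulated dice:

```python
import random

random.seed(2)
N = 100_000

# Two independent fair dice.
xs = [random.randint(1, 6) for _ in range(N)]
ys = [random.randint(1, 6) for _ in range(N)]

e_x = sum(xs) / N
e_y = sum(ys) / N
e_xy = sum(x * y for x, y in zip(xs, ys)) / N

# For independent variables: E(XY) = E(X)E(Y), so Cov(X, Y) is near 0.
cov = e_xy - e_x * e_y
print(e_xy, e_x * e_y, cov)
```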

Example: Let $X$ and $Y$ be two independent random variables with $E(X)$ = $0.7$ and $E(Y)$ = $0.5$. Find the value of $E(XY)$.

Since $X$ and $Y$ are independent random variables, $E(XY)$ = $E(X)E(Y)$.

Hence, $E(XY)$ = $0.7\ \times\ 0.5$ = $0.35$
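The answer can be checked by simulation. One assumed concrete model consistent with the given means is $X \sim \text{Bernoulli}(0.7)$ and $Y \sim \text{Bernoulli}(0.5)$, independent:

```python
import random

random.seed(3)
N = 200_000

# Assumed model: X ~ Bernoulli(0.7), Y ~ Bernoulli(0.5), independent,
# so that E(X) = 0.7 and E(Y) = 0.5 as in the example.
xs = [1 if random.random() < 0.7 else 0 for _ in range(N)]
ys = [1 if random.random() < 0.5 else 0 for _ in range(N)]

e_xy = sum(x * y for x, y in zip(xs, ys)) / N
print(e_xy)  # close to 0.35 = 0.7 * 0.5
```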