To define the concept of independence, consider two scalar-valued random variables y1 and y2. Basically, the variables y1 and y2 are said to be independent if information on the value of y1 does not give any information on the value of y2, and vice versa. Above, we noted that this is the case with the variables s1, s2 but not with the mixture variables x1, x2.
Technically, independence can be defined in terms of probability densities. Let us denote by p(y1, y2) the joint probability density function (pdf) of y1 and y2, and by p1(y1) the marginal pdf of y1, i.e. the pdf of y1 when it is considered alone:

$$p_1(y_1) = \int p(y_1, y_2)\, dy_2,$$

and similarly for y2. Then y1 and y2 are independent if and only if the joint pdf factorizes as

$$p(y_1, y_2) = p_1(y_1)\, p_2(y_2).$$
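As a quick numerical illustration of this definition (a sketch added here, not part of the original text): for two independent standard normal variables, the joint Gaussian pdf equals the product of the two marginal pdfs at every point. The function names below are hypothetical.

```python
import math

def marginal_pdf(y):
    # Standard normal marginal density, p1(y1) (identical to p2(y2) here).
    return math.exp(-y * y / 2.0) / math.sqrt(2.0 * math.pi)

def joint_pdf(y1, y2):
    # Joint density of two independent standard normal variables.
    return math.exp(-(y1 * y1 + y2 * y2) / 2.0) / (2.0 * math.pi)

# The factorization p(y1, y2) = p1(y1) p2(y2) holds at every grid point.
for y1 in [-2.0, -0.5, 0.0, 1.0, 2.5]:
    for y2 in [-1.5, 0.0, 0.7, 3.0]:
        assert abs(joint_pdf(y1, y2) - marginal_pdf(y1) * marginal_pdf(y2)) < 1e-12
```

If the two variables were dependent (for example, jointly Gaussian with nonzero correlation), this pointwise factorization would fail.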
The definition can be used to derive a most important property of independent random variables. Given two functions, h1 and h2, we always have

$$E\{h_1(y_1)\, h_2(y_2)\} = E\{h_1(y_1)\}\, E\{h_2(y_2)\}.$$

This can be proved by writing the expectation as an integral over the joint pdf and using the factorization above, which splits the integral into a product of two one-dimensional integrals.
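The property above can be checked empirically by Monte Carlo simulation. The sketch below (assuming NumPy is available; the choices h1(y) = y^2, h2(y) = y^2 and the sample distributions are arbitrary) verifies that the expectation factorizes for independent samples and fails for dependent ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent random variables (distributions chosen arbitrarily).
y1 = rng.standard_normal(n)
y2 = rng.uniform(-1.0, 1.0, n)

# Arbitrary functions h1 and h2 applied to each variable.
h1 = y1 ** 2
h2 = y2 ** 2

# Independence: E{h1(y1) h2(y2)} = E{h1(y1)} E{h2(y2)}, up to sampling error.
lhs = np.mean(h1 * h2)
rhs = np.mean(h1) * np.mean(h2)
assert abs(lhs - rhs) < 1e-2

# Dependence breaks the property: taking y2 = y1 gives
# E{y1**4} = 3 for a standard normal, while E{y1**2}**2 = 1.
lhs_dep = np.mean(h1 * h1)
rhs_dep = np.mean(h1) * np.mean(h1)
assert lhs_dep - rhs_dep > 1.0
```

Note that the converse does not hold: uncorrelatedness (the special case h1(y) = h2(y) = y) does not imply independence, which is why this stronger property is useful.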