Advanced Properties of Probability Distributions

Definition 1: If a continuous random variable x has frequency function f(x) then the expected value of g(x) is

E[g(x)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx

Property 1: If x and y are independent random variables, then for any functions g and h

E[g(x)\, h(y)] = E[g(x)] \cdot E[h(y)]
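The property is easy to check numerically. Below is a minimal Monte Carlo sketch; the distributions and the functions g and h are arbitrary illustrative choices, not part of the text above:

```python
import numpy as np

# Monte Carlo check that E[g(x)h(y)] = E[g(x)]E[h(y)] for independent x, y.
rng = np.random.default_rng(0)
x = rng.exponential(1.0, 1_000_000)    # x ~ Exp(1)
y = rng.normal(0.0, 1.0, 1_000_000)    # y ~ N(0, 1), independent of x

g, h = x**2, np.exp(-y**2)             # g depends only on x, h only on y

print(np.mean(g * h))                  # estimate of E[g(x) h(y)]
print(np.mean(g) * np.mean(h))         # E[g(x)] E[h(y)]; the two agree
```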

Proof: Similar to the proof of Property 1b of Expectation

Definition 2: If a random variable x has frequency function f(x), then the nth moment M_n(x_0) of f(x) about x_0 is

M_n(x_0) = E[(x - x_0)^n]

We also use the following symbols for the nth moment around the origin, i.e. where x_0 = 0:

\mu_n' = M_n(0) = E[x^n]

The mean is the first moment about the origin.

\mu = \mu_1' = E[x]

We use the following symbols for the nth moment around the mean

\mu_n = M_n(\mu) = E[(x - \mu)^n]

The variance is the second moment about the mean

\sigma^2 = \mu_2 = M_2(\mu) = E[(x - \mu)^2]
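These definitions can be checked by numerical integration. Here is a short sketch using f(x) = e^{-x} for x ≥ 0, the exponential pdf that also appears in Example 1 below (an illustrative choice):

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)     # pdf of Exp(1), supported on x >= 0

def moment(n, x0):
    """nth moment of f about x0, i.e. E[(x - x0)^n]."""
    return quad(lambda x: (x - x0)**n * f(x), 0, np.inf)[0]

mean = moment(1, 0)          # first moment about the origin = mu
var = moment(2, mean)        # second moment about the mean = sigma^2
print(mean, var)             # both 1.0 for Exp(1)
```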

Definition 3: The moment-generating function of a discrete random variable x with frequency function f(x) is a function of a dummy variable θ given by

M(\theta) = E[e^{\theta x}] = \sum_x e^{\theta x} f(x)

The moment-generating function for a continuous random variable is

M(\theta) = E[e^{\theta x}] = \int_{-\infty}^{\infty} e^{\theta x} f(x)\,dx

Property 2: If the moment-generating function M(θ) of x with frequency function f(x) converges, then for each k

\mu_k' = E[x^k] = M^{(k)}(0)

where M^{(k)} denotes the kth derivative of M.

Proof: We provide the proof where x is a discrete random variable. The continuous case is similar.

Since in general,

e^{\theta x} = \sum_{n=0}^{\infty} \frac{(\theta x)^n}{n!}

it follows that

M(\theta) = \sum_x e^{\theta x} f(x) = \sum_x \left(\sum_{n=0}^{\infty} \frac{(\theta x)^n}{n!}\right) f(x) = \sum_{n=0}^{\infty} \frac{\theta^n}{n!} \sum_x x^n f(x)

= \sum_{n=0}^{\infty} \frac{\mu_n' \theta^n}{n!}

Thus, the (k+1)st term in the power series expansion of the moment-generating function is

\frac{\mu_k' \theta^k}{k!}

The result now follows by induction on k: differentiating the series k times eliminates every term with n < k, and setting θ = 0 then eliminates every term with n > k, leaving M^{(k)}(0) = \mu_k'.
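To make Property 2 concrete, here is a symbolic sketch using x ~ Exp(1), whose moment-generating function is M(θ) = 1/(1 − θ) for θ < 1 (an illustrative choice, not derived above):

```python
import sympy as sp

theta, x = sp.symbols('theta x')

M = 1 / (1 - theta)          # mgf of Exp(1), valid for theta < 1

for k in range(1, 5):
    mgf_moment = sp.diff(M, theta, k).subs(theta, 0)             # M^(k)(0)
    raw_moment = sp.integrate(x**k * sp.exp(-x), (x, 0, sp.oo))  # E[x^k]
    print(k, mgf_moment, raw_moment)   # both equal k! for each k
```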

Theorem 1: A distribution function is completely determined by its moment-generating function; i.e. two distribution functions with the same moment-generating function are equal.

Corollary 1: If x is a random variable that depends on n with frequency function f_n(x) and y is another random variable with frequency function g(x), then if

\lim_{n \to \infty} M_{f_n}(\theta) = M_g(\theta)

then

\lim_{n \to \infty} f_n(x) = g(x)
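For example, if x_n ~ Binomial(n, λ/n) and y ~ Poisson(λ), the mgf of x_n converges to the mgf of y, so by Corollary 1 the binomial distribution converges to the Poisson. A quick numeric sketch of the mgf convergence (the parameter values are arbitrary illustrative choices):

```python
import numpy as np

lam, theta = 3.0, 0.5

def mgf_binom(n):
    """mgf of Binomial(n, lam/n): (1 - p + p e^theta)^n."""
    p = lam / n
    return (1 - p + p * np.exp(theta))**n

mgf_poisson = np.exp(lam * (np.exp(theta) - 1))   # mgf of Poisson(lam)

for n in [10, 100, 10_000]:
    print(n, mgf_binom(n), mgf_poisson)   # binomial mgf -> Poisson mgf
```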

Definition 4: If x is a discrete random variable with frequency function f(x), then the moment-generating function of g(x) is

M_{g(x)}(\theta) = E[e^{\theta g(x)}] = \sum_x e^{\theta g(x)} f(x)

The equivalent for a continuous random variable x is

M_{g(x)}(\theta) = E[e^{\theta g(x)}] = \int_{-\infty}^{\infty} e^{\theta g(x)} f(x)\,dx

Properties 3: Where c is a constant and g(x) is any function for which M_{g(x)}(\theta) exists

M_{c \cdot g(x)}(\theta) = M_{g(x)}(c\theta)

M_{c + g(x)}(\theta) = e^{c\theta} M_{g(x)}(\theta)

Proof: We prove the property where x is a discrete random variable. The situation is similar where x is continuous.

M_{c \cdot g(x)}(\theta) = \sum_x e^{\theta c\, g(x)} f(x) = \sum_x e^{(c\theta) g(x)} f(x) = M_{g(x)}(c\theta)

M_{c + g(x)}(\theta) = \sum_x e^{\theta (c + g(x))} f(x) = e^{c\theta} \sum_x e^{\theta g(x)} f(x) = e^{c\theta} M_{g(x)}(\theta)
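Both identities are easy to verify by simulation. A minimal sketch with g(x) = x and x ~ N(0, 1) (illustrative choices, not from the text above):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=2_000_000)         # x ~ N(0, 1)
theta, c = 0.4, 1.7                    # arbitrary test values

mgf = lambda sample, t: np.mean(np.exp(t * sample))   # empirical mgf

print(mgf(c * x, theta), mgf(x, c * theta))           # M_cx(θ) = M_x(cθ)
print(mgf(x + c, theta),
      np.exp(c * theta) * mgf(x, theta))              # M_{x+c}(θ) = e^{cθ} M_x(θ)
```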

Property 4: The moment-generating function of the sum of n independent variables is equal to the product of the moment-generating functions of the individual variables; i.e.

M_{x_1 + \cdots + x_n}(\theta) = \prod_{i=1}^{n} M_{x_i}(\theta)

Proof: Since the x_i are independent, so are the e^{\theta x_i}, and so by Property 1

M_{x_1 + \cdots + x_n}(\theta) = E\left[e^{\theta(x_1 + \cdots + x_n)}\right] = E\left[\prod_{i=1}^{n} e^{\theta x_i}\right] = \prod_{i=1}^{n} E[e^{\theta x_i}] = \prod_{i=1}^{n} M_{x_i}(\theta)
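The property can be checked numerically. A sketch for two independent Exp(1) variables (an illustrative choice; θ must stay below 1 for the mgf of Exp(1) to exist):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.exponential(1.0, 2_000_000)
x2 = rng.exponential(1.0, 2_000_000)   # independent of x1
theta = 0.3

mgf = lambda sample: np.mean(np.exp(theta * sample))

print(mgf(x1 + x2))          # mgf of the sum, ≈ 1/(1 - θ)^2
print(mgf(x1) * mgf(x2))     # product of the individual mgfs
```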

Theorem 2 (Change of variables technique): If y = h(x) is an increasing function and f(x) is the frequency function of x, then the frequency function g(y) of y is given by

g(y) = f(h^{-1}(y)) \frac{d}{dy} h^{-1}(y) = f(x) \frac{dx}{dy} \quad \text{where } x = h^{-1}(y)

Proof: Let G(y) be the cumulative distribution function of y, let F(x) be the cumulative distribution function of x, and let h^{-1} be the inverse function of h. Since h is increasing,

G(y) = P(h(x) \le y) = P(x \le h^{-1}(y)) = F(h^{-1}(y))

Thus

g(y) = G'(y) = f(h^{-1}(y)) \frac{d}{dy} h^{-1}(y)

Now by changing variable names, we have

g(y) = f(x) \frac{dx}{dy}

where x = h^{-1}(y) and so y = h(x).

Corollary 2: If y = h(x) is a decreasing function and f(x) is the frequency function of x, then the frequency function g(y) of y is given by

g(y) = -f(x) \frac{dx}{dy} \quad \text{where } x = h^{-1}(y)

Corollary 3: If z = t(x, y) is an increasing function of y, keeping x fixed, f(x, y) is the joint frequency function of x and y, and h(x, z) is the joint frequency function of x and z, then

h(x, z) = f(x, y) \frac{\partial y}{\partial z}

Proof: If z = t(x, y) is an increasing function of y, keeping x fixed, g(y|x) is the frequency function of y|x, and k(z|x) is the frequency function of z|x, then by the theorem

k(z|x) = g(y|x) \frac{\partial y}{\partial z}

Now let f(x, y) be the joint frequency function of x and y. Then f(x, y) = f(x) · g(y|x). Similarly, if h(x, z) is the joint frequency function of x and z, we have h(x, z) = h(x) · k(z|x). Thus

h(x, z) = h(x) \cdot k(z|x) = h(x) \cdot g(y|x) \frac{\partial y}{\partial z}

Since both f(x) and h(x) are the pdf of x, f(x) = h(x), and so we have

h(x, z) = f(x) \cdot g(y|x) \frac{\partial y}{\partial z} = f(x, y) \frac{\partial y}{\partial z}

Corollary 4: If z = t(x, y) is a decreasing function of y, keeping x fixed, f(x, y) is the joint frequency function of x and y, and h(x, z) is the joint frequency function of x and z, then

h(x, z) = -f(x, y) \frac{\partial y}{\partial z}

Example 1: Suppose x has pdf f(x) = e^{-x} where x ≥ 0, and y = \sqrt{x}. Find the pdf g of y.

Since \sqrt{x} is an increasing function, with x = y^2 and dx/dy = 2y, we get

g(y) = f(x) \frac{dx}{dy} = e^{-y^2} \cdot 2y = 2y e^{-y^2} \quad \text{for } y \ge 0
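A simulation sketch confirming this result (a hypothetical check, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(3)
y = np.sqrt(rng.exponential(1.0, 1_000_000))   # y = sqrt(x), x ~ Exp(1)

# Compare a density histogram of y with the derived pdf g(y) = 2y e^{-y^2}.
hist, edges = np.histogram(y, bins=50, range=(0, 3), density=True)
mid = (edges[:-1] + edges[1:]) / 2
g = 2 * mid * np.exp(-mid**2)

print(np.max(np.abs(hist - g)))   # small -> histogram matches g(y)
```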

Example 2: Suppose x has pdf f(x) = e^{-x} for x > 0, y has pdf g(y) = e^{-y} for y > 0, and suppose that x and y are independently distributed. Define z = y/x. What is the pdf for z?

From Corollary 3, for fixed x > 0, z = y/x is an increasing function of y, with y = xz and ∂y/∂z = x, and so we have

h(x, z) = f(x, y) \frac{\partial y}{\partial z} = f(x, y) \cdot x

But since x and y are independently distributed,

f(x, y) = f(x)\, g(y) = e^{-x} e^{-y}

Combining the results, and using y = xz,

h(x, z) = x e^{-x} e^{-xz} = x e^{-x(1+z)}

and so the pdf k(z) of z is

k(z) = \int_0^\infty h(x, z)\,dx = \int_0^\infty x e^{-x(1+z)}\,dx

Let w = x(1+z). Then x = w/(1+z) and dx = dw/(1+z). It now follows that

k(z) = \int_0^\infty \frac{w}{1+z}\, e^{-w}\, \frac{dw}{1+z} = \frac{1}{(1+z)^2} \int_0^\infty w e^{-w}\,dw = \frac{1}{(1+z)^2}
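A simulation sketch for this example: since k(z) = 1/(1+z)^2, the cdf of z is K(t) = t/(1+t), which we can compare against the empirical cdf (a hypothetical check, not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(1.0, 1_000_000)
y = rng.exponential(1.0, 1_000_000)   # independent of x
z = y / x

for t in [0.5, 1.0, 2.0, 10.0]:
    print(t, np.mean(z <= t), t / (1 + t))   # empirical vs K(t) = t/(1+t)
```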

Example 3: Suppose x has standard normal distribution N(0, 1). What is the pdf of the random variable z = |x| and what is the mean of this distribution?

By Definition 1 of Basic Characteristics of the Normal Distribution, the pdf of x is (with μ = 0 and σ = 1)

f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}

and so the cumulative distribution function is

F(a) = P(x \le a) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{a} e^{-t^2/2}\,dt

Now |x| ≤ z is equivalent to −z ≤ x ≤ z, and so we have the following formula for z’s distribution function G(z):

G(z) = P(-z \le x \le z) = F(z) - F(-z)

Since the pdf g(z) is the derivative of G(z), it follows that

g(z) = G'(z) = f(z) + f(-z) = 2f(z) = \sqrt{\frac{2}{\pi}}\, e^{-z^2/2} \quad \text{for } z \ge 0

We next calculate the mean of z using the substitution

u = \frac{z^2}{2} \quad \text{so that } du = z\,dz

and so

\int z e^{-z^2/2}\,dz = \int e^{-u}\,du = -e^{-u} = -e^{-z^2/2}

Since z ≥ 0, we have

E[z] = \int_0^\infty z\,g(z)\,dz = \sqrt{\frac{2}{\pi}} \int_0^\infty z e^{-z^2/2}\,dz = \sqrt{\frac{2}{\pi}} \left[-e^{-z^2/2}\right]_0^\infty = \sqrt{\frac{2}{\pi}}

Property 5: If x ~ N(0, σ^2), then the mean of |x| is \sigma \sqrt{2/\pi}

Proof: Let w = x/σ. Then w ~ N(0, 1), and so as we saw in Example 3, E[|w|] = \sqrt{2/\pi}. But |w| = |x|/σ, and so |x| = σ|w|, from which it follows that E[|x|] = \sigma \sqrt{2/\pi}.
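A quick simulation sketch of Property 5 (σ = 2 is an arbitrary illustrative value):

```python
import numpy as np

rng = np.random.default_rng(5)
sigma = 2.0
x = rng.normal(0.0, sigma, 1_000_000)   # x ~ N(0, sigma^2)

print(np.mean(np.abs(x)))               # empirical E[|x|]
print(sigma * np.sqrt(2 / np.pi))       # sigma * sqrt(2/pi) ≈ 1.5958
```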

