Expectation

Definition 1: If a discrete random variable x has frequency function f(x) then the expected value of g(x) is defined as

E[g(x)] = Σ g(x) f(x)

where the sum is taken over all values x in the sample space.
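
For example, if x is the outcome of a single roll of a fair die, so that f(x) = 1/6 for x = 1, …, 6, and g(x) = x², then E[g(x)] = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6 ≈ 15.17.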

Observation: The equivalent for a continuous random variable x is the total area between the curve of the function h(x) = f(x) g(x) and the x-axis. For those of you familiar with calculus,

E[g(x)] = ∫ g(x) f(x) dx

where the integral is taken over all values of x.

In order to avoid using calculus, we will restrict ourselves to the discrete case in the rest of this chapter, although all the results shown here for discrete random variables extend to continuous random variables. Click here for more details about how to extend the results presented here to continuous distributions.

Property 1: For any random variables x and y and any constant c

  a. E[c] = c
  b. E[c g(x)] = c E[g(x)]
  c. E[g(x) + h(x)] = E[g(x)] + E[h(x)]
  d. E[xy] = E[x] ∙ E[y] if x and y are independent

Proof: (a)–(c) are simple consequences of Definition 1. (d) is a consequence of Property 2 of Discrete Distributions.
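
These rules are easy to check numerically. The following Python snippet is a minimal sketch (not part of the original text) that uses a fair six-sided die as an illustrative distribution and verifies (b), (c), and (d) exactly using rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# A fair six-sided die: f(x) = 1/6 for x = 1, ..., 6
outcomes = range(1, 7)
p = Fraction(1, 6)

def E(g):
    """Expected value of g(x) for one die: sum of g(x) * f(x)."""
    return sum(g(x) * p for x in outcomes)

def E_joint(g):
    """Expected value of g(x, y) for two independent dice."""
    return sum(g(x, y) * p * p for x, y in product(outcomes, repeat=2))

Ex = E(lambda x: x)                      # E[x] = 7/2

# (b) E[c g(x)] = c E[g(x)]
assert E(lambda x: 3 * x**2) == 3 * E(lambda x: x**2)

# (c) E[g(x) + h(x)] = E[g(x)] + E[h(x)]
assert E(lambda x: x + x**2) == E(lambda x: x) + E(lambda x: x**2)

# (d) E[xy] = E[x] * E[y] when x and y are independent
assert E_joint(lambda x, y: x * y) == Ex * Ex
```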

Definition 2: If a random variable x has frequency function f(x) then the (population) mean μ of f(x) is defined as

μ = E[x] = Σ x f(x)

Here the function g(x) in Definition 1 is the identity function g(x) = x.

The (population) variance σ² is defined as

σ² = Var(x) = E[(x − μ)²] = Σ (x − μ)² f(x)
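
For example, for a single roll of a fair die, μ = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5 and σ² = [(1 − 3.5)² + (2 − 3.5)² + … + (6 − 3.5)²]/6 = 35/12 ≈ 2.92.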

Property 2: The variance can also be expressed as

Var(x) = E[x²] − μ²

Proof: By Property 1,

Var(x) = E[(x − μ)²] = E[x² − 2μx + μ²] = E[x²] − 2μE[x] + μ² = E[x²] − 2μ² + μ² = E[x²] − μ²
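
For the die example above, E[x²] = 91/6 and μ² = (3.5)² = 49/4, so Var(x) = 91/6 − 49/4 = 35/12, which agrees with the direct calculation.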

Property 3: For any random variable x and constants a and b

Mean(ax + b) = a ∙ Mean(x) + b = aμ + b

Var(ax + b) = a² ∙ Var(x) = a²σ²

Proof: The first assertion is a consequence of Property 1, namely:

Mean(ax + b) = E[ax + b] = aE[x] + E[b] = aμ + b

For the second assertion, by Properties 1 and 2, we have:

Var(ax + b) = E[(ax + b)²] − (E[ax + b])² = E[a²x² + 2abx + b²] − (aμ + b)²
= a²E[x²] + 2abμ + b² − a²μ² − 2abμ − b² = a²(E[x²] − μ²) = a²Var(x) = a²σ²
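
For example, for the die roll x with μ = 3.5 and σ² = 35/12, Mean(2x + 3) = 2(3.5) + 3 = 10 and Var(2x + 3) = 2² ∙ (35/12) = 35/3.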

Observation: It follows from Property 3 (taking a = 0) that for any constant b, Mean(b) = b and Var(b) = 0.

Property 4:

Mean(x + y) = Mean(x) + Mean(y)

Var(x + y) = Var(x) + Var(y) if x and y are independent

Proof: The first assertion follows from Property 1:

Mean(x + y) = E[x + y] = E[x] + E[y] = Mean(x) + Mean(y)

For the second assertion, by Properties 1 and 2,

Var(x + y) = E[(x + y)²] − (E[x + y])² = E[x²] + 2E[xy] + E[y²] − (E[x] + E[y])²
= (E[x²] − (E[x])²) + (E[y²] − (E[y])²) + 2(E[xy] − E[x]E[y]) = Var(x) + Var(y) + 2(E[xy] − E[x]E[y])

But by Property 1d, E[xy] – E[x]E[y] = 0 since x and y are independent, and so

Var(x + y) = Var(x) + Var(y)
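
For example, if x and y are the outcomes of two independent rolls of a fair die, then E[x + y] = 3.5 + 3.5 = 7 and Var(x + y) = 35/12 + 35/12 = 35/6 ≈ 5.83.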

Definition 3: For any random variable x with mean μ and standard deviation σ, the standardization z of x is defined by

z = (x − μ) / σ

Property 5: The standardization of any random variable has mean 0 and variance 1.

Proof: Since

z = (x − μ)/σ = (1/σ) x − μ/σ

i.e. z is a linear function of x with a = 1/σ and b = −μ/σ,

by Property 3 the mean of z is

Mean(z) = (1/σ) Mean(x) − μ/σ = μ/σ − μ/σ = 0

and the variance of z is

Var(z) = (1/σ)² Var(x) = σ²/σ² = 1

Excel Function: Excel provides the following function for calculating the value of z from x, μ and σ:

STANDARDIZE(x, μ, σ) = (x – μ) / σ
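
For example, if a score x = 115 comes from a population with μ = 100 and σ = 15, then STANDARDIZE(115, 100, 15) = (115 − 100)/15 = 1.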

Definition 4: The nth moment around the mean is defined as

μₙ = E[(x − μ)ⁿ]

Click here for more advanced information about moments and related subjects.

Observation: It follows from Definitions 2 and 4 that the variance can be expressed as

σ² = E[(x − μ)²] = μ₂

In Symmetry and Kurtosis we define the skewness and kurtosis of a sample. We now define the population equivalents of these concepts as follows.

Definition 5: The (population) skewness is defined as

skewness = μ₃/σ³ = E[(x − μ)³]/σ³

The (population) kurtosis is defined as

kurtosis = μ₄/σ⁴ − 3 = E[(x − μ)⁴]/σ⁴ − 3

Observation: The 3 in the kurtosis definition is the value of μ₄/σ⁴ for the normal distribution function (see Normal Distribution). Thus the kurtosis of the normal distribution function is 0.
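
As an illustration of Definitions 2, 4, and 5 (a sketch added here, again using a fair six-sided die as the example distribution), the following Python snippet computes the population mean, variance, skewness, and kurtosis directly from the moment definitions:

```python
from fractions import Fraction

# Fair six-sided die: f(x) = 1/6 for x = 1, ..., 6
outcomes = range(1, 7)
p = Fraction(1, 6)

mu = sum(x * p for x in outcomes)             # population mean, 7/2

def moment(n):
    """nth moment about the mean, E[(x - mu)^n]."""
    return sum((x - mu) ** n * p for x in outcomes)

var = moment(2)                               # sigma^2 = 35/12
sigma = float(var) ** 0.5
skewness = float(moment(3)) / sigma ** 3      # mu_3 / sigma^3
kurtosis = float(moment(4)) / sigma ** 4 - 3  # mu_4 / sigma^4 - 3

print(mu, var, skewness, kurtosis)            # 7/2 35/12 0.0 -1.2686...
```

The skewness comes out to 0, as expected for a symmetric distribution, and the kurtosis is negative since a flat distribution such as the die roll has thinner tails than the normal distribution.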
