Chapter 6: Functions of Random Variables

6.5: The Method of Moment-Generating Functions

Theorem 6.1

Let $m_X(t)$ and $m_Y(t)$ denote the moment-generating functions of random variables $X$ and $Y$, respectively. If both moment-generating functions exist and $m_X(t) = m_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution.

(The proof of Theorem 6.1 is beyond the scope of this text.)

Example 6.10

Suppose that $Y$ is a normally distributed random variable with mean $\mu$ and variance $\sigma^2$. Show that

$$Z = \frac{Y - \mu}{\sigma}$$

has a standard normal distribution, that is, a normal distribution with mean $0$ and variance $1$.
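A sketch of the argument, using the moment-generating function of a normal random variable, $m_Y(t) = e^{\mu t + \sigma^2 t^2/2}$:

```latex
m_Z(t) = E\left(e^{tZ}\right)
       = E\left(e^{t(Y-\mu)/\sigma}\right)
       = e^{-\mu t/\sigma}\, m_Y\!\left(\frac{t}{\sigma}\right)
       = e^{-\mu t/\sigma}\, e^{\mu t/\sigma + t^2/2}
       = e^{t^2/2}.
```

This is the moment-generating function of a normal random variable with mean $0$ and variance $1$, so Theorem 6.1 gives the result.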


Example 6.11

Let $Z$ be a normally distributed random variable with mean $0$ and variance $1$. Use the method of moment-generating functions to find the probability distribution of $Z^2$.
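A sketch of the computation: for $t < 1/2$,

```latex
m_{Z^2}(t) = E\left(e^{tZ^2}\right)
           = \int_{-\infty}^{\infty} e^{tz^2}\, \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz
           = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-z^2(1-2t)/2}\, dz
           = (1 - 2t)^{-1/2}.
```

This is the moment-generating function of a gamma random variable with $\alpha = 1/2$ and $\beta = 2$, that is, a $\chi^2$ distribution with $1$ degree of freedom.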


Theorem 6.2

Let $Y_1, Y_2, \ldots, Y_n$ be independent random variables with moment-generating functions $m_{Y_1}(t), m_{Y_2}(t), \ldots, m_{Y_n}(t)$, respectively. If $U = Y_1 + Y_2 + \cdots + Y_n$, then

$$m_U(t) = m_{Y_1}(t) \times m_{Y_2}(t) \times \cdots \times m_{Y_n}(t).$$
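The proof is a direct consequence of independence: the expectation of a product of functions of independent random variables factors into the product of the expectations, so

```latex
m_U(t) = E\left(e^{tU}\right)
       = E\left(e^{tY_1} e^{tY_2} \cdots e^{tY_n}\right)
       = E\left(e^{tY_1}\right) E\left(e^{tY_2}\right) \cdots E\left(e^{tY_n}\right)
       = m_{Y_1}(t)\, m_{Y_2}(t) \cdots m_{Y_n}(t).
```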


Theorem 6.3

Let $Y_1, Y_2, \ldots, Y_n$ be independent normally distributed random variables with $E(Y_i) = \mu_i$ and $V(Y_i) = \sigma_i^2$, for $i = 1, 2, \ldots, n$, and let $a_1, a_2, \ldots, a_n$ be constants. If

$$U = \sum_{i = 1}^n a_i Y_i = a_1 Y_1 + a_2 Y_2 + \cdots + a_n Y_n,$$

then $U$ is a normally distributed random variable with

$$E(U) = \sum_{i = 1}^n a_i \mu_i = a_1 \mu_1 + a_2 \mu_2 + \cdots + a_n \mu_n$$

and

$$V(U) = \sum_{i = 1}^n a_i^2 \sigma_i^2 = a_1^2 \sigma_1^2 + a_2^2 \sigma_2^2 + \cdots + a_n^2 \sigma_n^2.$$
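A proof sketch combines Theorem 6.2 with the normal moment-generating function $m_{Y_i}(t) = e^{\mu_i t + \sigma_i^2 t^2/2}$ and the fact that $m_{aY}(t) = m_Y(at)$:

```latex
m_U(t) = \prod_{i=1}^n m_{Y_i}(a_i t)
       = \prod_{i=1}^n \exp\!\left(\mu_i a_i t + \frac{\sigma_i^2 a_i^2 t^2}{2}\right)
       = \exp\!\left(t \sum_{i=1}^n a_i \mu_i + \frac{t^2}{2} \sum_{i=1}^n a_i^2 \sigma_i^2\right),
```

which is the moment-generating function of a normal random variable with the stated mean and variance.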


Theorem 6.4

Let $Y_1, Y_2, \ldots, Y_n$ be defined as in Theorem 6.3, and define $Z_i$ by

$$Z_i = \frac{Y_i - \mu_i}{\sigma_i}, \quad i = 1, 2, \ldots, n.$$

Then $\sum_{i = 1}^n Z_i^2$ has a $\chi^2$ distribution with $n$ degrees of freedom.
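Theorem 6.4 can be checked numerically: a $\chi^2$ random variable with $n$ degrees of freedom has mean $n$ and variance $2n$, so simulated sums of squared standardized normals should match those moments. A sketch (the particular means and standard deviations below are arbitrary choices, not from the text):

```python
import random
import statistics

def chi_square_sample(mus, sigmas, reps=20000, seed=1):
    """Simulate sum_i Z_i^2 with Z_i = (Y_i - mu_i) / sigma_i."""
    rng = random.Random(seed)
    samples = []
    for _ in range(reps):
        total = 0.0
        for mu, sigma in zip(mus, sigmas):
            y = rng.gauss(mu, sigma)          # Y_i ~ Normal(mu_i, sigma_i^2)
            total += ((y - mu) / sigma) ** 2  # standardize, then square
        samples.append(total)
    return samples

n = 4
s = chi_square_sample(mus=[0, 1, -2, 5], sigmas=[1, 2, 0.5, 3])
print(statistics.mean(s))      # close to n = 4
print(statistics.variance(s))  # close to 2n = 8
```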


Summary of the Moment-Generating Function Method

Let $U$ be a function of the random variables $Y_1, Y_2, \ldots, Y_n$.

  1. Find the moment-generating function for $U$, $m_U(t)$.
  2. Compare $m_U(t)$ with other well-known moment-generating functions. If $m_U(t) = m_V(t)$ for all values of $t$, Theorem 6.1 implies that $U$ and $V$ have identical distributions.
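The comparison in step 2 can also be illustrated empirically. The sketch below simulates $U = Y_1 + Y_2$ for independent exponential variables with mean $1$ and checks that the sample average of $e^{tU}$ agrees with the gamma$(\alpha = 2, \beta = 1)$ moment-generating function $(1 - t)^{-2}$ at one value of $t < 1$ (a numerical illustration, not a proof):

```python
import math
import random

rng = random.Random(42)
reps = 100_000
t = 0.2

# Empirically estimate m_U(t) = E(e^{tU}) for U = Y1 + Y2.
acc = 0.0
for _ in range(reps):
    u = rng.expovariate(1.0) + rng.expovariate(1.0)  # sum of two Exp(mean 1)
    acc += math.exp(t * u)

empirical = acc / reps
theoretical = (1 - t) ** -2  # gamma(2, 1) MGF, valid for t < 1
print(empirical, theoretical)  # the two values should be close
```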

Exercises

6.37

Let $Y_1, Y_2, \ldots, Y_n$ be independent and identically distributed random variables such that for $0 < p < 1$, $P(Y_i = 1) = p$ and $P(Y_i = 0) = q = 1 - p$. (Such random variables are called Bernoulli random variables.)

a) Find the moment-generating function for the Bernoulli random variable $Y_1$.

b) Find the moment-generating function for $W = Y_1 + Y_2 + \cdots + Y_n$.

c) What is the distribution of $W$?

6.38

Let $Y_1$ and $Y_2$ be independent random variables with moment-generating functions $m_{Y_1}(t)$ and $m_{Y_2}(t)$, respectively. If $a_1$ and $a_2$ are constants, and $U = a_1 Y_1 + a_2 Y_2$, show that the moment-generating function for $U$ is $m_U(t) = m_{Y_1}(a_1 t) \times m_{Y_2}(a_2 t)$.

6.39

In Exercises 6.11 and 6.25, we considered two electronic components that operate independently, each with a life length governed by the exponential distribution with mean $1$. Use the method of moment-generating functions to obtain the density function for the average life length of the two components.

6.40

Suppose that $Y_1$ and $Y_2$ are independent, standard normal random variables. Find the density function of $U = Y_1^2 + Y_2^2$.

6.41

Let $Y_1, Y_2, \ldots, Y_n$ be independent, normal random variables, each with mean $\mu$ and variance $\sigma^2$. Let $a_1, a_2, \ldots, a_n$ denote known constants. Find the density function of the linear combination $U = \sum_{i = 1}^n a_i Y_i$.

6.42

A type of elevator has a maximum weight capacity $Y_1$, which is normally distributed with mean $5000$ pounds and standard deviation $300$ pounds. For a certain building equipped with this type of elevator, the elevator's load, $Y_2$, is a normally distributed random variable with mean $4000$ pounds and standard deviation $400$ pounds. For any given time that the elevator is in use, find the probability that it will be overloaded, assuming that $Y_1$ and $Y_2$ are independent.
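Answers to exercises like this one can be verified numerically. By Theorem 6.3, the difference $D = Y_2 - Y_1$ is normal, and an overload is the event $D > 0$; a sketch of the check:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# By Theorem 6.3, D = Y2 - Y1 is normal with
# mean 4000 - 5000 = -1000 and variance 400**2 + 300**2 = 250_000.
mean_d = 4000 - 5000
sd_d = math.sqrt(400**2 + 300**2)  # 500

# Overload probability: P(D > 0) = P(Z > (0 - mean_d) / sd_d) = P(Z > 2).
p_overload = 1 - normal_cdf((0 - mean_d) / sd_d)
print(round(p_overload, 4))  # 0.0228
```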

6.43

Refer to Exercise 6.41. Let $Y_1, Y_2, \ldots, Y_n$ be independent, normal random variables, each with mean $\mu$ and variance $\sigma^2$.

a) Find the density function of $\overline{Y} = \frac{1}{n} \sum_{i = 1}^n Y_i$.

b) If $\sigma^2 = 16$ and $n = 25$, what is the probability that the sample mean, $\overline{Y}$, takes on a value that is within one unit of the population mean, $\mu$? That is, find $P(|\overline{Y} - \mu| \leq 1)$.

c) If $\sigma^2 = 16$, find $P(|\overline{Y} - \mu| \leq 1)$ if $n = 36$, $n = 64$, and $n = 81$. Interpret the results of your calculations.

6.44

The weight (in pounds) of "medium-size" watermelons is normally distributed with mean $15$ and variance $4$. A packing container for several melons has a nominal capacity of $140$ pounds. What is the maximum number of melons that should be placed in a single packing container if the nominal weight limit is to be exceeded only $5\%$ of the time? Give reasons for your answer.
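One way to check an answer to this exercise: by Theorem 6.3 (with every $a_i = 1$), the total weight of $n$ melons is normal with mean $15n$ and variance $4n$, so the exceedance probability can be computed directly for each candidate $n$. A sketch:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Total weight of n melons: Normal(mean 15n, variance 4n) by Theorem 6.3.
# Find the largest n with P(total > 140) <= 0.05.
best = 0
for n in range(1, 12):
    p_exceed = 1 - normal_cdf((140 - 15 * n) / math.sqrt(4 * n))
    if p_exceed <= 0.05:
        best = n  # p_exceed grows with n, so only small n qualify

print(best)  # 8
```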

6.45

The manager of a construction job needs to figure prices carefully before submitting a bid. He also needs to account for uncertainty (variability) in the amounts of products he might need. To oversimplify the real situation, suppose that a project manager treats the amount of sand, in yards, needed for a construction project as a random variable $Y_1$, which is normally distributed with mean $10$ yards and standard deviation $.5$ yard. The amount of cement mix needed, in hundreds of pounds, is a random variable $Y_2$, which is normally distributed with mean $4$ and standard deviation $.2$. The sand costs $\$7$ per yard, and the cement mix costs $\$3$ per hundred pounds. Adding $\$100$ for other costs, he computes his total cost to be

$$U = 100 + 7Y_1 + 3Y_2.$$

If $Y_1$ and $Y_2$ are independent, how much should the manager bid to ensure that the true costs will exceed the amount bid with a probability of only $.01$? Is the independence assumption reasonable here?
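A numerical check is possible here as well: by Theorem 6.3, $U$ is normal, so the bid is the $.99$ quantile of $U$. The quantile routine below inverts the standard normal CDF by bisection (a sketch, not the text's method):

```python
import math

def normal_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def normal_quantile(p):
    """Invert the standard normal CDF by bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# U = 100 + 7*Y1 + 3*Y2 is normal by Theorem 6.3.
mean_u = 100 + 7 * 10 + 3 * 4                     # 182
sd_u = math.sqrt(7**2 * 0.5**2 + 3**2 * 0.2**2)   # sqrt(12.61)

# Bid so that P(U > bid) = .01, i.e. the .99 quantile of U.
bid = mean_u + normal_quantile(0.99) * sd_u
print(round(bid, 2))
```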

6.46

Suppose that $Y$ has a gamma distribution with $\alpha = n/2$ for some positive integer $n$ and $\beta$ equal to some specified value. Use the method of moment-generating functions to show that $W = 2Y/\beta$ has a $\chi^2$ distribution with $n$ degrees of freedom.

6.47

A random variable $Y$ has a gamma distribution with $\alpha = 3.5$ and $\beta = 4.2$. Use the result in Exercise 6.46 and the percentage points for the $\chi^2$ distributions given in Table 6, Appendix 3, to find $P(Y > 33.627)$.

6.48

In a missile-testing program, one random variable of interest is the distance between the point at which the missile lands and the center of the target at which the missile was aimed. If we think of the center of the target as the origin of a coordinate system, we can let $Y_1$ denote the north-south distance between the landing point and the target center and let $Y_2$ denote the corresponding east-west distance. (Assume that north and east define positive directions.) The distance between the landing point and the target center is then $U = \sqrt{Y_1^2 + Y_2^2}$. If $Y_1$ and $Y_2$ are independent, standard normal random variables, find the probability density function for $U$.

6.49

Let $Y_1$ be a binomial random variable with $n_1$ trials and probability of success given by $p$. Let $Y_2$ be another binomial random variable with $n_2$ trials and probability of success also given by $p$. If $Y_1$ and $Y_2$ are independent, find the probability function of $Y_1 + Y_2$.

6.50

Let $Y$ be a binomial random variable with $n$ trials and probability of success given by $p$. Show that $n - Y$ is a binomial random variable with $n$ trials and probability of success given by $1 - p$.

6.51

Let $Y_1$ be a binomial random variable with $n_1$ trials and $p_1 = .2$ and $Y_2$ be an independent binomial random variable with $n_2$ trials and $p_2 = .8$. Find the probability function of $Y_1 + n_2 - Y_2$.

6.52

Let $Y_1$ and $Y_2$ be independent Poisson random variables with means $\lambda_1$ and $\lambda_2$, respectively. Find the

a) probability function of $Y_1 + Y_2$.

b) conditional probability function of $Y_1$, given that $Y_1 + Y_2 = m$.

6.53

Let $Y_1, Y_2, \ldots, Y_n$ be independent binomial random variables with $n_i$ trials and probability of success given by $p_i$, $i = 1, 2, \ldots, n$.

a) If all of the $n_i$'s are equal and all of the $p_i$'s are equal, find the distribution of $\sum_{i = 1}^n Y_i$.

b) If all of the $n_i$'s are different and all of the $p_i$'s are equal, find the distribution of $\sum_{i = 1}^n Y_i$.

c) If all of the $n_i$'s are different and all of the $p_i$'s are equal, find the conditional distribution of $Y_1$ given $\sum_{i = 1}^n Y_i = m$.

d) If all of the $n_i$'s are different and all of the $p_i$'s are equal, find the conditional distribution of $Y_1 + Y_2$ given $\sum_{i = 1}^n Y_i = m$.

e) If all of the $p_i$'s are different, does the method of moment-generating functions work well to find the distribution of $\sum_{i = 1}^n Y_i$? Why?

6.54

Let $Y_1, Y_2, \ldots, Y_n$ be independent Poisson random variables with means $\lambda_1, \lambda_2, \ldots, \lambda_n$, respectively. Find the

a) probability function of $\sum_{i = 1}^n Y_i$.

b) conditional probability function of $Y_1$, given that $\sum_{i = 1}^n Y_i = m$.

c) conditional probability function of $Y_1 + Y_2$, given that $\sum_{i = 1}^n Y_i = m$.

6.55

Customers arrive at a department store checkout counter according to a Poisson distribution with a mean of $7$ per hour. In a given two-hour period, what is the probability that $20$ or more customers will arrive at the counter?
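An answer can be checked by direct computation: the two-hour count is Poisson with mean $2 \times 7 = 14$ (a sum of independent Poisson counts is Poisson; compare Exercise 6.52), so the tail probability is one minus a finite sum of Poisson probabilities. A sketch:

```python
import math

lam = 14  # mean arrivals in two hours: 2 * 7

# P(X >= 20) = 1 - P(X <= 19) for X ~ Poisson(lam).
p_less_than_20 = sum(math.exp(-lam) * lam**k / math.factorial(k)
                     for k in range(20))
p_at_least_20 = 1 - p_less_than_20
print(round(p_at_least_20, 4))
```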

6.56

The length of time necessary to tune up a car is exponentially distributed with a mean of $.5$ hour. If two cars are waiting for a tune-up and the service times are independent, what is the probability that the total time for the two tune-ups will exceed $1.5$ hours? [Hint: Recall the result of Example 6.12.]

6.57

Let $Y_1, Y_2, \ldots, Y_n$ be independent random variables such that each $Y_i$ has a gamma distribution with parameters $\alpha_i$ and $\beta$. That is, the distributions of the $Y_i$'s might have different $\alpha$'s, but all have the same value for $\beta$. Prove that $U = Y_1 + Y_2 + \cdots + Y_n$ has a gamma distribution with parameters $\alpha_1 + \alpha_2 + \cdots + \alpha_n$ and $\beta$.

6.58

We saw in Exercise 5.159 that the negative binomial random variable $Y$ can be written as $Y = \sum_{i = 1}^r W_i$, where $W_1, W_2, \ldots, W_r$ are independent geometric random variables with parameter $p$.

a) Use this fact to derive the moment-generating function for $Y$.

b) Use the moment-generating function to show that $E(Y) = r/p$ and $V(Y) = r(1 - p)/p^2$.

c) Find the conditional probability function for $W_1$, given that $Y = W_1 + W_2 + \cdots + W_r = m$.

6.59

Show that if $Y_1$ has a $\chi^2$ distribution with $\nu_1$ degrees of freedom and $Y_2$ has a $\chi^2$ distribution with $\nu_2$ degrees of freedom, then $U = Y_1 + Y_2$ has a $\chi^2$ distribution with $\nu_1 + \nu_2$ degrees of freedom, provided that $Y_1$ and $Y_2$ are independent.

6.60

Suppose that $W = Y_1 + Y_2$, where $Y_1$ and $Y_2$ are independent. If $W$ has a $\chi^2$ distribution with $\nu$ degrees of freedom and $Y_1$ has a $\chi^2$ distribution with $\nu_1 < \nu$ degrees of freedom, show that $Y_2$ has a $\chi^2$ distribution with $\nu - \nu_1$ degrees of freedom.

6.61

Refer to Exercise 6.52. Suppose that $W = Y_1 + Y_2$, where $Y_1$ and $Y_2$ are independent. If $W$ has a Poisson distribution with mean $\lambda$ and $Y_1$ has a Poisson distribution with mean $\lambda_1 < \lambda$, show that $Y_2$ has a Poisson distribution with mean $\lambda - \lambda_1$.

6.62

Let $Y_1$ and $Y_2$ be independent normal random variables, each with mean $0$ and variance $\sigma^2$. Define $U_1 = Y_1 + Y_2$ and $U_2 = Y_1 - Y_2$. Show that $U_1$ and $U_2$ are independent normal random variables, each with mean $0$ and variance $2\sigma^2$. [Hint: If $(U_1, U_2)$ has a joint moment-generating function $m(t_1, t_2)$, then $U_1$ and $U_2$ are independent if and only if $m(t_1, t_2) = m_{U_1}(t_1) m_{U_2}(t_2)$.]