Study notes on Probability Distribution Functions

By Himanshu Verma | Updated: November 5th, 2017

Distribution Functions and Discrete Random Variables

Distribution Functions

1. Cumulative Distribution Function (cdf): FX(t) = P(X ≤ t), -∞ < t < ∞.

Example:

If a die is thrown repeatedly and X is the number of throws until the first 6 appears, let’s work out P(X≤t) for some values of t.

P(X≤1) is the probability that the number of throws until we get a 6 is at most 1, i.e., that X is either 0 or 1.

P(X=0) = 0 and P(X=1) = 1/6. Hence P(X≤1) = 1/6

Similarly, P(X≤2) = P(X=0) + P(X=1) + P(X=2) = 0 + 1/6 + 5/36 = 11/36
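These cdf values are easy to verify numerically. Below is a minimal Python sketch (standard library only; the helper name geometric_cdf is ours) that sums the geometric pmf p(x) = (5/6)^(x-1)·(1/6):

```python
# Sum the geometric pmf p(x) = (1-p)^(x-1) * p for x = 1..t
# to get the cdf of X, the number of throws until the first 6.
def geometric_cdf(t, p=1/6):
    return sum((1 - p) ** (x - 1) * p for x in range(1, t + 1))

print(geometric_cdf(1))  # 1/6   ~ 0.1667
print(geometric_cdf(2))  # 11/36 ~ 0.3056
```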

2. FX(t) is non-decreasing; 0 ≤ FX(t) ≤ 1; lim(t→-∞) FX(t) = 0 and lim(t→∞) FX(t) = 1.

If c<d, then FX(c) ≤ FX(d); P(c<X≤d) = FX(d) - FX(c); P(X>c) = 1 - FX(c).

The cdf of a discrete random variable: a step function.

Special Discrete Distributions

Bernoulli and Binomial Random Variables

1. Bernoulli trials: an experiment with two different possible outcomes

Bernoulli random variable X with parameter p, p is the probability of a success.

pX(x) = p^x (1-p)^(1-x) = p^x q^(1-x), for x ∈ RX = {0, 1}

Expected value: E[X] = μX = p; variance: Var(X) = σX² = p(1-p) = pq

Example: In a throw of a fair die, the event of obtaining a 4 or 6 is called a success (p = 1/3), and the event of obtaining a 1, 2, 3, or 5 is called a failure (q = 2/3).

2. Binomial distribution: number of successes to occur in n repeated, independent Bernoulli trials

Binomial random variable Y with parameters n and p

pY(y) = C(n, y) p^y (1-p)^(n-y) = C(n, y) p^y q^(n-y), for y = 0, 1, 2, …, n, where C(n, y) = n!/(y!(n-y)!)

Expected value: E(Y) = μY = np; variance: Var(Y) = σY² = np(1-p) = npq

Example: A restaurant serves 8 entrees of fish, 12 of beef, and 10 of poultry. If customers select from these entrees randomly, what is the probability that two of the next four customers order fish entrees?
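A sketch of this computation in Python: p = 8/30 is the chance a random customer orders fish, and Y ~ Binomial(4, p); the helper name binomial_pmf is ours.

```python
from math import comb

# P(Y = k) = C(n, k) p^k (1-p)^(n-k) for Y ~ Binomial(n, p)
def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

p_fish = 8 / 30  # 8 fish entrees out of 30 total
print(binomial_pmf(2, 4, p_fish))  # P(exactly 2 of next 4 order fish) ~ 0.229
```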

Multinomial Random Variables

1. Multinomial trials: an experiment with k≥2 different possible outcomes

2. Multinomial distribution: n independent multinomial trials

Multinomial random variables X1, X2, …, Xk with parameters n, p1, p2, …, pk; Xi: the number of trials resulting in the ith outcome; x1 + x2 + … + xk = n; p1 + p2 + … + pk = 1.

pX1,…,Xk(x1, x2, …, xk) = [n!/(x1! x2! … xk!)] p1^x1 p2^x2 … pk^xk, for x1 + x2 + … + xk = n

Example: Draw 15 balls with replacement from a box containing 20 red, 10 white, 30 black, and 50 green. What is the probability of 7R, 2W, 4B, 2G?
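The multinomial pmf above answers this directly. Here is a sketch (standard library only, helper name ours) with p = (20/110, 10/110, 30/110, 50/110) and counts (7, 2, 4, 2):

```python
from math import factorial

# Multinomial pmf: n!/(x1!...xk!) * p1^x1 * ... * pk^xk
def multinomial_pmf(counts, probs):
    coef = factorial(sum(counts))
    for x in counts:
        coef //= factorial(x)  # multinomial coefficient (always an integer)
    prob = 1.0
    for x, p in zip(counts, probs):
        prob *= p ** x
    return coef * prob

counts = [7, 2, 4, 2]                     # 7R, 2W, 4B, 2G in 15 draws
probs = [20/110, 10/110, 30/110, 50/110]  # box holds 110 balls in total
print(multinomial_pmf(counts, probs))     # ~ 1.7e-4
```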

Geometric distribution: trial number of the first success to occur in a sequence of independent Bernoulli trials

Geometric random variable N with parameter p

pN(n) = (1-p)^(n-1) p = q^(n-1) p, for n = 1, 2, 3, …

Geometric series: Σ(n=1 to ∞) a r^(n-1) = a/(1-r), for |r| < 1

Expected value: E(N) = μN = 1/p; variance: Var(N) = σN² = (1-p)/p² = q/p²

Example: From an ordinary deck of 52 cards we draw cards at random, with replacement, and successively until an ace is drawn. What is the probability that at least 10 draws are needed?
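Since "at least 10 draws are needed" means the first 9 draws are all failures, the answer is simply (48/52)^9. A one-line sketch:

```python
# Each draw is a Bernoulli trial with p = 4/52 (drawing an ace).
p = 4 / 52
print((1 - p) ** 9)  # P(N >= 10) = (48/52)^9 ~ 0.487
```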

Memoryless property of geometric random variables: In successive independent Bernoulli trials, the probability that the next n outcomes are all failures does not change if we are given that the previous m successive outcomes were all failures.

PN(N>n+m|N>m)=PN(N>n)
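A Monte Carlo sketch can make the memoryless property concrete. The parameters below (p = 1/6, n = 4, m = 3) are illustrative choices, not from the notes; both printed estimates should approach (1-p)^4 ≈ 0.48:

```python
import random

# Sample N = trial number of the first success in Bernoulli(p) trials.
def sample_geometric(p):
    n = 1
    while random.random() >= p:
        n += 1
    return n

p, n, m, trials = 1/6, 4, 3, 200_000
samples = [sample_geometric(p) for _ in range(trials)]
beyond_m = [s for s in samples if s > m]

# Conditional tail P(N > n+m | N > m) vs. unconditional tail P(N > n)
print(sum(s > n + m for s in beyond_m) / len(beyond_m))
print(sum(s > n for s in samples) / trials)
```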

Negative binomial distribution: trial number of the rth success to occur

Negative binomial random variable N, with parameters r, p

pN(n) = C(n-1, r-1) p^r (1-p)^(n-r), for n = r, r+1, r+2, …

Expected value: E(N) = μN = r/p; variance: Var(N) = σN² = r(1-p)/p² = rq/p²

Example: Sharon and Ann play a series of backgammon games until one of them wins five games. Suppose that the games are independent and the probability that Sharon wins a game is 0.58. Find the probability that the series ends in seven games.
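The series ends in seven games when either player's fifth win lands exactly on game 7, so we add two negative binomial terms (r = 5, n = 7). A sketch, with the helper name ours:

```python
from math import comb

# P(rth success occurs exactly on trial n) for Bernoulli(p) trials
def neg_binomial_pmf(n, r, p):
    return comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

p = 0.58  # probability that Sharon wins a game
print(neg_binomial_pmf(7, 5, p) + neg_binomial_pmf(7, 5, 1 - p))  # ~ 0.24
```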

Poisson Distribution

1. The Poisson probability function: pK(k) = [λ^k / k!] e^(-λ), for k = 0, 1, 2, …

Poisson random variable K with parameter λ

Expected value: E(K) = μK = λ; variance: Var(K) = σK² = λ

2. The Poisson approximation to the binomial: If X is a binomial random variable with parameters n and p = λ/n, then lim(n→∞) P(X = k) = [λ^k / k!] e^(-λ).

Example (application of the Poisson to the number of successes in Bernoulli trials and the number of arrivals in a time period): Your record as a typist shows that you make an average of 3 mistakes per page. What is the probability that you make 10 mistakes on page 437?
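A direct evaluation of the Poisson pmf with λ = 3 and k = 10 (helper name ours):

```python
from math import exp, factorial

# Poisson pmf: P(K = k) = e^(-lam) * lam^k / k!
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

print(poisson_pmf(10, 3))  # P(10 mistakes on a page) ~ 0.00081
```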

3. Poisson processes

Example: Suppose that children are born at a Poisson rate of five per day in a certain hospital. What is the probability that at least two babies are born during the next six hours?
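A sketch of the computation: six hours is a quarter of a day, so the count of births in that window is Poisson with λ = 5/4, and P(at least 2) = 1 - P(0) - P(1):

```python
from math import exp

lam = 5 * 6 / 24  # 5 births/day scaled to a six-hour window -> 1.25
print(1 - exp(-lam) * (1 + lam))  # P(at least 2 births) ~ 0.355
```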

Continuous Random Variables

Probability Density Functions

1. Densities

2. Probability density function (pdf) for a continuous random variable X: fX(x)

fX(x) ≥ 0 for all x; ∫(-∞,∞) fX(x) dx = 1; P(a ≤ X ≤ b) = ∫(a,b) fX(x) dx.

Example: Experience has shown that while walking in a certain park, the time X, in minutes, between seeing two people smoking has a density function of the form fX(x) = λx e^(-x), x > 0. (a) Calculate the value of λ. (b) Find the probability distribution function of X. (c) What is the probability that Jeff, who has just seen a person smoking, will see another person smoking in 2 to 5 minutes?
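For part (a), ∫(0,∞) x e^(-x) dx = 1 forces λ = 1; integrating by parts then gives the cdf F(t) = 1 - (1 + t)e^(-t), which answers (b) and (c). A numeric sketch of part (c):

```python
from math import exp

# cdf of f(x) = x * exp(-x) on x > 0 (lambda = 1 from normalization)
def cdf(t):
    return 1 - (1 + t) * exp(-t)

print(cdf(5) - cdf(2))  # P(2 <= X <= 5) ~ 0.366
```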

Cumulative Distribution Functions (cdf)

1. FX(t) = P(X ≤ t) = ∫(-∞, t) fX(x) dx, for -∞ < t < ∞.

2. The cdf of a discrete random variable: a step function; the cdf of a continuous random variable: a continuous function

3. The probability function of a discrete random variable: size of the jump in FX(t); the pdf of a continuous random variable: fX(t) = dFX(t)/dt, wherever the derivative exists.

Expectations and Variances

1. Definition: If X is a continuous random variable with pdf fX(x), the expected value of X is defined by E[X] = μX = ∫(-∞,∞) x fX(x) dx.

Example: In a group of adult males, the difference between the uric acid value and 6, the standard value, is a random variable X with the following pdf: fX(x) = (27/490)(3x² – 2x) if 2/3 < x < 3. Calculate the mean of these differences for the group.
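Using the normalizing constant 27/490 recovered above, E[X] can be checked with a simple Riemann sum (an approximation sketch, not an exact computation):

```python
# Approximate E[X] = integral of x * f(x) over (2/3, 3) by a Riemann sum.
def f(x):
    return (27 / 490) * (3 * x**2 - 2 * x)

a, b, steps = 2/3, 3.0, 100_000
dx = (b - a) / steps
print(sum((a + i * dx) * f(a + i * dx) for i in range(steps)) * dx)  # ~ 2.36
```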

2. Theorem: Let X be a continuous random variable with pdf fX(x); then for any function h, E[h(X)] = ∫(-∞,∞) h(x) fX(x) dx.

3. Corollary: Let X be a continuous random variable with pdf fx(x). Let h1, h2, …, hn be real-valued functions and α1, α2, …, αn be real numbers. Then E[α1h1(X) + α2h2(X) + … + αnhn(X)] = α1E[h1(X)] + α2E[h2(X)] + … + αnE[hn(X)].

4. Definition: If X is a continuous random variable with E[X] = μX, then Var[X] and σX, called the variance and standard deviation of X, respectively, are defined by Var(X) = σX² = E[(X - μX)²] = E[X²] - μX² and σX = √Var(X).

Special Continuous Distributions

Uniform Random Variable

1. Density of a uniformly distributed random variable: fX(x) = 1/(b - a) for a < x < b, and fX(x) = 0 otherwise.

2. The cdf of a uniformly distributed random variable:

FX(t) = 0 for t < a; FX(t) = (t - a)/(b - a) for a ≤ t < b; FX(t) = 1 for t ≥ b.

3. The expected value and variance of the uniform random variable: E(X) = μX = (a + b)/2; Var(X) = σX² = (b - a)²/12.

Example: Starting at 5:00 A.M., every half hour there is a flight from San Francisco airport to Los Angeles International airport. Suppose that none of these planes is completely sold out and that they always have room for passengers. A person who wants to fly to L.A. arrives at the airport at a random time between 8:45 A.M. and 9:45 A.M. Find the probability that she waits (a) at most 10 minutes; (b) at least 15 minutes.
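Since flights leave at 9:00, 9:30, and 10:00 within her window, the waiting time is a simple function of a uniform arrival time. A Monte Carlo sketch (times in minutes after midnight; expected answers 25/60 and 30/60):

```python
import random

flights = [540, 570, 600]  # 9:00, 9:30, 10:00 in minutes after midnight
trials = 200_000
waits = []
for _ in range(trials):
    t = random.uniform(525, 585)  # arrival uniform on 8:45-9:45
    waits.append(min(f - t for f in flights if f >= t))

print(sum(w <= 10 for w in waits) / trials)  # (a) ~ 25/60 ~ 0.417
print(sum(w >= 15 for w in waits) / trials)  # (b) ~ 30/60 = 0.5
```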

The Exponential Distribution

1. The exponential probability law with parameter λ: fT1(t) = λe^(-λt), for t > 0.

T1 is the time of occurrence of the first event in a Poisson process with parameter λ, starting at an arbitrary origin t=0; {Xλt = 0} ≡ {T1 > t}, where the Poisson process Xλt is the number of events to occur in (0, t], with parameter μ = λt.

2. The expected value and variance of the exponential random variable: E(T1) = 1/λ; Var(T1) = 1/λ².

Example: Suppose that every three months, on average, an earthquake occurs in California. What is the probability that the next earthquake occurs after three but before seven months?
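With a mean waiting time of three months, λ = 1/3 per month, and P(3 < T < 7) = F(7) - F(3) = e^(-1) - e^(-7/3). In code:

```python
from math import exp

lam = 1 / 3  # one earthquake per three months, on average
print(exp(-3 * lam) - exp(-7 * lam))  # P(3 < T < 7) ~ 0.271
```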

3. Memoryless feature of the exponential: P(T1 ≥ a+b | T1 ≥ b) = P(T1 ≥ a), where a, b are any two positive constants.

The Erlang Distribution

1. The cdf and pdf for T2: FT2(t) = 1 - e^(-λt) - λt e^(-λt) and fT2(t) = λ²t e^(-λt), for t > 0.

T2 is the time of occurrence of the second event in a Poisson process with parameter λ, starting at an arbitrary origin t=0; {Xλt ≤ 1} ≡ {T2 > t}, where Xλt is the number of events to occur, with parameter μ = λt.

2. The Erlang probability law with parameters r and λ: fTr(t) = [λ^r t^(r-1) / (r-1)!] e^(-λt), for t > 0.

Tr is the time of occurrence of the rth event in a Poisson process with parameter λ, starting at an arbitrary origin t=0; {Xλt ≤ r - 1} ≡ {Tr>t}, Xλt is the number of events to occur.

3. The expected value and variance of the Erlang random variable: E(Tr) = r/λ; Var(Tr) = r/λ².

Example: Suppose that, on average, the number of β-particles emitted from a radioactive substance is four every second. What is the probability that it takes at least 2 seconds before the next two β-particles are emitted?
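Using the equivalence {Tr > t} ≡ {Xλt ≤ r - 1}: with λ = 4 and t = 2, P(T2 ≥ 2) is the probability of at most one emission in two seconds, i.e. e^(-8)(1 + 8). A sketch:

```python
from math import exp

lam, t = 4, 2
mu = lam * t  # Poisson parameter for the 2-second window
print(exp(-mu) * (1 + mu))  # P(T2 >= 2) = P(at most 1 emission) ~ 0.0030
```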

The Gamma Distribution

1. The gamma probability law with parameters n and λ: fU(u) = [λ^n u^(n-1) / Γ(n)] e^(-λu), for u > 0, n > 0, λ > 0.

Gamma function: Γ(n) = ∫(0,∞) x^(n-1) e^(-x) dx; Γ(n) = (n-1)Γ(n-1); Γ(n) = (n-1)!, if n is a positive integer.

The Erlang random variable is a particular case of a gamma random variable, where n is restricted to the integer values r=1,2,3,….

2. The expected value and variance of the gamma random variable: E(U) = n/λ; Var(U) = n/λ².

The Normal (Gaussian) Distribution

1. The normal probability law with parameters μ and σ: fX(x) = [1/(σ√(2π))] e^(-(x-μ)²/(2σ²)), for -∞ < x < ∞, σ > 0.

2. The expected value and variance of the normal random variable: E(X) = μX = μ; Var(X) = σX² = σ².

3. Standard normal random variable (the unit normal): fZ(z) = [1/√(2π)] e^(-z²/2), μ = 0, σ = 1.

4. The new random variable: If X is normal with mean μ and variance σ², then X+b is normal with mean μ+b and variance σ²; aX is normal with mean aμ and variance a²σ²; aX+b is normal with mean aμ+b and variance a²σ².

5. Switching from a non-unit normal to the unit normal: Z = (X - μ)/σ, μZ = 0, σZ = 1.

Example: Suppose that height X is normal with μ=66 inches and σ2 = 9. You can find the probability that a person picked at random is under 6 feet, using only the table for the standard normal.
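Concretely, 6 feet is 72 inches, so z = (72 - 66)/3 = 2 and the answer is Φ(2) ≈ 0.9772. A sketch using math.erf for the standard normal cdf (helper name ours):

```python
from math import erf, sqrt

# Standard normal cdf via the error function
def std_normal_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

print(std_normal_cdf((72 - 66) / 3))  # P(X < 72) ~ 0.9772
```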

6. The probability that a normal random variable is within k standard deviations of the mean: P(μ-kσ ≤ X ≤ μ+kσ) = P(-k ≤ X* ≤ k), where X* is the unit normal.

7. The normal approximation to the binomial: If X is binomial with parameters n and p, where n is large, then X is approximately normal with μ = np, σ² = npq.

Example: A coin has P(heads) = 0.3. Toss the coin 1000 times so that the expected number of heads is 300. Find the probability that the number of heads is 400 or more.
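With μ = 300 and σ = √(npq) = √210 ≈ 14.5, 400 heads is nearly 7 standard deviations above the mean, so the probability is negligible. A sketch (with a continuity correction at 399.5):

```python
from math import erf, sqrt

def std_normal_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 300, sqrt(1000 * 0.3 * 0.7)
print(1 - std_normal_cdf((399.5 - mu) / sigma))  # ~ 3e-12, essentially 0
```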

Jointly Distributed Random Variables

Joint Densities

1. Jointly distributed random variables: two or more random variables whose observed values are simultaneously determined by the same random mechanism.

2. Discrete random variables X1, X2: the joint probability function is pX1,X2(x1, x2) = P(X1 = x1, X2 = x2) for all x1, x2; ΣΣ pX1,X2(x1, x2) = 1. Continuous random variables X1, X2: the joint pdf satisfies fX1,X2(x1, x2) ≥ 0 for all x1, x2; ∫∫ fX1,X2(x1, x2) dx1 dx2 = 1; the probability of an event is obtained by integrating the joint pdf over the corresponding region.

3. The joint density of independent random variables: The random variables X, Y are independent if and only if pX,Y(x,y) = pX(x)pY(y), when X, Y are discrete; fX,Y(x,y) = fX(x)fY(y), when X, Y are continuous.

4. Uniform joint densities: If X and Y are jointly uniform on a region, the joint pdf is fX,Y(x, y) = 1/(area of the region), for (x, y) in the region.

The joint density of independent uniform random variables, X on (a, b) and Y on (c, d): fX,Y(x, y) = 1/[(b - a)(d - c)], for a < x < b, c < y < d.

Marginal Densities

1. For discrete random variables X1, X2 with joint probability function pX1,X2(x1, x2) and range RX1,X2, the marginal probability function for X1 is pX1(x1) = Σx2 pX1,X2(x1, x2), for x1 ∈ RX1, where the marginal range RX1 is the set of first elements of the pairs in RX1,X2.

2. For continuous random variables X1, X2 with joint pdf fX1,X2(x1, x2) and range RX1,X2, the marginal pdf for X1 is fX1(x1) = ∫ fX1,X2(x1, x2) dx2, for x1 ∈ RX1, where the marginal range RX1 is the set of first elements of the pairs in RX1,X2.

Sums of Independent Random Variables

1. The sum of independent binomials with a common p: X1 with parameters n1, p and X2 with parameters n2, p; then X1+X2 is binomial with parameters n1+n2, p

2. The sum of independent Poissons: X1 with parameter λ1 and X2 with parameter λ2; then X1+X2 is Poisson with parameter λ1+λ2

3. The sum of independent exponentials with a common λ: X1, …, Xn each with parameter λ; then X1 + … + Xn has a gamma (Erlang) distribution with parameters n, λ.

4. The density of the sum of two arbitrary independent random variables is the convolution of the individual pdfs: fX+Y(z) = ∫(-∞,∞) fX(x) fY(z - x) dx.
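To see the convolution at work, here is a numeric sketch for two independent exponentials with λ = 1 (our illustrative choice); the Riemann-sum result matches the exact Erlang(2, 1) density z·e^(-z):

```python
from math import exp

def f_exp(x, lam=1.0):
    return lam * exp(-lam * x) if x >= 0 else 0.0

# Riemann-sum approximation of the convolution integral on [0, z]
def f_sum(z, steps=10_000):
    dx = z / steps
    return sum(f_exp(i * dx) * f_exp(z - i * dx) for i in range(steps)) * dx

print(f_sum(2.0))       # approximate density of X + Y at z = 2
print(2.0 * exp(-2.0))  # exact Erlang(2, 1) density: z * exp(-z) ~ 0.2707
```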

5. The sum of independent normals: X1 with mean μ1 and variance σ1², X2 with mean μ2 and variance σ2²; then X1+X2 is normal with mean μ1+μ2 and variance σ1²+σ2².
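A quick Monte Carlo sketch of this closure property, with illustrative parameters of our own choosing (X1 ~ N(1, 2²), X2 ~ N(-3, 1.5²)):

```python
import random
from statistics import mean, variance

# Sum of independent normals: expect mean 1 + (-3) = -2, variance 4 + 2.25
sums = [random.gauss(1, 2) + random.gauss(-3, 1.5) for _ in range(200_000)]
print(mean(sums), variance(sums))  # ~ -2.0, ~ 6.25
```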
