Lecture 14: Variance and Standard Deviation


The Geometric Distribution








Let X be a random variable that describes the number of trials in a Bernoulli experiment that are needed until an event A, which occurs with p = P(A) > 0 in each trial, occurs for the first time. Then X has the distribution ∀x ∈ ℕ: (x; P(X = x)) with

P(X = x) = gX(x; p) = p(1 − p)^(x−1)

A.3 Probability Theory 345

and is said to be geometrically distributed with parameter p. In order to compute the probabilities, the recursive relation

∀x ∈ ℕ: P(X = x + 1) = (1 − p) P(X = x)   with   P(X = 1) = p

can be useful. The expected value and variance are

μ = E(X) = 1/p;   σ² = D²(X) = (1 − p)/p².
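The closed form, the recursion, and the moments can be checked against each other numerically. A minimal Python sketch (the helper name geometric_pmf is our own, not from the text):

```python
def geometric_pmf(x, p):
    """P(X = x) = p * (1 - p)**(x - 1) for x = 1, 2, 3, ... (number of trials)."""
    return p * (1.0 - p) ** (x - 1)

p = 0.3

# Verify the recursive relation P(X = x+1) = (1 - p) * P(X = x).
for x in range(1, 10):
    assert abs(geometric_pmf(x + 1, p) - (1 - p) * geometric_pmf(x, p)) < 1e-12

# Approximate E(X) and D^2(X) by truncating the (rapidly converging) infinite sums.
mean = sum(x * geometric_pmf(x, p) for x in range(1, 1000))
var = sum(x * x * geometric_pmf(x, p) for x in range(1, 1000)) - mean ** 2
print(round(mean, 6), round(var, 6))   # ≈ 1/p and (1 − p)/p²
```

For p = 0.3 this yields a mean of about 3.333 and a variance of about 7.778, matching 1/p and (1 − p)/p².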


The Hypergeometric Distribution






From an urn which contains M black and N − M white, and thus in total N balls, n balls are drawn without replacement. Let X be the random variable that describes the number of black balls that have been drawn. Then X has the distribution

∀x; max(0, n − (N − M)) ≤ x ≤ min(n, M): (x; P(X = x)) with

P(X = x) = hX(x; n, M, N) = [C(M, x) · C(N − M, n − x)] / C(N, n),

where C(a, b) denotes the binomial coefficient "a choose b",

and is said to be hypergeometrically distributed with parameters n, M, and N. This distribution satisfies the recursive relation

∀x; max(0, n − (N − M)) ≤ x ≤ min(n, M):
hX(x + 1; n, M, N) = [(M − x)(n − x)] / [(x + 1)(N − M − n + x + 1)] · hX(x; n, M, N),

starting from the smallest value of the support, x_min = max(0, n − (N − M)), whose probability is computed directly from the formula above.

With p = M/N and q = 1 − p, the expected value and variance are

μ = E(X) = np;   σ² = D²(X) = npq · (N − n)/(N − 1).
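Python's math.comb makes it easy to verify the formula, the recursive relation, and the moments on a small example (the helper name and the concrete numbers n = 5, M = 6, N = 20 are our own choices):

```python
from math import comb

def hypergeom_pmf(x, n, M, N):
    """h_X(x; n, M, N) = C(M, x) * C(N - M, n - x) / C(N, n)."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

n, M, N = 5, 6, 20           # draw 5 balls from an urn with 6 black and 14 white
lo, hi = max(0, n - (N - M)), min(n, M)

# The pmf sums to 1 over its support.
probs = [hypergeom_pmf(x, n, M, N) for x in range(lo, hi + 1)]
assert abs(sum(probs) - 1.0) < 1e-12

# E(X) = n*p and D^2(X) = n*p*q*(N-n)/(N-1) with p = M/N.
p, q = M / N, 1 - M / N
mean = sum(x * hypergeom_pmf(x, n, M, N) for x in range(lo, hi + 1))
var = sum(x * x * hypergeom_pmf(x, n, M, N) for x in range(lo, hi + 1)) - mean ** 2
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * q * (N - n) / (N - 1)) < 1e-9

# The recursive relation from the text.
for x in range(lo, hi):
    lhs = hypergeom_pmf(x + 1, n, M, N)
    rhs = (M - x) * (n - x) / ((x + 1) * (N - M - n + x + 1)) * hypergeom_pmf(x, n, M, N)
    assert abs(lhs - rhs) < 1e-12
```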


The Poisson Distribution


A random variable X with the distribution ∀x ∈ ℕ₀: (x; P(X = x)) where

P(X = x) = ΛX(x; λ) = (λ^x / x!) e^(−λ)

is said to be Poisson distributed²⁰ with parameter λ. This distribution satisfies the recursive relation

∀x ∈ ℕ₀: ΛX(x + 1; λ) = (λ / (x + 1)) ΛX(x; λ)   with   ΛX(0; λ) = e^(−λ).



²⁰This distribution bears its name in recognition of the French mathematician Siméon-Denis Poisson (1781–1840).


The expected value and variance are


μ = E(X) = λ;   σ² = D²(X) = λ.
A Poisson distribution describes the occurrence frequency of events of a certain type in a fixed period of time, for example, the number of fatal traffic accidents.
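The recursion gives a numerically convenient way to tabulate the pmf without computing factorials for every value. A small Python sketch (the function name is our own):

```python
from math import exp, factorial

def poisson_pmf_table(lam, x_max):
    """Tabulate Λ_X(0..x_max; λ) via Λ(x+1) = λ/(x+1) * Λ(x), Λ(0) = e^(-λ)."""
    probs = [exp(-lam)]
    for x in range(x_max):
        probs.append(lam / (x + 1) * probs[x])
    return probs

lam = 2.5
probs = poisson_pmf_table(lam, 30)

# Cross-check against the closed form λ^x / x! * e^(-λ).
for x, p in enumerate(probs):
    assert abs(p - lam ** x / factorial(x) * exp(-lam)) < 1e-12

# Mean and variance both come out (numerically) equal to λ.
mean = sum(x * p for x, p in enumerate(probs))
var = sum(x * x * p for x, p in enumerate(probs)) - mean ** 2
print(round(mean, 6), round(var, 6))
```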

For rare events A and a large number n of trials, the binomial distribution can be approximated by a Poisson distribution, because the following relation holds: if in a binomial distribution n goes to infinity so that np = λ stays constant, then

∀x ∈ ℕ₀: lim(n→∞, np=λ) bX(x; p, n) = (λ^x / x!) e^(−λ),

and thus for large n and small p, we obtain the approximation

∀x ∈ ℕ₀: bX(x; p, n) ≈ ((np)^x / x!) e^(−np).
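The quality of this approximation can be inspected numerically. The sketch below (helper names and the parameter choices n = 1000, p = 0.002 are our own) compares the two pmfs pointwise:

```python
from math import comb, exp, factorial

def binom_pmf(x, p, n):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    return lam ** x / factorial(x) * exp(-lam)

# For large n and small p with np = λ fixed, the two pmfs nearly coincide.
n, p = 1000, 0.002           # λ = np = 2
lam = n * p
max_err = max(abs(binom_pmf(x, p, n) - poisson_pmf(x, lam)) for x in range(20))
print(max_err)   # small, on the order of 1e-4 here
```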

For Poisson distributed random variables, the following reproduction law holds:


Theorem A.17 Let X and Y be two (stochastically) independent Poisson-distributed random variables with parameters λX and λY, respectively. Then the sum Z = X + Y is also Poisson distributed with parameter λZ = λX + λY.
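Theorem A.17 can be checked numerically by convolving the two pmfs (a small Python sketch; the parameter values are arbitrary):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return lam ** x / factorial(x) * exp(-lam)

lam_x, lam_y = 1.2, 2.3

# P(Z = z) = sum_x P(X = x) P(Y = z - x) should equal the Poisson(λ_X + λ_Y) pmf.
for z in range(15):
    conv = sum(poisson_pmf(x, lam_x) * poisson_pmf(z - x, lam_y) for x in range(z + 1))
    assert abs(conv - poisson_pmf(z, lam_x + lam_y)) < 1e-12
print("convolution matches Poisson(3.5)")
```

The identity behind the check is the binomial theorem applied to (λX + λY)^z.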


The Uniform Distribution





A random variable X with the density function

fX(x; a, b) = { 1/(b − a)   for x ∈ [a, b],
              { 0           otherwise,

with a, b ∈ ℝ, a < b, is said to be uniformly distributed in [a, b]. Its distribution function FX is

FX(x; a, b) = { 0                  for x ≤ a,
              { (x − a)/(b − a)    for a ≤ x ≤ b,
              { 1                  for x ≥ b.

The expected value and variance are

μ = E(X) = (a + b)/2;   σ² = D²(X) = (b − a)²/12.
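A quick Monte Carlo check of these formulas (a Python sketch using only the standard library; the interval, sample size, and seed are arbitrary choices):

```python
import random

a, b = 2.0, 5.0
random.seed(42)
sample = [random.uniform(a, b) for _ in range(200_000)]

mean = sum(sample) / len(sample)
var = sum((x - mean) ** 2 for x in sample) / len(sample)

# Compare with μ = (a + b)/2 = 3.5 and σ² = (b − a)²/12 = 0.75.
print(round(mean, 2), round(var, 2))
```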



The Normal Distribution





A random variable X with the density function

fX(x; μ, σ²) = N(x; μ, σ²) = (1 / √(2πσ²)) e^(−(x−μ)²/(2σ²))

is said to be normally distributed with parameters μ and σ².


The expected value and variance are


E(X) = μ;   D²(X) = σ².

The normal distribution with expected value μ = 0 and variance σ² = 1 is called the standard normal distribution.

The density function fX(x; μ, σ²) has its maximum at μ and inflection points at x = μ ± σ. The distribution function of X does not possess a closed-form representation. As a consequence, it is usually tabulated, most commonly for the standard normal distribution, from which the values of arbitrary normal distributions can easily be obtained by simple linear transformations.
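In Python, for instance, the standard normal distribution function Φ can be expressed through the error function math.erf, and the linear transformation z = (x − μ)/σ reduces any normal distribution to the standard one (a minimal sketch; the helper names are our own):

```python
from math import erf, sqrt

def std_normal_cdf(x):
    """Φ(x) for the standard normal distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    """Reduce an arbitrary N(μ, σ²) to the standard normal via z = (x − μ)/σ."""
    return std_normal_cdf((x - mu) / sigma)

# About 68.3% of the probability mass lies within one σ of μ.
mass = normal_cdf(1.0, 0.0, 1.0) - normal_cdf(-1.0, 0.0, 1.0)
print(round(mass, 4))   # 0.6827
```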
However, in practice one often faces the reversed problem, namely to find the argument of the distribution function of the standard normal distribution for which it has a given value (or, in other words, one desires to find a quantile of the normal distribution). In order to solve this problem one may just as well use tabulated values. However, the inverse function can be approximated fairly well by the ratio of two polynomials, which is usually employed in computer programs (e.g., [15]).
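As a simple illustration of the reversed problem, a quantile can also be found by bisection on Φ; production code would use the polynomial-ratio approximation mentioned above instead (a sketch, helper names our own):

```python
from math import erf, sqrt

def std_normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def std_normal_quantile(u, tol=1e-10):
    """Invert Φ by bisection; Φ is strictly increasing, so this converges."""
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if std_normal_cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

q = std_normal_quantile(0.975)
print(round(q, 4))   # 1.96, the familiar 97.5% quantile
```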
The normal distribution is certainly the most important continuous distribution, since many random processes, especially measurements of physical quantities, can be described well by this distribution. The theoretical justification for this observed fact is the important central limit theorem:



Theorem A.18 (central limit theorem) Let X₁, ..., X_m be m (stochastically) independent real-valued random variables. In addition, let them satisfy the so-called Lindeberg condition, that is, if Fi(x) are the distribution functions of the random variables Xi, i = 1, ..., m, μi their expected values, and σi² their variances, then for every ε > 0, it is

lim(m→∞) (1/Vm²) Σ(i=1..m) ∫(|x − μi| > εVm) (x − μi)² dFi(x) = 0

with Vm² = Σ(i=1..m) σi². Then the standardized sums

Sm = Σ(i=1..m) (Xi − μi) / √( Σ(i=1..m) σi² )

(that is, standardized to expected value 0 and variance 1) satisfy

∀x ∈ ℝ: lim(m→∞) P(Sm ≤ x) = Φ(x) = (1/√(2π)) ∫(−∞..x) e^(−t²/2) dt,

where Φ(x) is the distribution function of the standard normal distribution.

Intuitively this theorem says that the sum of a large number of almost arbitrarily distributed random variables (the Lindeberg condition is a very weak restriction) is approximately normally distributed. Since physical measurements are usually affected by a large number of random influences from several independent sources,




which all add up to form the total measurement error, the result is often approximately normally distributed. The central limit theorem thus explains why normally distributed quantities are so common in practice.
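The theorem can also be illustrated empirically: sums of independent uniform variables, standardized as above, quickly start to look standard normal (a Monte Carlo sketch; m = 12, the sample size, and the seed are arbitrary choices of ours):

```python
import random
from math import erf, sqrt

random.seed(0)
m, trials = 12, 100_000

# Each summand is U(0, 1): μ_i = 0.5, σ_i² = 1/12, so V_m² = m/12.
def standardized_sum():
    s = sum(random.random() for _ in range(m))
    return (s - m * 0.5) / sqrt(m / 12.0)

samples = [standardized_sum() for _ in range(trials)]

# The empirical P(S_m ≤ 1) should be close to Φ(1) ≈ 0.8413.
frac = sum(1 for s in samples if s <= 1.0) / trials
phi1 = 0.5 * (1.0 + erf(1.0 / sqrt(2.0)))
print(round(frac, 3), round(phi1, 3))
```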


Like the Poisson distribution, the normal distribution can be used as an approximation of the binomial distribution, even if the probabilities p are not small.
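A small numerical check of this approximation (the helper names and the parameter choices n = 100, p = 0.4 are our own), comparing the binomial pmf with the normal density of matching mean and variance:

```python
from math import comb, exp, pi, sqrt

n, p = 100, 0.4              # np = 40, clearly not a rare event
mu, sigma = n * p, sqrt(n * p * (1 - p))

def binom_pmf(x):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def normal_density(x):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / sqrt(2 * pi * sigma ** 2)

# Near the mean the normal density tracks the binomial pmf closely.
max_err = max(abs(binom_pmf(x) - normal_density(x)) for x in range(30, 51))
print(max_err)   # small relative to the peak value b(40) ≈ 0.081
```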


