Meng Jiao (Tang dynasty), "After Passing the Examination" (《登科後》):
The shabbiness of former days is not worth boasting of;
this morning my thoughts run free, without bound.
Elated in the spring breeze, my horse's hooves fly swift;
in a single day I have seen all the flowers of Chang'an.
Much as I would like to write expansively about the flowers of generating functions, such subtle and profound scholarship is beyond my powers; I can only borrow what the book itself offers, and to view the flowers from a galloping horse is, after all, only human.
This book was originally published by Academic Press in 1990. A Second Edition appeared in 1994, and the Third Edition is now available from the publisher or from your favorite bookstore.
The Second Edition can be downloaded from this page. If you download the book you are agreeing to the following terms:
Copyright 1990, 1994 by Academic Press, Inc.;
Copyright 2004 by Herbert S. Wilf; Copyright 2006 by A K Peters, Ltd.
Reproduction of the downloaded version is permitted for any valid educational purpose of an institution of learning, in which case only the reasonable costs of reproduction may be charged. Reproduction for profit or for any commercial purposes is strictly prohibited. It is not permitted for a web site other than this one to offer the book directly for downloading. Other web sites are cordially invited to link to this page, but must not take the file and offer it themselves.
…
When chickens take flight, swan-like ambitions stir; when dogs leap, the host greets ties of old days; with such fine prospects ahead and the trend at my back, how would I dare venture alone across the border of yin and yang?
Probability-generating function
Definition
Univariate case
If X is a discrete random variable taking values in the non-negative integers {0, 1, …}, then the probability generating function of X is defined as [1]

G(z) = E(z^X) = ∑_{x=0}^∞ p(x) z^x,

where p is the probability mass function of X. Note that the subscripted notations GX and pX are often used to emphasize that these pertain to a particular random variable X and to its distribution. The power series converges absolutely at least for all complex numbers z with |z| ≤ 1; in many examples the radius of convergence is larger.
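As an illustrative sanity check (my own example, not from the text), the truncated power series ∑ p(x) z^x can be compared against a known closed form; for a Poisson variable with rate lam, the PGF is exp(lam·(z − 1)):

```python
import math

# Illustrative example (assumed, not from the text): the PGF of a
# Poisson(lam) variable has the closed form G(z) = exp(lam * (z - 1)).
def poisson_pmf(x, lam):
    return math.exp(-lam) * lam**x / math.factorial(x)

def pgf(pmf, z, terms=100):
    # Truncated power series G(z) = sum_x p(x) * z**x
    return sum(pmf(x) * z**x for x in range(terms))

lam, z = 3.0, 0.7
series = pgf(lambda x: poisson_pmf(x, lam), z)
closed = math.exp(lam * (z - 1))
print(abs(series - closed) < 1e-9)  # → True
```

With |z| ≤ 1 and lam moderate, 100 terms is far more than enough for the truncated series to match the closed form to machine precision.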
Multivariate case
If X = (X1, …, Xd) is a discrete random variable taking values in the d-dimensional non-negative integer lattice {0, 1, …}^d, then the probability generating function of X is defined as

G(z) = G(z1, …, zd) = E(z1^X1 ⋯ zd^Xd) = ∑_{x1,…,xd=0}^∞ p(x1, …, xd) z1^x1 ⋯ zd^xd,

where p is the probability mass function of X. The power series converges absolutely at least for all complex vectors z = (z1, …, zd) ∈ ℂ^d with max{|z1|, …, |zd|} ≤ 1.
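As a small worked example (my own, with hypothetical parameters P1 and P2), take X = (X1, X2) with independent Bernoulli components; the lattice sum then factorises into the product of the two marginal PGFs:

```python
# Hypothetical bivariate example: independent Bernoulli(P1) and
# Bernoulli(P2) components, so p(x1, x2) factorises and so does the PGF.
P1, P2 = 0.3, 0.6

def pmf(x1, x2):
    return ((P1 if x1 == 1 else 1 - P1) *
            (P2 if x2 == 1 else 1 - P2))

def pgf(z1, z2):
    # G(z1, z2) = sum over the lattice of p(x1, x2) * z1**x1 * z2**x2
    return sum(pmf(x1, x2) * z1**x1 * z2**x2
               for x1 in (0, 1) for x2 in (0, 1))

# Probabilities sum to one: G(1, 1) = 1
print(abs(pgf(1.0, 1.0) - 1.0) < 1e-12)  # → True
# Independence: G(z1, z2) = (1 - P1 + P1*z1) * (1 - P2 + P2*z2)
print(abs(pgf(0.5, 0.8) - (0.7 + 0.3*0.5) * (0.4 + 0.6*0.8)) < 1e-12)  # → True
```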
Properties
Power series
Probability generating functions obey all the rules of power series with non-negative coefficients. In particular, G(1−) = 1, where G(1−) = lim_{z→1−} G(z) is the limit as z approaches 1 from below, since the probabilities must sum to one. So the radius of convergence of any probability generating function must be at least 1, by Abel's theorem for power series with non-negative coefficients.
Probabilities and expectations
The following properties allow the derivation of various basic quantities related to X:
1. The probability mass function of X is recovered by taking derivatives of G:

p(k) = Pr(X = k) = G^(k)(0) / k!
2. It follows from Property 1 that if random variables X and Y have probability generating functions that are equal, GX = GY, then pX = pY. That is, if X and Y have identical probability generating functions, then they have identical distributions.
3. The normalization of the probability mass function can be expressed in terms of the generating function by

G(1−) = ∑_{x=0}^∞ p(x) = 1.

The expectation of X is given by

E(X) = G′(1−).

More generally, the kth factorial moment E(X(X − 1)⋯(X − k + 1)) of X is given by

E(X(X − 1)⋯(X − k + 1)) = G^(k)(1−), for k ≥ 0.

So the variance of X is given by

Var(X) = G″(1−) + G′(1−) − (G′(1−))².
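The derivative formulas above can be checked numerically (a sketch, with a distribution of my own choosing): for a Binomial(n, p) variable the PGF is the polynomial G(z) = (1 − p + pz)^n, so G′(1) and G″(1) can be approximated by finite differences and compared with the known mean np and variance np(1 − p):

```python
# Sketch (assumed example): mean and variance of Binomial(n, p) from its
# PGF G(z) = (1 - p + p*z)**n via central finite differences at z = 1.
# (G is a polynomial here, hence entire, so two-sided differences are safe.)
n, p = 10, 0.3

def G(z):
    return (1 - p + p * z)**n

h = 1e-5
g1 = (G(1 + h) - G(1 - h)) / (2 * h)          # ~ G'(1)  = E[X]
g2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2  # ~ G''(1) = E[X(X - 1)]
mean = g1
var = g2 + g1 - g1**2
print(round(mean, 3), round(var, 3))  # close to n*p = 3.0 and n*p*(1-p) = 2.1
```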
4. GX(e^t) = MX(t), where X is a random variable, GX(z) is the probability generating function of X, and MX(t) is the moment-generating function of X.
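Property 4 can be illustrated numerically (again with an assumed Poisson example): evaluate E(e^{tX}) directly from the pmf and compare it with GX(e^t), where GX(z) = exp(lam·(z − 1)) is the closed-form Poisson PGF:

```python
import math

# Assumed example: for X ~ Poisson(LAM), compute M(t) = E[exp(t*X)]
# directly from the pmf and compare with G(exp(t)), where
# G(z) = exp(LAM * (z - 1)) is the closed-form PGF.
LAM = 2.0

def mgf(t, terms=100):
    return sum(math.exp(-LAM) * LAM**x / math.factorial(x) * math.exp(t * x)
               for x in range(terms))

def G(z):
    return math.exp(LAM * (z - 1))

print(all(abs(mgf(t) - G(math.exp(t))) < 1e-9
          for t in (-1.0, 0.0, 0.5)))  # → True
```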
……
Moment-generating function
In probability theory and statistics, the moment-generating function of a real-valued random variable is an alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables. Note, however, that not all random variables have moment-generating functions.
In addition to real-valued distributions (univariate distributions), moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases.
The moment-generating function of a real-valued distribution does not always exist, unlike the characteristic function. There are relations between the behavior of the moment-generating function of a distribution and properties of the distribution, such as the existence of moments.
Definition
In probability theory and statistics, the moment-generating function of a random variable X is

MX(t) := E(e^{tX}), t ∈ ℝ,

wherever this expectation exists. In other words, the moment-generating function can be interpreted as the expectation of the random variable e^{tX}.

MX(0) always exists and is equal to 1.
A key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the characteristic function or Fourier transform always exists (because it is the integral of a bounded function on a space of finite measure), and for some purposes may be used instead.
More generally, when X is an n-dimensional random vector and t is a fixed vector, one uses the inner product t·X = t1X1 + ⋯ + tnXn instead of tX:

MX(t) := E(e^{t·X}).
The reason for defining this function is that it can be used to find all the moments of the distribution.[1] The series expansion of e^{tX} is

e^{tX} = 1 + tX + t^2X^2/2! + t^3X^3/3! + ⋯ + t^nX^n/n! + ⋯.

Hence

MX(t) = E(e^{tX}) = 1 + t m1 + t^2 m2/2! + t^3 m3/3! + ⋯ + t^n mn/n! + ⋯,

where mn is the nth moment.
Differentiating MX(t) i times with respect to t and setting t = 0, we obtain the ith moment about the origin, mi; see Calculations of moments below.
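As a closing sketch (my own example), the same finite-difference idea recovers moments from an MGF: for an Exponential(lam) variable, M(t) = lam/(lam − t) for t < lam, with first moment 1/lam and second moment 2/lam²:

```python
# Sketch (assumed example): first two moments of Exponential(LAM) from its
# MGF M(t) = LAM / (LAM - t), valid for t < LAM, via central finite
# differences at t = 0.
LAM = 2.0

def M(t):
    return LAM / (LAM - t)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)          # ~ M'(0)  = E[X]   = 1/LAM   = 0.5
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ M''(0) = E[X^2] = 2/LAM^2 = 0.5
print(round(m1, 4), round(m2, 4))
```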
───