Time Series: Generating Functions · Asymptotic Expansions: Orders of Infinity IV

A night-blooming cereus (量天尺, the "sky-measuring ruler" cactus) in bloom, photographed in Kona, Hawaii County

 

In the heavens, Cepheid variables serve as measuring rods; on earth, the drought-hardy "sky-measuring ruler" cactus blooms.

And we humans create notation to order the infinite and rank its degrees:

Family of Bachmann–Landau notations

For each notation the table gives its name,[12] a description, a formal definition, and an equivalent limit definition.[16][17][18][10]

f(n) = o(g(n))  (small o; small oh): f is dominated by g asymptotically.
  Formal definition: \forall k>0\;\exists n_0\;\forall n>n_0\;|f(n)|\leq k\cdot|g(n)|
  Limit definition: \lim_{n\to\infty}\frac{f(n)}{g(n)}=0

f(n) = O(g(n))  (big O; big Oh; big Omicron): |f| is bounded above by g (up to a constant factor) asymptotically.
  Formal definition: \exists k>0\;\exists n_0\;\forall n>n_0\;|f(n)|\leq k\cdot g(n)
  Limit definition: \limsup_{n\to\infty}\frac{|f(n)|}{g(n)}<\infty

f(n) = \Theta(g(n))  (big Theta): f is bounded both above and below by g asymptotically.
  Formal definition: \exists k_1>0\;\exists k_2>0\;\exists n_0\;\forall n>n_0\;k_1\cdot g(n)\leq f(n)\leq k_2\cdot g(n)
  Limit definition: f(n)=O(g(n)) and f(n)=\Omega(g(n)) (Knuth version)

f(n) \sim g(n)  (on the order of): f is equal to g asymptotically.
  Formal definition: \forall\varepsilon>0\;\exists n_0\;\forall n>n_0\;\left|\frac{f(n)}{g(n)}-1\right|<\varepsilon
  Limit definition: \lim_{n\to\infty}\frac{f(n)}{g(n)}=1

f(n) = \Omega(g(n))  (big Omega in number theory, Hardy–Littlewood): |f| is not dominated by g asymptotically.
  Formal definition: \exists k>0\;\forall n_0\;\exists n>n_0\;|f(n)|\geq k\cdot g(n)
  Limit definition: \limsup_{n\to\infty}\left|\frac{f(n)}{g(n)}\right|>0

f(n) = \Omega(g(n))  (big Omega in complexity theory, Knuth): f is bounded below by g asymptotically.
  Formal definition: \exists k>0\;\exists n_0\;\forall n>n_0\;f(n)\geq k\cdot g(n)
  Limit definition: \liminf_{n\to\infty}\frac{f(n)}{g(n)}>0

f(n) = \omega(g(n))  (small omega): f dominates g asymptotically.
  Formal definition: \forall k>0\;\exists n_0\;\forall n>n_0\;|f(n)|\geq k\cdot|g(n)|
  Limit definition: \lim_{n\to\infty}\left|\frac{f(n)}{g(n)}\right|=\infty

The limit definitions assume g(n)\neq 0 for sufficiently large n. The table is sorted from smallest to largest, in the sense that o, O, Θ, ∼, Ω (both versions), ω on functions correspond to <, ≤, ≈, =, ≥, > on the real line.[18]:6
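A quick numerical sanity check of the limit definitions (a sketch of my own; the pairs f, g below are illustrative choices, not taken from the table):

import math

# Ratios f(n)/g(n) at growing n suggest the limiting behaviour behind o, Theta and ~.
pairs = [
    ("log n = o(n)",          lambda n: math.log(n),     lambda n: n),
    ("2n^2 + n = Theta(n^2)", lambda n: 2*n**2 + n,      lambda n: n**2),
    ("n + log n ~ n",         lambda n: n + math.log(n), lambda n: n),
]

for name, f, g in pairs:
    ratios = ["%.4f" % (f(10**k) / g(10**k)) for k in (2, 4, 6, 8)]
    print(name, ratios)
# Expected tendencies: -> 0 for o, -> a finite positive constant for Theta, -> 1 for ~.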

Computer science uses the big O, big Theta Θ, small o, small omega ω, and Knuth's big Omega Ω notations.[19] Analytic number theory often uses the big O, small o, Hardy–Littlewood's big Omega Ω, and \sim notations.[14] The small omega ω notation is not used as often in analysis.[20]

 

A few drops of history to savour:

History (Bachmann–Landau, Hardy, and Vinogradov notations)

The symbol O was first introduced by the number theorist Paul Bachmann in 1894, in the second volume of his book Analytische Zahlentheorie (“analytic number theory”), the first volume of which (not yet containing big O notation) was published in 1892.[1] The number theorist Edmund Landau adopted it, and was thus inspired to introduce in 1909 the notation o;[2] hence both are now called Landau symbols. These notations were used in applied mathematics during the 1950s for asymptotic analysis.[24] The big O was popularized in computer science by Donald Knuth, who re-introduced the related Omega and Theta notations.[12]

Knuth also noted that the Omega notation had been introduced by Hardy and Littlewood[10] under a different meaning “≠ o” (i.e. “is not an o of”), and proposed the above definition. Hardy and Littlewood's original definition (which was also used in one paper by Landau[13]) is still used in number theory (where Knuth's definition is never used). In fact, Landau also used in 1924, in the paper just mentioned, the symbols \Omega_{R} (“right”) and \Omega_{L} (“left”), which had been introduced in 1918 by Hardy and Littlewood[11] and which were precursors of the modern symbols \Omega_{+} (“is not smaller than a small o of”) and \Omega_{-} (“is not larger than a small o of”). Thus the Omega symbols (with their original meanings) are sometimes also referred to as “Landau symbols”.

Also, Landau never used the Big Theta and small omega symbols.

Hardy’s symbols were (in terms of the modern O notation)

f\preccurlyeq g\iff f\in O(g)   and   f\prec g\iff f\in o(g);

(Hardy, however, never defined or used the notation \prec\!\!\prec, nor \ll, as has sometimes been reported.) Hardy introduced the symbols \preccurlyeq and \prec (as well as some other symbols) in his 1910 tract “Orders of Infinity”, and made use of them in only three papers (1910–1913). In his nearly 400 remaining papers and books he consistently used the Landau symbols O and o.

Hardy's notation is no longer used. On the other hand, in the 1930s,[25] the Russian number theorist Ivan Matveyevich Vinogradov introduced his notation \ll, which has been increasingly used in number theory instead of the O notation. We have

f\ll g\iff f\in O(g),

and frequently both notations are used in the same paper.

The big O originally stood for “order of” (“Ordnung”, Bachmann 1894), and is thus a Latin letter. Neither Bachmann nor Landau ever called it “Omicron”. The symbol was only much later (1976) viewed by Knuth as a capital omicron,[12] probably in reference to his definition of the symbol Omega. The digit zero should not be used.

 

The basic scales, weighed with care:

\cdots \prec \ln(\ln(x)) \prec x^{\frac{1}{n}} \prec x \prec x^n \prec e^x \prec e^{e^x} \prec \cdots

Here n is a natural number greater than one.
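A minimal numerical illustration of this chain (my own sketch, taking n = 2 for the fractional power and the polynomial): each ratio of a slower scale to the next faster one shrinks toward 0 as x grows, which is exactly what \prec asserts.

import math

# Consecutive ratios along the chain; each should tend to 0 as x grows.
ratios = [
    ("ln(ln x) / x^(1/2)", lambda x: math.log(math.log(x)) / math.sqrt(x)),
    ("x^(1/2) / x",        lambda x: math.sqrt(x) / x),
    ("x / x^2",            lambda x: x / x**2),
    ("x^2 / e^x",          lambda x: x**2 / math.exp(x)),
]

for name, r in ratios:
    print(name, ["%.3g" % r(x) for x in (10.0, 100.0, 700.0)])
# e^x / e^(e^x) underflows ordinary floats almost immediately, so it is omitted here.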

Exponentials and logarithms govern the whole scheme ☆

But do you know where the factorial sits??

Rate of growth and approximations for large n

 Plot of the natural logarithm of the factorial

As n grows, the factorial n! increases faster than all polynomials and exponential functions (but slower than double exponential functions) in n.

Most approximations for n! are based on approximating its natural logarithm

\ln n!=\sum_{x=1}^{n}\ln x.

The graph of the function f(n) = ln n! is shown in the figure above. It looks approximately linear for all reasonable values of n, but this intuition is false. We get one of the simplest approximations for ln n! by bounding the sum with integrals from above and below, as follows:

\int_{1}^{n}\ln x\,dx\leq \sum_{x=1}^{n}\ln x\leq \int_{0}^{n}\ln(x+1)\,dx

which gives us the estimate

n\ln\left(\frac{n}{e}\right)+1\leq \ln n!\leq (n+1)\ln\left(\frac{n+1}{e}\right)+1.

Hence \ln n!\sim n\ln n (see Big O notation). This result plays a key role in the analysis of the computational complexity of sorting algorithms (see comparison sort). From the bounds on ln n! deduced above we get that

e\left({\frac {n}{e}}\right)^{n}\leq n!\leq e\left({\frac {n+1}{e}}\right)^{n+1}.

It is sometimes practical to use weaker but simpler estimates. Using the above formula it is easily shown that for all n we have  (n/3)^{n}<n!, and for all n ≥ 6 we have  n!<(n/2)^{n}.
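These bounds are easy to verify numerically; a short check (my own sketch, using the log-gamma function for ln n!):

import math

# Verify n ln(n/e) + 1 <= ln n! <= (n+1) ln((n+1)/e) + 1
# and the weaker estimates (n/3)^n < n! < (n/2)^n (the upper one for n >= 6).
for n in (6, 10, 50, 100):
    ln_fact = math.lgamma(n + 1)                       # ln n!
    lower = n * math.log(n / math.e) + 1
    upper = (n + 1) * math.log((n + 1) / math.e) + 1
    assert lower <= ln_fact <= upper
    assert n * math.log(n / 3) < ln_fact < n * math.log(n / 2)
    print("n = %3d: %8.3f <= ln n! = %8.3f <= %8.3f" % (n, lower, ln_fact, upper))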

For large n we get a better estimate for the number n! using Stirling’s approximation:

  n!\sim {\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}.

This in fact comes from an asymptotic series for the logarithm, and n factorial lies between this and the next approximation:

  {\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}<n!<{\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}e^{\frac {1}{12n}}.

Another approximation for ln n! was given by Srinivasa Ramanujan (Ramanujan 1988):

\ln n!\approx n\ln n-n+\frac{\ln(n(1+4n(1+2n)))}{6}+\frac{\ln(\pi)}{2}

or

n!\approx {\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}[1+1/(2n)+1/(8n^{2})]^{1/6}.

Both this and \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}e^{\frac{1}{12n}} give a relative error on the order of 1/n^3, but Ramanujan's is about four times more accurate. However, if we use two correction terms (as in Ramanujan's approximation) the relative error will be of order 1/n^5:

n!\approx {\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}\exp \left({{\frac {1}{12n}}-{\frac {1}{360n^{3}}}}\right)
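For a concrete feel for the accuracy, here is a small comparison (my own sketch) of the relative errors of Stirling's formula, the one-term correction e^{1/(12n)}, Ramanujan's approximation, and the two-term correction above:

import math

def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

def stirling_one(n):          # one correction term: e^(1/(12n))
    return stirling(n) * math.exp(1 / (12 * n))

def ramanujan(n):             # Ramanujan's approximation
    return stirling(n) * (1 + 1 / (2 * n) + 1 / (8 * n**2)) ** (1 / 6)

def stirling_two(n):          # two correction terms: e^(1/(12n) - 1/(360 n^3))
    return stirling(n) * math.exp(1 / (12 * n) - 1 / (360 * n**3))

for n in (5, 10, 20, 50):
    exact = math.factorial(n)
    errors = ["%s %.1e" % (f.__name__, abs(f(n) - exact) / exact)
              for f in (stirling, stirling_one, ramanujan, stirling_two)]
    print("n =", n, "|", "  ".join(errors))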

【Note: x^x = e^{x\ln(x)} \prec e^{x^2}】
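With \ln n!\sim n\ln n in hand, the factorial's place in the chain of scales can be read off (a short derivation of my own; here c is any constant greater than 1):

c^n \prec n! \prec n^n = e^{n\ln n} \prec e^{n^2},

since, taking logarithms, n\ln c \prec \ln n! \sim n\ln n \prec n^2.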

 

The principle of asymptotics now stands revealed!!

One picture

 

One table

Orders of common functions

Here is a list of classes of functions that are commonly encountered when analyzing the running time of an algorithm. In each case, c is a positive constant and n increases without bound. The slower-growing functions are generally listed first.

O(1)  (constant): Determining if a binary number is even or odd; calculating (-1)^n; using a constant-size lookup table.

O(\log\log n)  (double logarithmic): Number of comparisons spent finding an item using interpolation search in a sorted array of uniformly distributed values.

O(\log n)  (logarithmic): Finding an item in a sorted array with a binary search or a balanced search tree, as well as all operations in a binomial heap.

O((\log n)^{c}), c>1  (polylogarithmic): Matrix chain ordering can be solved in polylogarithmic time on a parallel random-access machine.

O(n^{c}), 0<c<1  (fractional power): Searching in a k-d tree.

O(n)  (linear): Finding an item in an unsorted list or in an unsorted array; adding two n-bit integers by ripple carry.

O(n\log^{*}n)  (n log-star n): Performing triangulation of a simple polygon using Seidel's algorithm, or the union–find algorithm. Note that \log^{*}(n)=\begin{cases}0,&\text{if }n\leq 1\\1+\log^{*}(\log n),&\text{if }n>1\end{cases}

O(n\log n)=O(\log n!)  (linearithmic, loglinear, or quasilinear): Performing a fast Fourier transform; fastest possible comparison sort; heapsort and merge sort.

O(n^{2})  (quadratic): Multiplying two n-digit numbers by a simple algorithm; simple sorting algorithms, such as bubble sort, selection sort, and insertion sort; bound on some usually faster sorting algorithms such as quicksort, Shellsort, and tree sort.

O(n^{c})  (polynomial or algebraic): Tree-adjoining grammar parsing; maximum matching for bipartite graphs; finding the determinant with LU decomposition.

L_{n}[\alpha,c]=e^{(c+o(1))(\ln n)^{\alpha}(\ln\ln n)^{1-\alpha}}, 0<\alpha<1  (L-notation or sub-exponential): Factoring a number using the quadratic sieve or number field sieve.

O(c^{n}), c>1  (exponential): Finding the (exact) solution to the travelling salesman problem using dynamic programming; determining if two logical statements are equivalent using brute-force search.

O(n!)  (factorial): Solving the travelling salesman problem via brute-force search; generating all unrestricted permutations of a poset; finding the determinant with Laplace expansion; enumerating all partitions of a set.

The statement f(n)=O(n!) is sometimes weakened to f(n)=O\left(n^{n}\right) to derive simpler formulas for asymptotic complexity. For any k>0 and c>0, O(n^{c}(\log n)^{k}) is a subset of O(n^{c+\varepsilon}) for any \varepsilon>0, so it may be regarded as a polynomial of some slightly larger order.
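A quick numerical illustration of the last claim (my own sketch, with the illustrative choices c = 1, k = 3, \varepsilon = 0.1): the ratio n^{c}(\log n)^{k}/n^{c+\varepsilon} eventually tends to 0, although only slowly.

import math

# Illustrative constants: c = 1, k = 3, epsilon = 0.1.  The n^c factor cancels,
# so the ratio n^c (log n)^k / n^(c+eps) reduces to (log n)^k / n^eps.
k, eps = 3, 0.1

for e in (3, 12, 48, 120, 300):
    n = 10.0 ** e
    ratio = math.log(n) ** k / n ** eps
    print("n = 1e%-3d  (log n)^k / n^eps = %.3g" % (e, ratio))
# The ratio peaks around log n = k/eps and then decays to 0,
# so O(n^c (log n)^k) is contained in O(n^(c+eps)).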

 

The meaning shows itself!!??