STEM Essay: Classical Mechanics: Rotors [V] ‧ "Circuit Theory" 5 [Inductance] IV ‧ Motors ‧ 2 ‧ E

[Figure: Focal stability]

[Figure: Pendulum phase portrait]

[Figure: Classical Hamiltonian flow]

[Figure: Limit cycle]

A body of mass m, with initial position x_0 and initial velocity v_0, moves along the x axis. By Newton's second law of motion, its motion satisfies a second-order differential equation:

F = m \cdot a = m \cdot \frac{d^2 x}{dt^2}

In general, apart from a few special forms of the force \vec{F} (for example the linear restoring force of simple harmonic motion, F = -k \cdot x), the differential equation seldom has an exact, closed-form solution, and one usually has to solve it by numerical analysis. Is there, then, another way to describe the motion? Poincaré, Boltzmann, and others developed the idea of phase space: once a body's initial position and initial velocity are given (the velocity usually expressed as the momentum p = m \cdot v), its trajectory is fixed by Newton's second law of motion. Phase space is the coordinate system formed by the pairs (position, momentum), so the body's motion draws a curve in phase space, called its phase diagram, or phase portrait. In general this curve does not intersect itself, for an intersection would mean that several different trajectories are available from the same state; if the curve does close on itself, the motion can only be periodic. While studying phase portraits of the three-body problem, Poincaré found that an extremely small change in the initial point, whether in position or in momentum, can change the portrait drastically. Such sensitivity may render a system unpredictable, or unstable. Is our solar system, then, stable??
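The phase-space picture can be sketched numerically. Below is a minimal pure-Python illustration (all names are ours, not from the text) that traces the (x, p) trajectory of a simple harmonic oscillator with a classical fourth-order Runge–Kutta step; the orbit is a closed ellipse because the energy H = p²/(2m) + kx²/2 is conserved.

```python
# A minimal sketch (illustrative names): trace the phase-space curve (x, p)
# of a simple harmonic oscillator, F = -k x, with a 4th-order Runge-Kutta step.

def rk4_step(f, state, dt):
    """Advance state' = f(state) by one RK4 step."""
    k1 = f(state)
    k2 = f([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = f([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = f([s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

m, k = 1.0, 1.0                      # mass and spring constant (made up)

def oscillator(state):
    x, p = state
    return [p / m, -k * x]           # dx/dt = p/m, dp/dt = F = -k x

state = [1.0, 0.0]                   # initial point (x0, p0) in phase space
for _ in range(1000):                # integrate up to t = 10
    state = rk4_step(oscillator, state, 0.01)

# The phase curve closes on itself: H = p^2/(2m) + k x^2/2 stays constant.
x, p = state
print(abs(p * p / (2 * m) + k * x * x / 2 - 0.5) < 1e-6)
```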

 

Modern chaos theory (Chaos theory) describes how a nonlinear system, under certain parameter conditions, undergoes bifurcation; periodic and aperiodic motions can become entangled with each other, leading toward a theory of motion that is aperiodic and yet ordered. Chaos theory is thus a way of thinking that combines qualitative and quantitative analysis, used to study dynamical systems whose behaviour cannot be explained or described by isolated, single-instant data, but only by continuous data on the system as a whole.

[Figure: Double compound pendulum, dimensioned]

[Figure: Double compound pendulum in motion]

[Figure: Double pendulum, long-exposure trace]

[Figure: Double pendulum flips graph]

The word 'chaos' comes from the ancient Greek philosophers, who held that the universe began in a state of disorder, and that the well-ordered world of today gradually took shape out of that primordial chaos. This doctrine of chaos says:

The initial state of every thing is just a collection of seemingly unrelated fragments; yet when the chaotic process ends, these fragments assemble themselves, spontaneously and in order, into a whole.

The figures at left show the motion of a double pendulum; when the system's total energy takes certain values the motion is chaotic. If one tries to solve it numerically, this data sensitivity (called ill-conditioning) is, from the standpoint of numerical analysis and program design, usually met with repeated convergence tests of several kinds; otherwise it is hard to say what the computed results actually mean. Readers interested in the physics, the mathematics, or the programming may consult:

Double Pendulum

Double Pendulum Demonstration
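The convergence testing mentioned above can be sketched on a simpler, well-conditioned cousin of the double pendulum. The code below (our own sketch, not the linked demos) integrates a plain pendulum θ'' = −sin θ with RK4 at step h and at h/2 and compares the results; for a well-conditioned problem the two resolutions agree closely, whereas for an ill-conditioned chaotic system like the double pendulum the same comparison keeps failing over long time spans.

```python
# A sketch of a step-halving "convergence test" (our own names): integrate a
# plain pendulum theta'' = -sin(theta) with RK4 at two step sizes and compare.
import math

def integrate(theta0, omega0, h, t_end):
    """RK4-integrate the pendulum; return the final angle theta(t_end)."""
    def deriv(th, om):
        return om, -math.sin(th)
    th, om = theta0, omega0
    for _ in range(int(round(t_end / h))):
        k1 = deriv(th, om)
        k2 = deriv(th + 0.5 * h * k1[0], om + 0.5 * h * k1[1])
        k3 = deriv(th + 0.5 * h * k2[0], om + 0.5 * h * k2[1])
        k4 = deriv(th + h * k3[0], om + h * k3[1])
        th += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        om += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return th

coarse = integrate(0.5, 0.0, 0.01, 10.0)    # step h
fine   = integrate(0.5, 0.0, 0.005, 10.0)   # step h/2
print(abs(coarse - fine) < 1e-6)            # well-conditioned: answers agree
```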

Because of the determinism of Newtonian mechanics, the French mathematician Pierre-Simon, marquis de Laplace once imagined a demon, 'Laplace's demon' (Démon de Laplace), whose intellect, knowing the exact position and momentum of every atom in the universe, could use Newton's laws to work out the complete course of all events, past, present, and future. What would become of this demon once it met chaos? Perhaps it would ...... re-organize ......

[Figure: Bénard convection pattern]

[Figure: Bénard convection in a square container]

[Figure: Bénard convection in a circular container]

The phenomenon of 'self-organization' is among the most surprising discoveries of modern nonlinear science and non-equilibrium thermodynamics:

It is a process of internal organization within a system, usually an open one, which increases its own complexity without being guided or managed by an outside source. Self-organization is a process in which some form of global order or coordination arises out of the local interactions between the parts of an initially disordered system. The process is spontaneous: it is not directed or controlled by any agent, or by any subsystem inside or outside the system.

In 1900 the French physicist Henri Claude Bénard demonstrated in his doctoral thesis that heating a thin layer of liquid gradually produces convection patterns. This was the first famous experiment on nonlinear self-organization.

─── What is ordered will drift into disorder,

yet organization arises anew out of chaos;

nature's subtleties never end. ───

─── from "Chaos Theory"

 

Who would have thought that the so-called 'qualitative analysis':

Qualitative theory of differential equations

In mathematics, the qualitative theory of differential equations studies the behavior of differential equations by means other than finding their solutions. It originated from the works of Henri Poincaré and Aleksandr Lyapunov. There are relatively few differential equations that can be solved explicitly, but using tools from analysis and topology, one can “solve” them in the qualitative sense, obtaining much information about their properties.[1]

 

is not easy at all!

Even when one can find an 'equilibrium'

Equilibrium point

In mathematics, specifically in differential equations, an equilibrium point is a constant solution to a differential equation.

Formal definition

The point {\tilde{\mathbf{x}}} \in \mathbb{R}^n is an equilibrium point for the differential equation

\frac{d\mathbf{x}}{dt} = \mathbf{f}(t, \mathbf{x})

if \mathbf{f}(t, {\tilde{\mathbf{x}}}) = 0 for all t.

Similarly, the point {\tilde{\mathbf{x}}} \in \mathbb{R}^n is an equilibrium point (or fixed point) for the difference equation

\mathbf{x}_{k+1} = \mathbf{f}(k, \mathbf{x}_k)

if \mathbf{f}(k, {\tilde{\mathbf{x}}}) = {\tilde{\mathbf{x}}} for k = 0, 1, 2, \ldots.

Classification

Equilibria can be classified by looking at the signs of the eigenvalues of the linearization of the equations about the equilibria. That is to say, by evaluating the Jacobian matrix at each of the equilibrium points of the system, and then finding the resulting eigenvalues, the equilibria can be categorized. Then the behavior of the system in the neighborhood of each equilibrium point can be qualitatively determined, (or even quantitatively determined, in some instances), by finding the eigenvector(s) associated with each eigenvalue.

An equilibrium point is hyperbolic if none of the eigenvalues have zero real part. If all eigenvalues have negative real part, the equilibrium is a stable node. If all eigenvalues have positive real part, the equilibrium is an unstable node. If at least one eigenvalue has negative real part and at least one has positive real part, the equilibrium is a saddle point.
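For a planar system the classification above can be carried out directly: the Jacobian is 2×2 and its eigenvalues are the roots of λ² − (tr J)λ + det J = 0. A small illustrative sketch (function and label names are ours):

```python
# Classify a planar equilibrium from the 2x2 Jacobian evaluated there,
# using the characteristic polynomial lambda^2 - tr*lambda + det = 0.
import cmath

def classify(J):
    (a, b), (c, d) = J
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    l1, l2 = (tr + disc) / 2, (tr - disc) / 2   # the two eigenvalues
    re = [l1.real, l2.real]
    if all(r < 0 for r in re):
        return "stable node/focus"
    if all(r > 0 for r in re):
        return "unstable node/focus"
    if re[0] * re[1] < 0:
        return "saddle point"
    return "non-hyperbolic (zero real part)"

print(classify([[-1, 0], [0, -2]]))   # eigenvalues -1, -2 → stable node/focus
print(classify([[2, 0], [0, -1]]))    # eigenvalues 2, -1 → saddle point
```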

 

※ Reference:

Continuing from the previous post,

v_a(t) = L_a \frac{d i_a}{dt} + R_a i_a + e

T_M = K \ i_e \ i_a = K_e \ i_a

e = K \ i_e \ \omega = K_e \ \omega

T_M - T_L = J \frac{d \omega}{dt} + F \ \omega

Setting \frac{d \omega}{dt} = 0 and \frac{d i_a}{dt} = 0, the equilibrium point follows at once.
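As a sketch of that steady-state computation: with the derivatives set to zero, the motor equations above reduce to two linear algebraic equations in (i_a, ω), namely v_a = R_a i_a + K_e ω and K_e i_a = T_L + F ω, solvable by Cramer's rule. The parameter values below are invented purely for illustration.

```python
# Steady state of the DC motor model above (d(omega)/dt = 0, d(i_a)/dt = 0):
#   v_a = R_a * i_a + K_e * omega     (electrical side, e = K_e * omega)
#   K_e * i_a = T_L + F * omega       (mechanical side, T_M = K_e * i_a)
# All numeric values are hypothetical, chosen only to exercise the formulas.

def steady_state(v_a, R_a, K_e, F, T_L):
    """Solve the 2x2 linear system for (i_a, omega) by Cramer's rule."""
    det = R_a * F + K_e * K_e
    i_a = (v_a * F + K_e * T_L) / det
    omega = (v_a * K_e - R_a * T_L) / det
    return i_a, omega

i_a, omega = steady_state(v_a=24.0, R_a=2.0, K_e=0.5, F=0.1, T_L=1.0)
# Verify both equations hold at the computed operating point.
print(abs(24.0 - (2.0 * i_a + 0.5 * omega)) < 1e-9)
print(abs(0.5 * i_a - (1.0 + 0.1 * omega)) < 1e-9)
```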

 

it may yet fail to be 'stable' ☺☻

Stability theory

In mathematics, stability theory addresses the stability of solutions of differential equations and of trajectories of dynamical systems under small perturbations of initial conditions. The heat equation, for example, is a stable partial differential equation because small perturbations of initial data lead to small variations in temperature at a later time as a result of the maximum principle. In partial differential equations one may measure the distances between functions using Lp norms or the sup norm, while in differential geometry one may measure the distance between spaces using the Gromov–Hausdorff distance.

In dynamical systems, an orbit is called Lyapunov stable if the forward orbit of any point in a small enough neighborhood of it stays in a small (but perhaps, larger) neighborhood. Various criteria have been developed to prove stability or instability of an orbit. Under favorable circumstances, the question may be reduced to a well-studied problem involving eigenvalues of matrices. A more general method involves Lyapunov functions. In practice, any one of a number of different stability criteria is applied.

[Figure: Stability diagram]

Overview in dynamical systems

Many parts of the qualitative theory of differential equations and dynamical systems deal with asymptotic properties of solutions and the trajectories—what happens with the system after a long period of time. The simplest kind of behavior is exhibited by equilibrium points, or fixed points, and by periodic orbits. If a particular orbit is well understood, it is natural to ask next whether a small change in the initial condition will lead to similar behavior. Stability theory addresses the following questions: Will a nearby orbit indefinitely stay close to a given orbit? Will it converge to the given orbit? (The latter is a stronger property.) In the former case, the orbit is called stable; in the latter case, it is called asymptotically stable and the given orbit is said to be attracting.

An equilibrium solution f_{e} to an autonomous system of first order ordinary differential equations is called:

  • stable if for every (small) \epsilon > 0 there exists a \delta > 0 such that every solution f(t) whose initial condition is within distance \delta of the equilibrium, i.e. \|f(t_{0}) - f_{e}\| < \delta, remains within distance \epsilon, i.e. \|f(t) - f_{e}\| < \epsilon, for all t \geq t_{0}.
  • asymptotically stable if it is stable and, in addition, there exists \delta_{0} > 0 such that whenever \|f(t_{0}) - f_{e}\| < \delta_{0} then f(t) \rightarrow f_{e} as t \rightarrow \infty.

Stability means that the trajectories do not change too much under small perturbations. The opposite situation, where a nearby orbit is getting repelled from the given orbit, is also of interest. In general, perturbing the initial state in some directions results in the trajectory asymptotically approaching the given one and in other directions to the trajectory getting away from it. There may also be directions for which the behavior of the perturbed orbit is more complicated (neither converging nor escaping completely), and then stability theory does not give sufficient information about the dynamics.

One of the key ideas in stability theory is that the qualitative behavior of an orbit under perturbations can be analyzed using the linearization of the system near the orbit. In particular, at each equilibrium of a smooth dynamical system with an n-dimensional phase space, there is a certain n×n matrix A whose eigenvalues characterize the behavior of the nearby points (Hartman–Grobman theorem). More precisely, if all eigenvalues are negative real numbers or complex numbers with negative real parts then the point is a stable attracting fixed point, and the nearby points converge to it at an exponential rate, cf Lyapunov stability and exponential stability. If none of the eigenvalues are purely imaginary (or zero) then the attracting and repelling directions are related to the eigenspaces of the matrix A with eigenvalues whose real part is negative and, respectively, positive. Analogous statements are known for perturbations of more complicated orbits.

Stability of fixed points

The simplest kind of an orbit is a fixed point, or an equilibrium. If a mechanical system is in a stable equilibrium state then a small push will result in a localized motion, for example, small oscillations as in the case of a pendulum. In a system with damping, a stable equilibrium state is moreover asymptotically stable. On the other hand, for an unstable equilibrium, such as a ball resting on top of a hill, certain small pushes will result in a motion with a large amplitude that may or may not converge to the original state.

There are useful tests of stability for the case of a linear system. Stability of a nonlinear system can often be inferred from the stability of its linearization.

Maps

Let f: RR be a continuously differentiable function with a fixed point a, f(a) = a. Consider the dynamical system obtained by iterating the function f:

x_{n+1} = f(x_n), \quad n = 0, 1, 2, \ldots.

The fixed point a is stable if the absolute value of the derivative of f at a is strictly less than 1, and unstable if it is strictly greater than 1. This is because near the point a, the function f has a linear approximation with slope f’(a):

f(x) \approx f(a) + f'(a)(x - a).

Thus

\frac{x_{n+1} - f(a)}{x_n - a} = \frac{f(x_n) - f(a)}{x_n - a} \approx \frac{f'(a)(x_n - a)}{x_n - a} = f'(a),

which means that the derivative measures the rate at which the successive iterates approach the fixed point a or diverge from it. If the derivative at a is exactly 1 or −1, then more information is needed in order to decide stability.
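A classical concrete case of this derivative test (a standard textbook example, not from the text above) is the logistic map f(x) = r x (1 − x): its nonzero fixed point is a = 1 − 1/r and f′(x) = r (1 − 2x), so f′(a) = 2 − r.

```python
# Derivative test at a fixed point of the logistic map f(x) = r x (1 - x).
# For r = 2.8 the nonzero fixed point is a = 1 - 1/r and |f'(a)| = |2 - r| = 0.8,
# which is < 1, so nearby iterates must converge to a.

r = 2.8
f = lambda x: r * x * (1 - x)
a = 1 - 1 / r                 # fixed point: f(a) = a

x = 0.3                       # start some distance away from a
for _ in range(200):          # |x_n - a| shrinks roughly like 0.8^n
    x = f(x)

print(abs(x - a) < 1e-9)      # the iterates have converged to the fixed point
```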

There is an analogous criterion for a continuously differentiable map f: RnRn with a fixed point a, expressed in terms of its Jacobian matrix at a, Ja(f). If all eigenvalues of Ja(f) are real or complex numbers with absolute value strictly less than 1 then a is a stable fixed point; if at least one of them has absolute value strictly greater than 1 then a is unstable. Just as for n = 1, the case of the largest absolute value being 1 needs to be investigated further; the Jacobian matrix test is inconclusive. The same criterion holds more generally for diffeomorphisms of a smooth manifold.

Linear autonomous systems

The stability of fixed points of a system of constant coefficient linear differential equations of first order can be analyzed using the eigenvalues of the corresponding matrix.

An autonomous system

x' = Ax,

where x(t) ∈ Rn and A is an n×n matrix with real entries, has a constant solution

x(t) = 0.

(In a different language, the origin 0 ∈ Rn is an equilibrium point of the corresponding dynamical system.) This solution is asymptotically stable as t → ∞ (“in the future”) if and only if for all eigenvalues λ of A, Re(λ) < 0. Similarly, it is asymptotically stable as t → −∞ (“in the past”) if and only if for all eigenvalues λ of A, Re(λ) > 0. If there exists an eigenvalue λ of A with Re(λ) > 0 then the solution is unstable for t → ∞.

Application of this result in practice, in order to decide the stability of the origin for a linear system, is facilitated by the Routh–Hurwitz stability criterion. The eigenvalues of a matrix are the roots of its characteristic polynomial. A polynomial in one variable with real coefficients is called a Hurwitz polynomial if the real parts of all roots are strictly negative. The Routh–Hurwitz theorem implies a characterization of Hurwitz polynomials by means of an algorithm that avoids computing the roots.
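The Routh–Hurwitz idea can be sketched directly: build the Routh array from the polynomial's coefficients and check that the first column keeps one sign. The code below is our own textbook-style implementation, without the special handling that a zero first-column entry or a zero row would require.

```python
# Our own simplified Routh array: decide whether all roots of a real
# polynomial lie strictly in the left half-plane, without computing roots.

def is_hurwitz(coeffs):
    """coeffs = [a_n, ..., a_0], highest power first, with a_n > 0."""
    if any(c <= 0 for c in coeffs):         # all coefficients positive is necessary
        return False
    r1 = [float(c) for c in coeffs[0::2]]   # Routh row for s^n
    r2 = [float(c) for c in coeffs[1::2]]   # Routh row for s^(n-1)
    while r2:
        if r2[0] <= 0:                      # sign change in the first column
            return False
        nxt = []
        for i in range(len(r1) - 1):
            right = r2[i + 1] if i + 1 < len(r2) else 0.0
            nxt.append((r2[0] * r1[i + 1] - r1[0] * right) / r2[0])
        r1, r2 = r2, nxt
    return True

print(is_hurwitz([1, 2, 3, 1]))   # s^3 + 2 s^2 + 3 s + 1: Hurwitz → True
print(is_hurwitz([1, 1, 1, 2]))   # s^3 + s^2 + s + 2: not Hurwitz → False
```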

Non-linear autonomous systems

Asymptotic stability of fixed points of a non-linear system can often be established using the Hartman–Grobman theorem.

Suppose that v is a C1 vector field in Rn which vanishes at a point p, v(p) = 0. Then the corresponding autonomous system

x' = v(x)

has a constant solution

x(t) = p.

Let Jp(v) be the n×n Jacobian matrix of the vector field v at the point p. If all eigenvalues of Jp(v) have strictly negative real part then the solution is asymptotically stable. This condition can be tested using the Routh–Hurwitz criterion.

 

and how could one forget 'oscillation' ★☆

Periodic point

In mathematics, in the study of iterated functions and dynamical systems, a periodic point of a function is a point which the system returns to after a certain number of function iterations or a certain amount of time.

Iterated functions

Given an endomorphism f on a set X

f: X \to X

a point x in X is called a periodic point if there exists an n such that

f_{n}(x) = x

where f_{n} is the nth iterate of f. The smallest positive integer n satisfying the above is called the prime period or least period of the point x. If every point in X is a periodic point with the same period n, then f is called periodic with period n.

If there exist distinct n and m such that

f_{n}(x) = f_{m}(x)

then x is called a preperiodic point. All periodic points are preperiodic.

If f is a diffeomorphism of a differentiable manifold, so that the derivative f_{n}^{\prime} is defined, then one says that a periodic point is hyperbolic if

|f_{n}^{\prime}| \neq 1,

that it is attractive if

|f_{n}^{\prime}| < 1,

and it is repelling if

|f_{n}^{\prime}| > 1.

If the dimension of the stable manifold of a periodic point or fixed point is zero, the point is called a source; if the dimension of its unstable manifold is zero, it is called a sink; and if both the stable and unstable manifold have nonzero dimension, it is called a saddle or saddle point.
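As a worked instance of these definitions (a standard example, not drawn from the text): the logistic map f(x) = r x (1 − x) at r = 3.2 has an attracting cycle of period 2, whose two points are known in closed form, and the relevant derivative is that of the second iterate, (f∘f)′(x) = f′(f(x)) f′(x).

```python
# Attractive periodic point test for the logistic map at r = 3.2: locate the
# period-2 cycle and check that the second-iterate derivative has modulus < 1.
import math

r = 3.2
f  = lambda x: r * x * (1 - x)
df = lambda x: r * (1 - 2 * x)          # f'(x)

# The two points of the 2-cycle (closed form for the logistic map).
s = math.sqrt((r - 3) * (r + 1))
x1 = (r + 1 + s) / (2 * r)
x2 = (r + 1 - s) / (2 * r)

print(abs(f(f(x1)) - x1) < 1e-9)        # x1 is a periodic point of period 2
multiplier = df(x1) * df(x2)            # derivative of f∘f along the cycle
print(abs(multiplier) < 1)              # modulus < 1: the cycle is attractive
```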