W!o+'s 《小伶鼬工坊演義》: Neural Networks 【MNIST】 II

Of course, we could come to understand the 'handwritten digit' recognition 'neural network' program purely through 'mathematics' or 'abstract algorithms'. But if we could supplement that with

Data visualization

Data visualization is the study of the visual representation of data, where a visual representation of data is defined as information abstracted in some schematic form, including the attributes and variables of the corresponding units of information[1]

……

History

The field of data visualization traces its origins to the early days of computer graphics in the 1950s, when people used computers to create the first graphs and charts. In 1987, the U.S. National Science Foundation report Visualization in Scientific Computing, edited by Bruce McCormick, Thomas DeFanti and Maxine Brown[4], gave the field a major boost. The report stressed the need for new, computer-based visualization techniques. As computing power grew rapidly, people built ever larger and more complex numerical models, producing enormous numerical data sets of every kind. At the same time, large data sets were generated not only by data-acquisition devices such as medical scanners and microscopes, but also by large databases collecting text, numerical and multimedia information. Advanced computer-graphics techniques and methods were therefore needed to process and visualize these massive data sets[3]

The phrase "Visualization in Scientific Computing" later became "Scientific Visualization"; the former originally referred to visualization as a component of scientific computing: the use of computer modeling and simulation in scientific and engineering practice. More recently, visualization has increasingly focused on data, including large heterogeneous data collections from business, finance, public administration, digital media and so on. In the early 1990s a new research field called "information visualization" emerged, aimed at supporting the analysis of abstract, heterogeneous data sets in many application areas. The newer term "data visualization", which encompasses both scientific visualization and information visualization, is therefore gradually gaining acceptance[3]

Since then, data visualization has been an evolving concept whose boundaries keep expanding; it is therefore best defined broadly. Data visualization refers to the more technically advanced methods that allow visual interpretation of data through the representation, modeling and display of solids, surfaces, properties and animations, drawing on graphics, image processing, computer vision and user interfaces. Compared with special techniques such as solid modeling, data visualization covers a far wider range of methods[5]

[Figure: 1280px-Minard]

Infographic of Napoleon's invasion of Russia, drawn in 1861 by the French engineer Charles Joseph Minard

───

 

wouldn't that be more vivid? It might even let the imagination take flight!! Surely this is the old sense of 'a picture is worth a thousand words'!!

Since a short text cannot do justice to matplotlib's powerful plotting abilities, we can only fall back on the relevant tutorials:

matplotlib

Pyplot tutorial

matplotlib.pyplot is a collection of command style functions that make matplotlib work like MATLAB. Each pyplot function makes some change to a figure: e.g., creates a figure, creates a plotting area in a figure, plots some lines in a plotting area, decorates the plot with labels, etc. In matplotlib.pyplot various states are preserved across function calls, so that it keeps track of things like the current figure and plotting area, and the plotting functions are directed to the current axes (please note that “axes” here and in most places in the documentation refers to the axes part of a figure and not the strict mathematical term for more than one axis).

───
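The stateful, MATLAB-like interface the tutorial describes can be seen in a few lines. A minimal sketch (the non-interactive Agg backend and the file name `pyplot_demo.png` are my choices so the snippet runs headless; interactively one would call `plt.show()` instead of saving):

```python
import matplotlib
matplotlib.use("Agg")                    # render off-screen instead of opening a window
import matplotlib.pyplot as plt

plt.figure()                             # creates the "current figure"
plt.plot([1, 2, 3, 4], [1, 4, 9, 16])   # plots into the current axes
plt.xlabel('x')                          # decorates those same current axes
plt.ylabel('x squared')
plt.title('pyplot keeps state between calls')
plt.savefig('pyplot_demo.png')
```

Every call operates on the "current" figure and axes that pyplot tracks behind the scenes, which is exactly the state-keeping the tutorial mentions.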

 

Image tutorial

Startup commands

First, let’s start IPython. It is a most excellent enhancement to the standard Python prompt, and it ties in especially well with Matplotlib. Start IPython either at a shell, or the IPython Notebook now.

With IPython started, we now need to connect to a GUI event loop. This tells IPython where (and how) to display plots. To connect to a GUI loop, execute the %matplotlib magic at your IPython prompt. There’s more detail on exactly what this does at IPython’s documentation on GUI event loops.

If you’re using IPython Notebook, the same commands are available, but people commonly use a specific argument to the %matplotlib magic:

───
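The "specific argument" commonly passed in the IPython Notebook is `inline`, which embeds figures directly below the cell instead of opening a GUI window. A typical session might look like this (the plotted values are arbitrary):

```ipython
In [1]: %matplotlib inline

In [2]: import matplotlib.pyplot as plt

In [3]: plt.plot([1, 2, 3])   # the figure appears inline, below the cell
```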

 

And so, shall we set out on a journey of 'exploring the program' and 'program exploration'!!??

Suppose we carry out the steps as Michael Nielsen describes:

pi@raspberrypi ~ $ git clone https://github.com/mnielsen/neural-networks-and-deep-learning.git
Cloning into 'neural-networks-and-deep-learning'...
remote: Counting objects: 1141, done.
remote: Total 1141 (delta 0), reused 0 (delta 0), pack-reused 1141
Receiving objects: 100% (1141/1141), 20.33 MiB | 4.30 MiB/s, done.
Resolving deltas: 100% (574/574), done.
Checking connectivity... done.
pi@raspberrypi ~ $ cd neural-networks-and-deep-learning/
pi@raspberrypi ~/neural-networks-and-deep-learning $ cd src
pi@raspberrypi ~/neural-networks-and-deep-learning/src $ python
Python 2.7.9 (default, Mar  8 2015, 00:52:26)
[GCC 4.9.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import mnist_loader
>>> training_data, validation_data, test_data = \
... mnist_loader.load_data_wrapper()
>>> import network
>>> net = network.Network([784, 30, 10])
>>> net.SGD(training_data, 30, 10, 3.0, test_data=test_data)
Epoch 0: 9013 / 10000
Epoch 1: 9181 / 10000
Epoch 2: 9277 / 10000
Epoch 3: 9333 / 10000
Epoch 4: 9374 / 10000
Epoch 5: 9374 / 10000
Epoch 6: 9375 / 10000
Epoch 7: 9369 / 10000
Epoch 8: 9415 / 10000
Epoch 9: 9435 / 10000
Epoch 10: 9428 / 10000
Epoch 11: 9423 / 10000
Epoch 12: 9452 / 10000
Epoch 13: 9442 / 10000
Epoch 14: 9434 / 10000
Epoch 15: 9453 / 10000
Epoch 16: 9480 / 10000
Epoch 17: 9468 / 10000
Epoch 18: 9461 / 10000
Epoch 19: 9483 / 10000
Epoch 20: 9462 / 10000
Epoch 21: 9477 / 10000
Epoch 22: 9474 / 10000
Epoch 23: 9491 / 10000
Epoch 24: 9474 / 10000
Epoch 25: 9487 / 10000
Epoch 26: 9453 / 10000
Epoch 27: 9462 / 10000
Epoch 28: 9472 / 10000
Epoch 29: 9482 / 10000
>>>

After about fifty minutes, training completes on a Raspberry Pi 3! Perhaps the author's luck was poor; the best score was only 9491 / 10000??
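For reference, the call `net.SGD(training_data, 30, 10, 3.0, test_data=test_data)` asks for 30 epochs, mini-batches of 10 examples, and learning rate eta = 3.0. A minimal sketch of that mini-batch SGD loop, on a toy one-parameter least-squares problem of my own invention (the data and the cost are stand-ins, not Nielsen's network):

```python
import numpy as np

rng = np.random.default_rng(0)
xs = rng.normal(size=200)          # toy inputs standing in for the 50,000 images
ys = 2.0 * xs                      # targets; the "true" weight is 2.0
w = 0.0                            # single parameter standing in for all weights
epochs, mini_batch_size, eta = 30, 10, 0.1

for epoch in range(epochs):
    idx = rng.permutation(len(xs))              # reshuffle the data each epoch
    for k in range(0, len(xs), mini_batch_size):
        b = idx[k:k + mini_batch_size]          # one mini-batch of indices
        grad = np.mean((w * xs[b] - ys[b]) * xs[b])   # average gradient over batch
        w -= eta * grad                         # gradient step
print(round(w, 3))
```

Each epoch reshuffles the data and steps the parameter against the average gradient of each mini-batch; this is essentially the loop structure that `network.py` applies to its weight matrices and bias vectors.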

If we take our cue from what the Book of Changes says about 'enlightening the young and ignorant':

蒙 ䷃ (Meng, Youthful Ignorance): Success. It is not I who seek the young and ignorant; the young and ignorant seek me. At the first divination I give answer; to ask a second and a third time is importunity, and to importunity I give no answer. It is favorable to be correct and firm.

The Judgment says: Meng: below the mountain there is danger; halting before danger: such is Meng. 'Meng, success', for success comes from acting in timely balance. 'It is not I who seek the young and ignorant; the young and ignorant seek me': their wills respond to each other. 'At the first divination I give answer', by virtue of firmness in the central place. 'To ask a second and a third time is importunity, and to importunity I give no answer', for importunity profanes the teaching. To nourish correctness in the ignorant is the work of the sage.

The Image says: a spring issues forth below the mountain: Meng. Thus the superior man acts with resolve and nurtures his virtue.

then our second run indeed answers to the 'second asking'! Do we really not fear the 'third importunity'!! Perhaps we should hear how Lai Zhide explains it:

The Meng hexagram: Kan below, Gen above

Meng means obscurity. The hexagram places Kan beneath Gen: danger below the mountain; the mountain Gen outside, the water Kan inside. Water is a thing that must flow; meeting the mountain, it stops. Within there is danger and unrest; without, no way forward; not knowing where to go: the image of benighted ignorance. The Sequence of the Hexagrams says, "Zhun is the first birth of things; what is newly born must be ignorant, hence Meng follows", which is why Meng comes after Zhun.

Meng: success. It is not I who seek the young and ignorant; the young and ignorant seek me. At the first divination I give answer; to ask a second and a third time is importunity, and importunity is not auspicious. It is favorable to be correct and firm. (Pronunciation gloss: 告 is read by the fanqie 古毒.)

'Meng, success' means that the ignorant will attain success; ignorance does not last forever. The two phrases beginning 'It is not I who seek the young and ignorant' state the proper principle. 'Second' refers to line four: one yang, two yin; twice two makes four. 'Third' refers to line three. 'Importunity' means troublesome repetition. 'The first divination' is the divination of the lower trigram, which obtains the firm central line. In the hexagram Bi, Kan's firm central line is in the upper trigram, hence 'the second divination'. 'Giving answer' means line two answers line five; 'giving no answer' means line two does not answer lines three and four. Yang is bright and yin is dark, so the nine in the second place dispels the ignorance of the six in the fifth. 'Favorable to be correct and firm' means teaching with correctness.

 

Divining a second time, we obtain
Epoch 0: 9059 / 10000
Epoch 1: 9274 / 10000
Epoch 2: 9330 / 10000
Epoch 3: 9327 / 10000
Epoch 4: 9374 / 10000
Epoch 5: 9426 / 10000
Epoch 6: 9443 / 10000
Epoch 7: 9396 / 10000
Epoch 8: 9417 / 10000
Epoch 9: 9426 / 10000
Epoch 10: 9452 / 10000
Epoch 11: 9457 / 10000
Epoch 12: 9439 / 10000
Epoch 13: 9428 / 10000
Epoch 14: 9413 / 10000
Epoch 15: 9460 / 10000
Epoch 16: 9459 / 10000
Epoch 17: 9467 / 10000
Epoch 18: 9443 / 10000
Epoch 19: 9472 / 10000
Epoch 20: 9467 / 10000
Epoch 21: 9455 / 10000
Epoch 22: 9472 / 10000
Epoch 23: 9454 / 10000
Epoch 24: 9451 / 10000
Epoch 25: 9494 / 10000
Epoch 26: 9460 / 10000
Epoch 27: 9474 / 10000
Epoch 28: 9470 / 10000
Epoch 29: 9437 / 10000
>>> 

 

So why be anxious about the 'result'? By the very meaning of 'random', should it not be 'that which yin and yang cannot fathom is called spirit'?? It is the average over large numbers that is the real law!!

That is, the trained network gives us a classification rate of about 95 percent - 95.42 percent at its peak ("Epoch 28")! That's quite encouraging as a first attempt. I should warn you, however, that if you run the code then your results are not necessarily going to be quite the same as mine, since we'll be initializing our network using (different) random weights and biases. To generate results in this chapter I've taken best-of-three runs.

─── Michael Nielsen
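Since the run-to-run variation comes from the random initial weights and biases, and Nielsen's `network.py` draws them from NumPy's global random generator, one way to make a run repeatable is to fix the global seed before constructing the network. A sketch under that assumption (the seed value is arbitrary):

```python
import numpy as np

np.random.seed(20160320)        # fix the global NumPy seed first
# net = network.Network([784, 30, 10])   # would now get identical initial weights
# A quick check that reseeding pins down the random draws:
a = np.random.randn(3)
np.random.seed(20160320)
b = np.random.randn(3)
print(np.array_equal(a, b))     # True: identical draws after reseeding
```

With the seed fixed, two runs of the training script should produce the same epoch-by-epoch scores, which makes comparisons between code changes meaningful.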

 

【Exercises】

>>> import matplotlib.pyplot as plt
>>> img = training_data[0][0].reshape(28,28)
>>> plt.imshow(img)
<matplotlib.image.AxesImage object at 0x73f414f0>
>>> plt.show()

Figure 1_079

 

>>> plt.imshow(img, cmap='Greys', interpolation='nearest')
<matplotlib.image.AxesImage object at 0x740a5270>
>>> plt.show()

Figure 1_080
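The two `imshow` calls above can be combined to inspect several digits at once with `plt.subplots`. A sketch (random pixels stand in for `training_data[i][0].reshape(28, 28)` so the snippet is self-contained, and the Agg backend saves to a file name of my choosing, `digits.png`, instead of calling `plt.show()`):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")              # render off-screen; interactively use plt.show()
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# stand-ins for training_data[i][0].reshape(28, 28)
images = [rng.random((28, 28)) for _ in range(6)]

fig, axes = plt.subplots(2, 3, figsize=(6, 4))
for ax, img in zip(axes.flat, images):
    ax.imshow(img, cmap='Greys', interpolation='nearest')
    ax.axis('off')                 # hide the tick marks around each digit
fig.savefig('digits.png')
```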

 

>>> len(training_data[0][0])
784
>>> len(training_data[0][1])
10
>>> net.feedforward(training_data[0][0])
array([[  8.31077720e-05],
       [  5.83811178e-08],
       [  3.52833999e-09],
       [  4.06550775e-05],
       [  1.80642282e-08],
       [  9.98592617e-01],
       [  1.64964140e-09],
       [  4.81507294e-06],
       [  6.47293120e-04],
       [  5.81232092e-07]])
>>> network.np.argmax(net.feedforward(training_data[0][0]))
5
>>>
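What `net.feedforward` computes can be sketched independently of the repository: the activation vector is pushed through each layer as `a = sigmoid(w·a + b)`, and `np.argmax` of the final 10-vector picks the predicted digit, just as in the session above. A self-contained toy version with freshly random weights (so, unlike the trained net, its "prediction" is meaningless):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
sizes = [784, 30, 10]              # same layer sizes as network.Network([784, 30, 10])
biases = [rng.normal(size=(y, 1)) for y in sizes[1:]]
weights = [rng.normal(size=(y, x)) for x, y in zip(sizes[:-1], sizes[1:])]

a = rng.random((784, 1))           # stand-in for one flattened 28x28 MNIST image
for w, b in zip(weights, biases):
    a = sigmoid(w @ a + b)         # propagate through each layer

print(a.shape)                     # (10, 1): one activation per digit class
print(int(np.argmax(a)))           # index of the strongest output = predicted digit
```

The trained network earlier produced an output vector whose fifth entry was near 1 and the rest near 0, which is why `argmax` returned 5 for that training image.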