Education and Learning: Up《grade》(II)

In the end, though, we had to give up the PIXEL Desktop, and not because HDMI audio produced no sound.

 

The main reason is that TensorFlow ran into trouble!

(tensorflow) pi@raspberry:~ python3
Python 3.5.3 (default, Jan 19 2017, 14:11:04)
[GCC 6.3.0 20170118] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/pi/tensorflow/lib/python3.5/site-packages/tensorflow/__init__.py", line 23, in <module>
    from tensorflow.python import *
  File "/home/pi/tensorflow/lib/python3.5/site-packages/tensorflow/python/__init__.py", line 49, in <module>
    from tensorflow.python import pywrap_tensorflow
  File "/home/pi/tensorflow/lib/python3.5/site-packages/tensorflow/python/pywrap_tensorflow.py", line 28, in <module>
    _pywrap_tensorflow = swig_import_helper()
  File "/home/pi/tensorflow/lib/python3.5/site-packages/tensorflow/python/pywrap_tensorflow.py", line 24, in swig_import_helper
    _mod = imp.load_module('_pywrap_tensorflow', fp, pathname, description)
  File "/home/pi/tensorflow/lib/python3.5/imp.py", line 242, in load_module
    return load_dynamic(name, filename, file)
  File "/home/pi/tensorflow/lib/python3.5/imp.py", line 342, in load_dynamic
    return _load(spec)
ImportError: /home/pi/tensorflow/lib/python3.5/site-packages/tensorflow/python/_pywrap_tensorflow.so: cannot open shared object file: No such file or directory
>>>
(tensorflow) pi@raspberry:~ cd
(tensorflow) pi@raspberry:~ deactivate
pi@raspberry:~
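The traceback boils down to one thing: the compiled extension `_pywrap_tensorflow.so` is missing from the installed package. As a quick sketch of how one might check whether a module's backing file actually exists on disk before blaming anything else — the function name here is our own, not part of any library:

```python
import importlib.util
import os

def module_file_exists(name):
    """Return True if the module can be located and its backing file is on disk."""
    spec = importlib.util.find_spec(name)
    if spec is None or spec.origin is None:
        return False
    # built-in modules have no file; everything else should exist on disk
    return spec.origin == "built-in" or os.path.exists(spec.origin)

# a healthy stdlib module passes; a broken install like the one above would not
print(module_file_exists("json"))
print(module_file_exists("no_such_module_xyz_123"))
```

Had such a check been run inside the virtualenv, it would have pointed straight at the missing shared object rather than at TensorFlow's Python-level imports.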

 

Even after the Up(grade), the 4 GB of RAM and seventy-five-fold CPU power sit there in vain:

pi@raspberrypi:~ dmesg | grep BogoMIPS
[    0.000333] Calibrating delay loop (skipped), value calculated using timer frequency.. 38.40 BogoMIPS (lpj=192000)
[    0.052227] SMP: Total of 4 processors activated (153.60 BogoMIPS).
pi@raspberrypi:~

 

up@up-UP-CHT01:~ sudo dmesg | grep BogoMIPS
[sudo] password for up:
[    0.000043] Calibrating delay loop (skipped), value calculated using timer frequency.. 2880.00 BogoMIPS (lpj=5760000)
[    0.133110] smpboot: Total of 4 processors activated (11520.00 BogoMIPS)
up@up-UP-CHT01:~
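The "seventy-five times" figure comes straight from the two BogoMIPS totals in the dmesg logs above — bearing in mind that BogoMIPS is only the kernel's crude busy-loop calibration number, not a real benchmark:

```python
# SMP BogoMIPS totals reported by dmesg on each board (see the logs above)
raspberry_pi_bogomips = 153.60   # 4 cores, Raspberry Pi
up_board_bogomips = 11520.00     # 4 cores, UP Board (UP-CHT01)

ratio = up_board_bogomips / raspberry_pi_bogomips
print(ratio)  # 75.0
```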

 

For AI education, too, is there no learning ground?

Hence the worry: will this work or won't it?

As the saying goes: 人心不足蛇吞象 — "a discontented heart is a snake trying to swallow an elephant" — an expression for excessive greed. According to the Wikipedia entry, the phrase traces back to "the ba-snake swallows an elephant" (巴蛇食象) in the Classic of Mountains and Seas (《山海經》):

Shanhaijing Jiaozhu (Collated Annotations to the Shanhaijing) · Classic of Regions Within the Seas: South

  (Shanhaijing, Book 10 · New Explications of the Sea Classics, Volume 5)

15. The ba-snake swallows elephants and disgorges their bones after three years; a gentleman who takes the bones will suffer no ailment of heart or belly①. As a snake it is green, yellow, red, and black②. One account says it is a black snake with a green head③, dwelling west of the rhinoceros.

① Guo Pu comments: "Nowadays in the south the (虫丹) snake (the Daozang edition writes 蟒蛇, python — Ke) swallows deer; once the deer has rotted inside it, the snake winds itself about a tree, and the bones all work their way out between its scales. This creature is of that kind. The Chuci asks: 'There is a snake that swallows elephants — how great must it be?' Commentators say it is a thousand xun long." Hao Yixing notes: "The present text of the 'Heavenly Questions' in the Chuci reads 'a snake swallows an elephant,' differing from Guo's citation; Wang Yi's commentary quotes this classic as 'the numinous snake swallows an elephant,' which also differs from the current text." Ke's note: the "Ben Jing" chapter of the Huainanzi says: "Yi cut the long snake in two at Dongting." The Lu Shi, Later Records 10, writes "long snake" as 長它, and Luo Ping's commentary says: "長它 is the so-called ba-snake, found between the Yangtze and the Yue mountains. Its tomb is today's Ba Hill of Baling, beside the prefectural seat. The Jiangyuan Ji (i.e. the Jiang Ji, by Yu Zhongyong of the Liu-Song — Ke) says: 'Yi slaughtered the ba-snake at Dongting; its bones rose like a hill, hence the name Baling.'" The Yueyang Fengtu Ji (by Fan Zhiming of the Song) likewise says: "Today the ba-snake □ beside the prefectural court hall, towering high and overgrown with grass and trees; there is also a Ba-Snake Temple inside the Yueyang Gate." It further says: "Elephant-Bone Mountain: the Shanhaijing says 'the ba-snake swallows an elephant,' and its bones were exposed here; the lake beside the mountain is called Elephant-Bone Harbor." All these are legends spun out of this classic and the Huainanzi; yet since there are a tomb and a temple, a mountain and a harbor, spoken of so concretely, we know the tale has long circulated among the people.

② Ke's note: this means its markings are brilliantly variegated.

③ Ke's note: the "Classic of Regions Within the Seas" says: "There is Mount Basui, from which the Mian River issues. There is also the state of Zhujuan. There is a black snake, green-headed, that eats elephants" — that is this creature. 巴 (ba) in small-seal script is written □; the Shuowen, chapter 14, says: "A serpent; some say, the snake that eats elephants. A pictograph." What the graph depicts, then, is the bulging shape of a thing in a snake's belly. The Shanhaijing often speaks of great snakes: the "Classic of Northern Mountains" says, "On Mount Daxian there is a snake called the long snake, with bristles like a boar's and a sound like drums and watchmen's clappers"; the "Third Classic of the North" says, "On Mount Chunyuwufeng there is a great snake, red-headed and white-bodied, with a voice like an ox; when it appears, its district suffers great drought." Such creatures could well "swallow an elephant." The Shuijingzhu, on the Yeyu River, says: "The mountains abound in great snakes, called ran-snakes, ten zhang long and seven or eight chi around, which lie in the trees waiting for deer; when a deer passes, the snake lowers its head and coils about it. Presently the deer dies; the snake first wets it through, then swallows it, and the horns and bones all pierce out through its skin. When the mountain tribesmen see the snake lying still, they pin it from head to tail with great bamboo skewers, kill it, and eat it as a delicacy." This is the (虫丹) snake of Guo's note.

───

 

Yet judging from the Shanhaijing's own words — "a gentleman who takes the bones will suffer no ailment of heart or belly" — there is no hint of greed at all! Guo Pu took the ba-snake to be a kind of python (蟒蛇). No wonder, then, that TensorFlow can recognize handwritten Arabic digits in a program of under twenty lines:

MNIST For ML Beginners

This tutorial is intended for readers who are new to both machine learning and TensorFlow. If you already know what MNIST is, and what softmax (multinomial logistic) regression is, you might prefer this faster paced tutorial. Be sure to install TensorFlow before starting either tutorial.

When one learns how to program, there’s a tradition that the first thing you do is print “Hello World.” Just like programming has Hello World, machine learning has MNIST.

MNIST is a simple computer vision dataset. It consists of images of handwritten digits like these:

It also includes labels for each image, telling us which digit it is. For example, the labels for the above images are 5, 0, 4, and 1.

In this tutorial, we’re going to train a model to look at images and predict what digits they are. Our goal isn’t to train a really elaborate model that achieves state-of-the-art performance — although we’ll give you code to do that later! — but rather to dip a toe into using TensorFlow. As such, we’re going to start with a very simple model, called a Softmax Regression.
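Before the tutorial's code uses `tf.nn.softmax`, it may help to see what softmax itself does — a minimal plain-Python sketch, with made-up evidence scores standing in for the model's outputs:

```python
import math

def softmax(scores):
    """Exponentiate each score and normalize, so the outputs are
    all positive and sum to 1 (a probability distribution)."""
    m = max(scores)  # subtracting the max is a standard numerical-stability trick
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# made-up evidence scores for the 10 digit classes
probs = softmax([2.0, 1.0, 0.1, 0, 0, 0, 0, 0, 0, 0])
print(round(sum(probs), 6))     # 1.0
print(probs.index(max(probs)))  # 0 -- the class with the highest score wins
```

The largest score always maps to the largest probability, which is why `tf.argmax(y, 1)` later recovers the predicted digit.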

The actual code for this tutorial is very short, and all the interesting stuff happens in just three lines. However, it is very important to understand the ideas behind it: both how TensorFlow works and the core machine learning concepts. Because of this, we are going to very carefully work through the code.

……

 

# The MNIST Data
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Implementing the Regression
import tensorflow as tf

# x: a batch of flattened 28x28 images, 784 pixels each
x = tf.placeholder(tf.float32, [None, 784])

# W, b: weights and biases of the softmax regression, initialized to zero
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# y: predicted probability distribution over the 10 digit classes
y = tf.nn.softmax(tf.matmul(x, W) + b)

# Training
# y_: the one-hot ground-truth labels
y_ = tf.placeholder(tf.float32, [None, 10])

cross_entropy = -tf.reduce_sum(y_*tf.log(y))

train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

# (in later TensorFlow 1.x releases this became tf.global_variables_initializer())
init = tf.initialize_all_variables()

sess = tf.Session()
sess.run(init)

# 1000 steps of stochastic gradient descent on mini-batches of 100 images
for i in range(1000):
  batch_xs, batch_ys = mnist.train.next_batch(100)
  sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

# Evaluating Our Model
# a prediction counts as correct when the most probable class matches the label
correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print(sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels}))
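The only loss in the script above is the cross-entropy, `-Σ y_·log(y)`. A tiny plain-Python illustration — the two predicted distributions are made-up numbers — shows why minimizing it rewards confident, correct predictions:

```python
import math

def cross_entropy(y_true, y_pred):
    """-sum(y_true * log(y_pred)); y_true is one-hot, y_pred a probability vector."""
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred) if t > 0)

label = [0, 1, 0]               # one-hot: the true class is index 1
confident = [0.05, 0.90, 0.05]  # a good prediction: low loss
hesitant  = [0.30, 0.40, 0.30]  # a poor prediction: higher loss

print(cross_entropy(label, confident) < cross_entropy(label, hesitant))  # True
```

Gradient descent pushes `W` and `b` toward the "confident" case, which is all the 91% figure below reflects.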



 

This should be about 91%.

Is that good? Well, not really. In fact, it’s pretty bad. This is because we’re using a very simple model. With some small changes, we can get to 97%. The best models can get to over 99.7% accuracy! (For more information, have a look at this list of results.)

What matters is that we learned from this model. Still, if you’re feeling a bit down about these results, check out the next tutorial where we do a lot better, and learn how to build more sophisticated models using TensorFlow!

─── Excerpted from 《W!o+ 的《小伶鼬工坊演義》︰巴蛇食象》 (The Ba-Snake Swallows an Elephant)