W!o+ 的《小伶鼬工坊演義》︰神經網絡【hyper-parameters】四

今天又是『五四』了。不知那位『德』先生曾否來過?這位『賽』先生可曾長住??卻見世界烽煙不斷!『人道精神』正慾火鍛鍊中!!想起

《論語》‧學而

子貢曰:貧而無諂,富而無驕,何如?

子曰:可也。未若貧而樂,富而好禮者也。

子貢曰:《詩》云:『如切如磋,如琢如磨。』其斯之謂與?

子曰:賜也,始可與言詩已矣!告諸往而知來者。

,感嘆『貪、嗔、痴』果是『娑婆世界』之現象耶??!!

於此篇章之末,與其講 Michael Nielsen 先生做了個『總結』︰

Toward deep learning

While our neural network gives impressive performance, that performance is somewhat mysterious. The weights and biases in the network were discovered automatically. And that means we don’t immediately have an explanation of how the network does what it does. Can we find some way to understand the principles by which our network is classifying handwritten digits? And, given such principles, can we do better?

To put these questions more starkly, suppose that a few decades hence neural networks lead to artificial intelligence (AI). Will we understand how such intelligent networks work? Perhaps the networks will be opaque to us, with weights and biases we don’t understand, because they’ve been learned automatically. In the early days of AI research people hoped that the effort to build an AI would also help us understand the principles behind intelligence and, maybe, the functioning of the human brain. But perhaps the outcome will be that we end up understanding neither the brain nor how artificial intelligence works!

To address these questions, let’s think back to the interpretation of artificial neurons that I gave at the start of the chapter, as a means of weighing evidence. Suppose we want to determine whether an image shows a human face or not:

[Example images omitted. Credits: 1. Ester Inbar. 2. Unknown. 3. NASA, ESA, G. Illingworth, D. Magee, and P. Oesch (University of California, Santa Cruz), R. Bouwens (Leiden University), and the HUDF09 Team.]
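上面「以神經元權衡證據」判斷是否人臉的講法,或可用下面這個極簡的 Python 草稿來示意;其中的子問題名稱、權重與偏置都是假設的示例值,並非 Nielsen 原書的程式︰

# 極簡示意:單一神經元以「加權證據 + 偏置」回答「這是不是一張人臉」。
# 子問題、權重、偏置皆為假設的示例,僅說明「權衡證據」的概念。
def face_neuron(evidence, weights, bias):
    total = sum(weights[q] * evidence[q] for q in weights) + bias
    return 1 if total > 0 else 0   # 超過門檻即判定「是人臉」

# 假設這些子問題已由更下層的子網絡回答(1 表示「有」,0 表示「沒有」):
evidence = {"has_eye": 1, "has_nose": 1, "has_mouth": 0}
weights  = {"has_eye": 4.0, "has_nose": 2.0, "has_mouth": 2.0}
bias     = -5.0   # 相當於門檻 5

print(face_neuron(evidence, weights, bias))   # 此例輸出 1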

……

The end result is a network which breaks down a very complicated question – does this image show a face or not – into very simple questions answerable at the level of single pixels. It does this through a series of many layers, with early layers answering very simple and specific questions about the input image, and later layers building up a hierarchy of ever more complex and abstract concepts. Networks with this kind of many-layer structure – two or more hidden layers – are called deep neural networks.
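這裡「兩個以上隱藏層、層層堆出愈來愈抽象概念」的結構,也許可以用一段純 numpy 的前向傳播草稿來表達;層的大小與隨機權重都是假設,只為點出「深」指的是隱藏層數 ≥ 2,並非原書之實作︰

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 假設的深度網絡:784 個輸入像素、三個隱藏層、10 個輸出。
sizes = [784, 100, 60, 30, 10]

rng = np.random.default_rng(0)
weights = [rng.standard_normal((y, x)) for x, y in zip(sizes[:-1], sizes[1:])]
biases  = [rng.standard_normal((y, 1)) for y in sizes[1:]]

def feedforward(a):
    # 前面的層回答簡單而具體的問題,後面的層把它們組合成抽象概念。
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a

x = rng.standard_normal((784, 1))   # 假設的一張已展平影像
print(feedforward(x).shape)         # (10, 1)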

Of course, I haven’t said how to do this recursive decomposition into sub-networks. It certainly isn’t practical to hand-design the weights and biases in the network. Instead, we’d like to use learning algorithms so that the network can automatically learn the weights and biases – and thus, the hierarchy of concepts – from training data. Researchers in the 1980s and 1990s tried using stochastic gradient descent and backpropagation to train deep networks. Unfortunately, except for a few special architectures, they didn’t have much luck. The networks would learn, but very slowly, and in practice often too slowly to be useful.

Since 2006, a set of techniques has been developed that enable learning in deep neural nets. These deep learning techniques are based on stochastic gradient descent and backpropagation, but also introduce new ideas. These techniques have enabled much deeper (and larger) networks to be trained – people now routinely train networks with 5 to 10 hidden layers. And, it turns out that these perform far better on many problems than shallow neural networks, i.e., networks with just a single hidden layer. The reason, of course, is the ability of deep nets to build up a complex hierarchy of concepts. It’s a bit like the way conventional programming languages use modular design and ideas about abstraction to enable the creation of complex computer programs. Comparing a deep network to a shallow network is a bit like comparing a programming language with the ability to make function calls to a stripped down language with no ability to make such calls. Abstraction takes a different form in neural networks than it does in conventional programming, but it’s just as important.
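若想拿本系列前面章節的 network2.py 對照「淺」與「深」,大概可以這樣寫;以下假設 mnist_loader.py 與 network2.py(原書的程式)已在路徑上,層數與超參數也只是示例,並非作者指定的設定︰

import mnist_loader
import network2

training_data, validation_data, test_data = mnist_loader.load_data_wrapper()

# 「淺」網絡:單一隱藏層。
shallow = network2.Network([784, 30, 10], cost=network2.CrossEntropyCost)

# 「深」網絡:多個隱藏層,讓概念得以層層堆疊。
deep = network2.Network([784, 30, 30, 30, 10], cost=network2.CrossEntropyCost)

# 兩者同樣以隨機梯度下降加反向傳播訓練;超參數僅為示例。
for net in (shallow, deep):
    net.SGD(training_data, 30, 10, 0.1,
            lmbda=5.0,
            evaluation_data=validation_data,
            monitor_evaluation_accuracy=True)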

───

 

不如說祇是個『勸學篇』︰

訊 ︰☿ 把酒飛斝是同道,欲法荀子《勸學篇》趁年少︰

君子曰:學不可以已。青,取之於藍而青於藍;冰,水為之而寒於水。以喻學則才過其本性也。木直中繩,輮以為輪,其曲中規,雖有槁暴,不復挺者,輮使之然也。輮,屈。槁,枯。曓,乾。挻,宜也。《晏子春秋》作「不復贏也」。故木受繩則直,金就礪則利,君子博學而日參省乎己,則知明而行無過矣。參,三也。曾子曰︰「日三省吾身。」知,讀爲智。行,下孟反。故不登高山,不知天之高也;不臨深谿,不知地之厚也;不聞先王之遺言,不知學問之大也。大,謂有益於人。干、越、夷、貉之子,生而同聲,長而異俗,教使之然也。干、越,猶言吳、越。《呂氏春秋》「荊有次非得寶劍於干、越」,高誘曰︰「吳邑也。」貉,東北夷。同聲,謂啼聲同。貉,莫革反。《詩》曰:「嗟爾君子,無恆安息。靖共爾位,好是正直。神之聽之,介爾景福。」《詩》,《小雅‧小明》之篇。靖,謀。介,助。景,大也。無恆安息,戒之不使懷安也。言能謀恭其位,好正宜之道,則神聽而助之福,引此詩以喻勤學也。神莫大於化道,福莫長於無禍。爲學則自化道,故神莫大焉。修身則自無禍,故福莫長焉。吾嘗終日而思矣,不如須臾之所學也,吾嘗跂而望矣,不如登高之博見也 。跂,舉足也。登高而招,臂非加長也,而見者遠;順風而呼,聲非加疾也,而聞者彰。假輿馬者,非利足也,而致千里;假舟楫者,非能水也,而絕江河。能,善。絶,過。君子生非異也,善假於物也。皆以喻修身在假於學。生非異,言與衆人同也。南方有鳥焉,名曰蒙鳩,以羽為巢而編之以髮,繫之葦苕,風至苕折,卵破子死。巢非不完也 ,所繫者然也。蒙 鳩,鷦鷯也。苕,葦之秀也,今巧婦鳥之巢至精密,多繫於葦竹之上是也。「蒙」當爲「蔑」。《方言》雲︰「鷦鷯,關而西謂之桑飛,或謂之蔑雀。」或曰︰一名 蒙鳩,亦以其愚也。言人不知學問,其所置身亦猶繫葦之危也。《說苑》︰「客謂孟嘗君曰︰『鷦鷯巢於葦苕,箸之髮毛,可謂完堅矣,大風至則苕折卵破子死者何 也?其所託者然也。』西方有木焉,名曰射干,莖長四寸,生於高山之上,而臨百仞之淵,木莖非能長也,所立者然也 。《本草》藥名有射干,一名烏扇。陶弘景雲︰「花白莖長,如射人之執竿。」又引阮公詩云「射干臨層城」,是生於高處也。據《本草》在《草部》中,又生南陽川穀,此雲「西方有木」,未詳。或曰︰「長四寸」卽是草,雲木,誤也。蓋生南陽,亦生西方也。射音夜。蓬生麻中,不扶而直。蘭槐之根是爲芷。其漸之滫,君子不近,庶人不服,其質非不美也,所漸者然也。蘭槐,香草,其根是爲芷也。《本草》︰「白芷一名白茝。」陶弘景雲︰「卽《離騷》所謂蘭茝也。」葢苗名蘭茝,根名芷也。弱槐當是蘭茝別名,故云「蘭槐之根是爲芷」也。漸,漬也,染也。滫,溺也。言雖香草,浸漬於溺中,則可惡也。漸,子廉反。滫,思酒反。故君子居必擇鄉,遊必就士,所以防邪僻而近中正也。物類之起,必有所始。榮辱之來,必象其德。肉腐出蟲,魚枯生蠹。怠慢忘身,禍災乃作。強自取柱,柔自取束。凡物強則以爲柱而任勞,柔則見束而約急,皆其自取也。邪穢在身,怨之所構。構,結也。言亦所自取。施薪若一,火就燥也;布薪於地,均若一,火就燥而焚之矣。平地若一,水就溼也。草木疇生,禽獸羣焉,物各從其類也。疇與儔同,類也。是故質的張而弓矢至焉,林木茂而斧斤至焉,所謂召禍也。質,射矦。的,正鵠也。樹成蔭而衆鳥息焉,醯酸而蜹聚焉。喻有德則慕之者衆。故言有召禍也,行有招辱也,君子慎其所立乎!禍福如此,不可不慎所立。所立,卽謂學也。

積土成山,風雨興焉;積水成淵,蛟龍生焉;積善成德,而神明自得,聖心備焉。神明自得,謂自通於神明。故不積蹞步,無以千里;半步曰蹞。蹞與跬同。不積小流,無以成江海。騏驥一躍,不能十步;駑馬十駕,言駑馬十度引車,則亦及騏驥之一躍。據下雲「駑馬十駕,則亦及之」,此亦當同,疑脫一句。功在不舍。鍥而舍之,朽木不折;鍥而不舍 ,金石可鏤。言立功在於不舍。舍與捨同。鍥,刻也,苦結反。《春秋傳》曰「陽虎借邑人之車,鍥其軸」也。螾無爪牙之利,筋骨之強,上食埃土,下飲黃泉,用心一也。螾與蚓同,蚯蚓也。蟹八跪而二螯,非虵蟺之穴無可寄託者,用心躁也。跪,足也。《韓子》以刖足爲刖跪。螫,蟹首上如鉞者。許叔重《說文》雲「蟹六足二螫」也。是故無冥冥之志者無昭昭之明 ,無惛惛之事者無赫赫之功。冥冥、惛惛,皆專默精誠之謂也。行衢道者不至,事兩君者不容。《爾雅》雲︰「四達謂之衢。」孫炎雲︰「衢,交道四出也。」或曰︰衢道,兩道也。不至,不能有所至。下篇有「楊朱哭衢塗」。今秦俗猶以兩爲衢,古之遺言歟?目不能兩視而明,耳不能兩聽而聰。螣蛇無足而飛,《爾雅》云:「螣,螣蛇。」郭璞雲「龍類,能興雲霧而遊其中」也。梧鼠五技而窮。「梧鼠」當爲「鼫鼠」,蓋本誤爲「鼯」字,傳寫又誤爲「梧」耳。技,才能也。言技能雖多而不能如螣蛇專一,故窮。五技,謂能飛不能上屋,能緣不能窮木,能游不能渡谷,能穴不能掩身,能走不能先人。《詩》曰 :「屍鳩在桑,其子七兮。淑人君子,其儀一兮。其儀一兮,心如結兮。」故君子結於一也。《詩》,《曹風‧屍鳩》之篇。毛雲︰「屍鳩,鴶鞠也。屍鳩之養七子,旦從上而下,暮從下而上,平均如一。善人君子,其執義亦當如屍鳩之一。執義一則用心堅固。」故曰「心如結」也。

─── 摘自《M♪o 之學習筆記本》《編者跋》

 

而那個『應用之道』尚待『切磋琢磨』乎!!??

屠龍刀


《論語》‧陽貨

子之武城,聞弦歌之聲。夫子莞爾而笑,曰:「割雞焉用牛刀?」子游對曰:「昔者偃也聞諸夫子曰:『君子學道則愛人,小人學道則易使也。』」子曰:「二三子!偃之言是也。前言戲之耳。」

所謂『相由心生』是說精神外顯的『形貌』從『用心方向』而來,這個『習焉不察』之內在『心相』,常可以用來分辨『行業』。一行有一行的規矩,百業有百業的訣竅,入了行,從了業,自然帶有某種『氣息』的吧!如何才能夠不著『相』?若可『無所住』而生其『心』,那麼既無『我心』何來『我相』的呢!!

那麼這個《子之武城》一事,是否有個『前言』對上『後語』,可分出『對錯好壞』的呢?也許有個『禮樂』之『理』和『禮樂』之『用』的差別,想那『子游』為武城宰,採用『禮樂』教化之道,孔夫子卻『莞爾』笑,豈有不『以子之言,擊子之語』的哩!然而夫子所謂『戲之』果真是說『割雞焉用牛刀?』是錯了嗎?恐是不樂見『禮樂』被當作了『名器』的吧!就像到了宋代的『存天理,去人欲』,導致『死生事小,失節事大』,終演成『禮教殺人』之憾事!!於是

祇『』這樣『』,不『』那樣『使』,終究難了『用大』之道 ── 無用而不通達 ── ,如如不動,應事而動,因事制宜。

正說著『以正治國』和『以奇用兵』,『為學之法』與『用學之法』的不同,也須避免那『紙上談兵』之過。此事《孫子兵法》

地形‧第十

孫子曰:地形有通者、有掛者、有支者、有隘者、有險者、有遠者。我可以往,彼可以來,曰通。通形者,先居高陽,利糧道,以戰則利。可以往,難以返,曰掛。掛形者,敵無備,出而勝之,敵若有備,出而不勝,難以返,不利。我出而不利,彼出而不利,曰支。支形者,敵雖利我,我無出也,引而去之,令敵半出而擊之,利。隘形者,我先居之,必盈之以待敵。若敵先居之,盈而勿從,不盈而從之。險形者,我先居之,必居高陽以待敵;若敵先居之,引而去之,勿從也。遠形者,勢均,難以挑戰,戰而不利。凡此六者,地之道也,將之至任,不可不察也。

故兵有走者、有馳者、有陷者、有崩者、有亂者、有北者。凡此六者,非天之災,將之過也。夫勢均,以一擊十,曰走;卒強吏弱,曰馳;吏強卒弱,曰陷;大吏怒而不服,遇敵懟而自戰,將不知其能,曰崩;將弱不嚴,教道不明,吏卒無常,陳兵縱橫,曰亂;將不能料敵,以少合衆,以弱擊強,兵無選鋒,曰北。凡此六者,敗之道也,將之至任,不可不察也。

夫地形者,兵之助也。料敵制勝,計險厄遠近,上將之道也。知此而用戰者必勝,不知此而用戰者必敗。故戰道必勝,主曰無戰,必戰可也;戰道不勝,主曰必戰,無戰可也。故進不求名,退不避罪,唯民是保,而利合於主,國之寶也。

視卒如嬰兒,故可以與之赴深溪;視卒如愛子,故可與之俱死。厚而不能使,愛而不能令,亂而不能治,譬若驕子,不可用也。

知吾卒之可以擊,而不知敵之不可擊,勝之半也;知敵之可擊,而不知吾卒之不可以擊,勝之半也;知敵之可擊,知吾卒之可以擊,而不知地形之不可以戰,勝之半也。故知兵者,動而不迷,舉而不窮。故曰:知彼知己,勝乃不殆;知天知地,勝乃可全。

講得好。

─── 摘自《字詞網絡︰ WordNet 《六》 相 □ 而用 ○ !!》