W!o+'s 《小伶鼬工坊演義》: Neural Networks【FFT】Part One

If someone in the field of 'image processing' claimed not to know the

Fast Fourier transform

The fast Fourier transform (FFT) is an algorithm for computing the discrete Fourier transform (DFT) of a sequence, or its inverse. Fourier analysis converts a signal from its original domain (usually time or space) to a representation in the frequency domain, or the other way around. The FFT computes such transformations rapidly by factorizing the DFT matrix into a product of sparse (mostly zero) factors.[1] As a result, it reduces the complexity of computing the DFT from the $O(n^2)$ required by a direct application of the definition down to $O(n \log n)$, where $n$ is the data size.

The fast Fourier transform is widely used in engineering, science, and mathematics. Its basic ideas only became popular in 1965, but they had been derived as early as 1805.[2] In 1994 Gilbert Strang described the FFT as "the most important numerical algorithm of our lifetime",[3] and it was also included by the IEEE journal Computing in Science & Engineering among the top ten algorithms of the 20th century.[4]

───
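
For reference, the DFT that the FFT accelerates maps a sequence $x_0, x_1, \ldots, x_{n-1}$ to

\[ X_k \;=\; \sum_{m=0}^{n-1} x_m \, e^{-2\pi i \, k m / n}, \qquad k = 0, 1, \ldots, n-1 , \]

and evaluating these $n$ sums directly costs $O(n^2)$ multiplications, while the FFT's recursive factorization of the DFT matrix brings the cost down to $O(n \log n)$.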

 

not knowing it would be almost unthinkable!! Yet if one asks whether 'handwritten Arabic numeral recognition' can be tackled with the handwritten digits in the Spatial Domain, that is truly a 'grand question'??

Let us take a brief look at an image in that 'spatial domain':

>>> import mnist_loader
>>> training_data, validation_data, test_data = \
... mnist_loader.load_data_wrapper()
>>> import network
>>> net = network.Network([784, 30, 10])
>>> # restore the saved weights and biases from swb.npz
>>> npzfile = network.np.load("swb.npz")
>>> net.weights[0] = npzfile["w1"]
>>> net.weights[1] = npzfile["w2"]
>>> net.biases[0] = npzfile["b1"]
>>> net.biases[1] = npzfile["b2"]
>>> import matplotlib.pyplot as plt
>>> # show the first training image in the spatial domain
>>> img = training_data[0][0].reshape(28, 28)
>>> plt.imshow(img, cmap='Greys', interpolation='nearest')
<matplotlib.image.AxesImage object at 0x56e33d0>
>>> plt.show()
>>>

 

【The original image of the 5】

Figure 5

 

 

>>> # 2-D FFT of the image; fftshift moves zero frequency to the centre
>>> f_img = network.np.fft.fft2(img)
>>> sf_img = network.np.fft.fftshift(f_img)
>>> # log-scaled magnitude spectrum (natural log; see the dB note below)
>>> dbf_img = 20*network.np.log(network.np.abs(sf_img))
>>> plt.imshow(dbf_img, cmap='Greys', interpolation='nearest')
<matplotlib.image.AxesImage object at 0x570a150>
>>> plt.show()
>>>

 

【dB spectrum of the FFT of the 5】

Figure 5_fft_db
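
A note on the dB scaling: a magnitude spectrum in decibels is conventionally computed with the base-10 logarithm, and a tiny floor (the 1e-12 here is an arbitrary choice) guards against taking the logarithm of any zero-magnitude bins. Continuing the session above, a variant would be:

>>> dbf_img = 20*network.np.log10(network.np.abs(sf_img) + 1e-12)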

 

 

>>> # the phase (argument) of each frequency component
>>> phase_img = network.np.angle(f_img)
>>> plt.imshow(phase_img, cmap='Greys', interpolation='nearest')
<matplotlib.image.AxesImage object at 0x51bd690>
>>> plt.show()
>>> 

 

【Phase spectrum of the FFT of the 5】

Figure 5_fft_phase

 

 

>>> # inverse-transform the phase array alone, discarding the magnitude
>>> iphase_img = network.np.fft.ifft2(phase_img)
>>> iphase_img_p = network.np.abs(iphase_img)
>>> plt.imshow(iphase_img_p, cmap='Greys', interpolation='nearest')
<matplotlib.image.AxesImage object at 0x51c0d90>
>>> plt.show()
>>>

 

【Reconstruction from the phase spectrum alone】

Figure 5_phase_ifft
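
The transcript above inverse-transforms the raw angle array itself. Another common form of phase-only reconstruction keeps the phase but forces every magnitude to one; continuing the same session, a sketch would be:

>>> unit_spec = network.np.exp(1j * phase_img)
>>> recon = network.np.abs(network.np.fft.ifft2(unit_spec))
>>> plt.imshow(recon, cmap='Greys', interpolation='nearest')
>>> plt.show()

Either way, the shape of the digit survives, because, as the passage quoted further below explains, most of an image's structural information resides in its phase.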

 

Since all of this involves the 'complex number':

The complex numbers extend the real numbers so that every polynomial equation has roots. Among them is the 'imaginary unit' $i$, a square root of $-1$, satisfying $i^2 = -1$. Every complex number can be written as $x + yi$, where $x$ and $y$ are both real numbers, called the 'real part' and the 'imaginary part' of the complex number respectively.

Historically, complex numbers first arose in the formulas for the roots of cubic equations. In mathematics, the word 'complex' indicates that the number field under discussion is the complex numbers, as in complex matrix, complex function, and so on.

───
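
Conveniently, Python supports complex numbers natively (as does NumPy), which is why the spectra above can be manipulated directly. For instance:

>>> z = 3 + 4j                # x + yi with x = 3, y = 4
>>> z.real, z.imag
(3.0, 4.0)
>>> abs(z)                    # magnitude
5.0
>>> import cmath
>>> cmath.phase(z)            # argument, in radians
0.9272952180016122
>>> 1j * 1j                   # the imaginary unit squared is -1
(-1+0j)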

 

How, then, is that 'learning rule' to be established? Those interested may wish to pay a visit here:


Welcome

 

The Computational Intelligence Laboratory (CIL) is doing research in the areas of Complex-Valued Neural Networks and Intelligent Image Processing. The CIL is an integrated part of the College of Science, Technology, Engineering and Mathematics of Texas A&M University-Texarkana.

Our research on Complex-Valued Neural Networks is concentrated on the development of the Multi-Valued Neuron (MVN) and MVN-based neural networks paradigms.

Our research on Intelligent Image Processing is concentrated on applications of MVN-based neural networks in image processing and image recognition.

The Director of the Laboratory is Dr. Igor Aizenberg.

An NSF Grant Recipient in 2009-2012


───

 

Complex-Valued Neurons

Complex-Valued Neural Networks

The primary CIL research area is Complex-Valued Neural Networks (CVNNs), mainly Multi-Valued Neurons and neural networks based on them.

Complex-Valued Neural Networks have become increasingly popular. The use of complex-valued inputs/outputs, weights and activation functions makes it possible to increase the functionality of a single neuron and of a neural network, to improve their performance, and to reduce the training time.
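
To make the idea concrete, here is a minimal sketch of a single discrete multi-valued neuron; the function name mvn_output and the choice k = 8 are illustrative assumptions, not taken from the laboratory's own code:

>>> import numpy as np
>>> def mvn_output(x, w, k=8):
...     # complex weighted sum; w[0] plays the role of the bias
...     z = w[0] + np.dot(w[1:], x)
...     # activation: find which of k angular sectors contains arg(z)
...     theta = np.angle(z) % (2 * np.pi)
...     sector = int(k * theta / (2 * np.pi))
...     # output the corresponding k-th root of unity
...     return np.exp(2j * np.pi * sector / k)
...

Both the inputs x and the weights w are complex-valued, so the neuron's output always lies on the unit circle.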

The history of complex numbers shows that although it took a long time for them to be accepted (almost 300 years from the first reference to “imaginary numbers” by Girolamo Cardano in 1545 to Leonhard Euler’s and Carl Friedrich Gauss’ works published in 1748 and 1831, respectively), they have become an integral part of engineering and mathematics. It is difficult to imagine today how signal processing, aerodynamics, hydrodynamics, energy science, quantum mechanics, circuit analysis, and many other areas of engineering and science could develop without complex numbers. It is a fundamental mathematical fact that complex numbers are a necessary and absolutely natural part of the numerical world. Their necessity clearly follows from the Fundamental Theorem of Algebra, which states that every non-constant single-variable polynomial of degree n with complex coefficients has exactly n complex roots, if each root is counted up to its multiplicity.
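
Stated in symbols, the theorem guarantees that every polynomial $p(z) = a_n z^n + \cdots + a_1 z + a_0$ with complex coefficients and $a_n \neq 0$ factors completely over $\mathbb{C}$ as

\[ p(z) \;=\; a_n \prod_{j=1}^{n} (z - r_j), \]

where $r_1, \ldots, r_n$ are its $n$ complex roots, listed with multiplicity.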

Answering a question frequently asked by some “conservative” people (what does one gain by using complex-valued neural networks, at the price of “twice as many” parameters, more computations, etc.), we may say that one gains the same thing the Fourier transform offers over the merely real-valued Walsh transform in signal processing. There are many engineering problems in the modern world where complex-valued signals and functions of complex variables are involved and where they are unavoidable. Thus, to employ neural networks for their analysis, approximation, etc., the use of complex-valued neural networks is natural. However, even in the analysis of real-valued signals (for example, images or audio signals) one of the most frequently used approaches is frequency-domain analysis, which immediately leads us to the complex domain. In fact, analyzing signal properties in the frequency domain, we see that each signal is characterized by magnitude and phase, which carry different information about the signal. This fundamental fact was demonstrated in depth by A.V. Oppenheim and J.S. Lim in their paper “The importance of phase in signals”, Proceedings of the IEEE, vol. 69, no. 5, 1981, pp. 529-541. They showed that the phase in the Fourier spectrum of a signal is much more informative than the magnitude: in particular, in the Fourier spectrum of an image, the phase alone contains the information about the shapes, edges, and orientation of all the objects.

This property can be illustrated by the following example. Let us consider two popular test images, “Lena” and “Bridge”.

 
Lena
 
Bridge

 

Let us take their Fourier transforms and then swap the magnitude and phase of their Fourier spectra, combining the phase of “Lena” with the magnitude of “Bridge” and vice versa. After taking the inverse Fourier transform, we clearly see that each restored image is the one whose phase was used, not the one that supplied the magnitude:

 
Restored from Lena phase + Bridge magnitude
 
Restored from Bridge phase + Lena magnitude

 

Thus, in fact, the phase contains the information about what is represented by the corresponding signal. To use this information properly, the most appropriate solution is to move to the complex domain. Hence, one of the most important characteristics of Complex-Valued Neural Networks is the proper treatment of amplitude and phase information, e.g., the treatment of wave-related phenomena such as electromagnetism, light waves, quantum waves, and oscillatory phenomena.

───
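
The magnitude-phase swap described above takes only a few lines of NumPy; a minimal sketch, assuming lena and bridge are two equal-sized grayscale arrays:

>>> import numpy as np
>>> def swap_phase(img_mag, img_phase):
...     # keep the Fourier magnitude of img_mag, but the phase of img_phase
...     F_mag = np.fft.fft2(img_mag)
...     F_ph = np.fft.fft2(img_phase)
...     mixed = np.abs(F_mag) * np.exp(1j * np.angle(F_ph))
...     return np.real(np.fft.ifft2(mixed))
...
>>> # e.g. swap_phase(bridge, lena) should resemble Lena, and vice versa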

 

One may also read

Real vs. Complex

https://www.elen.ucl.ac.be/Proceedings/esann/esannpdf/es2011-42.pdf

 

and gain a bit more understanding!!??