
FOANN: Fractional Order Artificial Neural Networks



FOANN: Fractional Order Artificial Neural Networks

An Image/Link below is provided (as is) to download presentation Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author. Content is provided to you AS IS for your information and personal use only. Download presentation by click this link. While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server. During download, if you can't get a presentation, the file might be deleted by the publisher.

E N D

Presentation Transcript


  1. FOANN: Fractional Order Artificial Neural Networks YangQuan Chen Center for Self-Organizing and Intelligent Systems (CSOIS), Dept. of Electrical and Computer Engineering Utah State University E: yqchen@ieee.org; T: 1(435)797-0148; F: 1(435)797-3054 Wednesday, Feb. 06, 2008 – 4:00-4:30 PM SIG FOC Weekly Meeting http://mechatronics.ece.usu.edu/foc/yan.li/

  2. ANN – “Glorified static/dynamic data fitting” – YangQuan Chen http://en.wikipedia.org/wiki/Artificial_neural_network “… glorified table look-up …” – from Thomas H. Kerr, “Critique of Some Neural Network Architectures and Claims for Control and Estimation,” IEEE Transactions on Aerospace and Electronic Systems, vol. 34, no. 2, April 1998, pp. 406-419. http://ieeexplore.ieee.org/iel4/7/14739/00670323.pdf SIG FOC Weekly Meeting

  3. Dynamic ANN • http://www.seeingwithsound.com/thesis.htm# • Ph.D. thesis of Peter B.L. Meijer,  ``Neural Network Applications in Device and Subcircuit Modelling for Circuit Simulation'' (1.2MB PDF file). This thesis generalizes the multilayer perceptron networks and the associated backpropagation algorithm for analogue modeling of continuous and dynamic nonlinear multidimensional systems for simulation, using variable time step discretizations of continuous-time systems of coupled differential equations. A major advantage over conventional discrete-time recurrent neural networks with fixed time steps, as well as Kalman filters and time-delay neural network (TDNN) models with fixed time steps, is that the distribution of time steps is now arbitrary, allowing for smaller time steps during steep signal transitions for much better trade-offs between accuracy and CPU time, while there is also still freedom in the choice of time steps after the neural network model has been generated. In fact, multirate methods for solving differential equations can be readily applied. The use of second order differential equations for each neuron allows for complex oscillatory behaviours even in feedforward networks, while allowing for efficient mappings of differential-algebraic equations (DAEs) to a general neural network formalism. The resulting formalism represents a wide class of nonlinear and dynamic systems, including arbitrary nonlinear static systems, arbitrary quasi-static systems, and arbitrary lumped linear dynamical systems. SIG FOC Weekly Meeting
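
  To make the variable-time-step idea concrete, here is a minimal sketch, not Meijer's actual formalism: a single neuron modeled as a second-order ODE driven through a static nonlinearity, integrated with an adaptive-step solver so that the steps shrink automatically during the steep input transition. The time constants, weight, bias, and input signal are made-up values for illustration only.

```python
# Sketch: one "dynamic neuron" as a second-order ODE, integrated with an
# adaptive-step solver so the time steps shrink during fast transients.
# Illustration of the idea only, not Meijer's formalism; tau1, tau2, w, b
# and the input u(t) are invented values.
import numpy as np
from scipy.integrate import solve_ivp

tau1, tau2 = 0.05, 0.002      # first- and second-order time constants (illustrative)
w, b = 2.0, -0.5              # synaptic weight and bias (illustrative)

def u(t):
    """Input signal: a step at t = 0.1 s."""
    return 1.0 if t >= 0.1 else 0.0

def neuron(t, state):
    """y'' written as a first-order system: state = [y, y']."""
    y, ydot = state
    target = np.tanh(w * u(t) + b)              # static nonlinearity of the neuron
    yddot = (target - y - tau1 * ydot) / tau2   # second-order relaxation toward it
    return [ydot, yddot]

sol = solve_ivp(neuron, (0.0, 0.5), [0.0, 0.0], rtol=1e-6, atol=1e-9)
print(f"{len(sol.t)} adaptive steps; smallest step {np.min(np.diff(sol.t)):.2e} s")
```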

  4. http://www.mathworks.com/products/neuralnet/description3.html Network Architectures
  • Supervised Networks: Supervised neural networks are trained to produce desired outputs in response to sample inputs, making them particularly well suited to modeling and controlling dynamic systems, classifying noisy data, and predicting future events.
  • Feedforward networks have one-way connections from input to output layers. They are most commonly used for prediction, pattern recognition, and nonlinear function fitting. Supported feedforward networks include feedforward backpropagation, cascade-forward backpropagation, feedforward input-delay backpropagation, linear, and perceptron networks.
  • Radial basis networks provide an alternative, fast method for designing nonlinear feedforward networks. Supported variations include generalized regression and probabilistic neural networks.
  • Dynamic networks use memory and recurrent feedback connections to recognize spatial and temporal patterns in data. They are commonly used for time-series prediction, nonlinear dynamic system modeling, and control system applications. Prebuilt dynamic networks in the toolbox include focused and distributed time-delay, nonlinear autoregressive (NARX), layer-recurrent, Elman, and Hopfield networks. The toolbox also supports dynamic training of custom networks with arbitrary connections.
  • Learning vector quantization (LVQ) is a powerful method for classifying patterns that are not linearly separable. LVQ lets you specify class boundaries and the granularity of classification.
  • Unsupervised Networks: Unsupervised neural networks are trained by letting the network continually adjust itself to new inputs. They find relationships within data and can automatically define classification schemes.
  • The Neural Network Toolbox supports two types of self-organizing, unsupervised networks: competitive layers and self-organizing maps.
  • Competitive layers recognize and group similar input vectors. By using these groups, the network automatically sorts the inputs into categories.
  • Self-organizing maps learn to classify input vectors according to similarity. Unlike competitive layers, they also preserve the topology of the input vectors, assigning nearby inputs to nearby categories.
  SIG FOC Weekly Meeting
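
  As a minimal illustration of the simplest entry on the list (a supervised feedforward network trained by backpropagation for nonlinear function fitting), here is a NumPy sketch. It is not the MATLAB toolbox API; the layer sizes, learning rate, and target function are illustrative choices.

```python
# Sketch: one-hidden-layer feedforward network trained by backpropagation to
# fit a nonlinear function. NumPy only; all sizes and hyperparameters invented.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))   # sample inputs
Y = np.sin(np.pi * X)                       # desired outputs (nonlinear target)

W1, b1 = rng.normal(0, 0.5, (1, 10)), np.zeros(10)   # input -> hidden
W2, b2 = rng.normal(0, 0.5, (10, 1)), np.zeros(1)    # hidden -> output
lr = 0.1

for epoch in range(5000):
    H = np.tanh(X @ W1 + b1)        # forward pass, hidden layer
    Yhat = H @ W2 + b2              # linear output layer
    err = Yhat - Y
    # backward pass: gradients of the mean squared error
    dW2 = H.T @ err / len(X);  db2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H**2)
    dW1 = X.T @ dH / len(X);   db1 = dH.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

Yhat = np.tanh(X @ W1 + b1) @ W2 + b2
print("final MSE:", float(((Yhat - Y) ** 2).mean()))
```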

  5. Where should fractional order be applied? • Neuron model – static – basis function. SIG FOC Weekly Meeting

  6. Hodgkin, A. L. and Huxley, A. F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. (Lond.), 117:500-544. • The semipermeable cell membrane separates the interior of the cell from the extracellular liquid and acts as a capacitor. If an input current I(t) is injected into the cell, it may add further charge on the capacitor, or leak through the channels in the cell membrane. Because of active ion transport through the cell membrane, the ion concentration inside the cell is different from that in the extracellular liquid. The Nernst potential generated by the difference in ion concentration is represented by a battery. http://icwww.epfl.ch/~gerstner//SPNM/node14.html SIG FOC Weekly Meeting
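
  In equation form, the picture just described (a capacitor in parallel with ion channels and Nernst batteries) is the standard Hodgkin-Huxley current balance; the notation below is the conventional one and is not taken from the slide itself.

```latex
% Membrane current balance of the Hodgkin-Huxley model (standard notation):
%   C   : membrane capacitance          I(t): injected input current
%   g_x : maximal channel conductances  E_x : Nernst (reversal) potentials
%   m, h, n : gating variables obeying first-order kinetics
\begin{equation}
  C \,\frac{\mathrm{d}u}{\mathrm{d}t}
    = I(t)
      - g_{\mathrm{Na}}\, m^{3} h \,(u - E_{\mathrm{Na}})
      - g_{\mathrm{K}}\, n^{4} \,(u - E_{\mathrm{K}})
      - g_{L}\,(u - E_{L})
\end{equation}
```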

  7. Remarks on “non-artificial NN”
  • “Real” neurons are extremely complex biophysical and biochemical entities. Before designing a model it is therefore necessary to develop an intuition for what is important and what can be safely neglected. The Hodgkin-Huxley model describes the generation of action potentials on the level of ion channels and ion current flow. It is the starting point for detailed neuron models which in general include more than the three types of currents considered by Hodgkin and Huxley.
  • Electrophysiologists have described an overwhelming richness of different ion channels. The set of ion channels is different from one neuron to the next. The precise channel configuration in each individual neuron determines a good deal of its overall electrical properties. Synapses are usually modeled as specific ion channels that open for a certain time after presynaptic spike arrival.
  • The geometry of the neuron can play an important role in synaptic integration because the effect of synaptic input on the somatic membrane potential depends on the location of the synapses on the dendritic tree. Though some analytic results can be obtained for passive dendrites, it is usually necessary to resort to numerical methods and multi-compartment models in order to account for complex geometry and active ion channels. http://icwww.epfl.ch/~gerstner//SPNM/node19.html
  SIG FOC Weekly Meeting

  8. Spiking neurons SIG FOC Weekly Meeting

  9. FO-ANN is incremental research
  • Change the BP algorithm in a fractional calculus (FC) manner (a sketch follows below)
  • Change the basic neuron dynamics to be fractional order
  • Fractional coupling between neurons
  • …
  • So, what is more exciting?
  SIG FOC Weekly Meeting
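
  One way to read the first bullet: ordinary gradient descent is the explicit Euler discretization of dw/dt = -∇L(w), so a candidate fractional variant replaces it with a Grünwald-Letnikov discretization of the fractional-order gradient flow D^α w = -∇L(w). The sketch below illustrates this on a toy loss; α, the step size, and the loss are made up, and this is not the specific scheme of the papers cited later in these slides.

```python
# Sketch: gradient descent as a Grunwald-Letnikov (GL) discretization of the
# fractional-order gradient flow D^alpha w(t) = -grad L(w(t)).
# At alpha = 1 the scheme reduces exactly to plain gradient descent (Euler).
# Illustrative only: toy loss, alpha, and step size h are invented.
import numpy as np

def gl_coeffs(alpha, n):
    """GL coefficients c_j = (-1)^j * binom(alpha, j), for j = 0..n-1."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def frac_gradient_flow(grad, w0, alpha=0.8, h=0.02, steps=1000):
    """Explicit GL scheme for D^alpha w = -grad(w); w0 = 0 here, so no
    initial-condition correction term is needed."""
    c = gl_coeffs(alpha, steps + 1)
    w = np.empty(steps + 1)
    w[0] = w0
    for k in range(1, steps + 1):
        history = np.dot(c[1:k + 1], w[k - 1::-1])   # sum_{j=1..k} c_j * w_{k-j}
        w[k] = h**alpha * (-grad(w[k - 1])) - history
    return w

# Toy quadratic loss L(w) = 0.5 * (w - 3)^2, so grad(w) = w - 3; minimizer w = 3.
traj = frac_gradient_flow(lambda w: w - 3.0, w0=0.0, alpha=0.8)
print("final weight:", traj[-1])   # creeps toward 3, more slowly than alpha = 1
```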

  10. Origin of Complexity? • Where Medicine Went Wrong: Rediscovering the Path to Complexity, Bruce J. West, World Scientific, 2006. ISBN 9812568832, 337 pages, US$42. • Book review (R. L. Magin): http://ieeexplore.ieee.org/iel5/51/4268305/04272290.pdf?isnumber=4268305&prod=JNL&arnumber=4272290&arSt=10&ared=12&arAuthor=Magin%2C+R.L. SIG FOC Weekly Meeting

  11. “The origin of complexity lies in fractional order dynamics, both spatial and temporal.” – YangQuan Chen • He does show how physiological systems exhibit complexity by their many elements, time varying parameters, interdependent dynamics, nonlinear responses, loose coupling to the environment, composite order, partial randomness, short term stability, long term variability, and a wide range of scaling in time and space. Features of complexity that stand out are the fractal dimensions and allometric scaling often present in biological systems and organisms, particularly the success of the power law descriptions of complex behavior. West shows how variability in physiologic systems, as assessed by fractal time series analysis, falls in an intermediate region (as measured by fractal dimension) between that expected for regular periodic systems and that of a random aperiodic system. Hence, chaos, fractal scaling and fractal dimension are all evident. – R. L. Magin (book review) SIG FOC Weekly Meeting
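
  As a small, concrete illustration of the kind of scaling analysis the review refers to, the sketch below estimates a power-law exponent β from the slope of a time series' power spectrum on a log-log scale, assuming S(f) ∝ 1/f^β. Uncorrelated noise sits near β ≈ 0 and a random walk near β ≈ 2; "complex" physiologic signals are often reported in between. The test signals here are synthetic and purely illustrative.

```python
# Sketch: estimate the spectral power-law exponent beta of a time series,
# S(f) ~ 1/f^beta, via a least-squares fit in log-log space.
# Synthetic test signals only; thresholds and sizes are invented.
import numpy as np

def spectral_exponent(x):
    """Return beta from a log-log fit of the periodogram (DC bin excluded)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x))
    mask = freqs > 0                      # drop the zero-frequency bin
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(psd[mask]), 1)
    return -slope

rng = np.random.default_rng(1)
white = rng.normal(size=4096)             # uncorrelated noise, beta near 0
walk = np.cumsum(white)                   # random walk, beta near 2
print("white noise beta ~", round(spectral_exponent(white), 2))
print("random walk beta ~", round(spectral_exponent(walk), 2))
```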

  12. Remarks on some relevant papers
  • Ivo Petras, “A Note on the Fractional-Order Cellular Neural Networks,” 2006 International Joint Conference on Neural Networks, Sheraton Vancouver Wall Centre Hotel, Vancouver, BC, Canada, July 16-21, 2006.
  • Essentially criticizes previous work on the fractional-order Chen circuit and similar systems: simulations based on high-order expansions are “nonsense”.
  • Simulation aspects are important.
  • Sam Gardner, Robbie Lamb, John Paxton, “An Initial Investigation on the Use of Fractional Calculus with Neural Networks,” Proc. of the IASTED Int. Conf. on Computational Intelligence, Nov. 2006, San Francisco, CA, USA.
  • Introduced fractional order into the BP process, largely based on ad hoc experience. There is evidence of better performance, but it is not clear why it is better or why the behavior appears “chaotic”.
  SIG FOC Weekly Meeting
