DS4003 Neural Networks and Applications Syllabus – Anna University PG Syllabus Regulation 2021
COURSE OBJECTIVES:
To introduce neural networks as a means for computational learning.
To present the basic network architectures for classification and regression.
To provide knowledge of computational and dynamical systems using neural networks.
To perform algorithmic training of various neural networks.
To understand the training and limitations of learning in self-organizing systems.
UNIT I BASIC LEARNING ALGORITHMS
Biological Neuron – Artificial Neural Model – Types of Activation Functions – Architecture: Feedforward and Feedback – Learning Process: Error Correction Learning – Memory Based Learning – Hebbian Learning – Competitive Learning – Boltzmann Learning – Supervised and Unsupervised Learning – Learning Tasks: Pattern Space – Weight Space – Pattern Association – Pattern Recognition – Function Approximation – Control – Filtering – Beamforming – Memory – Adaptation – Statistical Learning Theory – Single Layer Perceptron – Perceptron Learning Algorithm – Perceptron Convergence Theorem – Least Mean Square Learning Algorithm – Multilayer Perceptron – Back Propagation Algorithm – XOR Problem – Limitations of Back Propagation Algorithm.
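As a quick illustration of the perceptron learning algorithm and convergence theorem listed in Unit I, the sketch below (illustrative only, not part of the prescribed syllabus; learning rate and epoch count are assumed values) trains a single-layer perceptron on the linearly separable AND function.

```python
import numpy as np

# Illustrative sketch: perceptron learning rule on the AND function.
# Since AND is linearly separable, the convergence theorem guarantees
# the weights settle after finitely many mistakes.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # assumed learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        err = target - pred
        w += lr * err * xi   # error-correction update
        b += lr * err

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # → [0, 0, 0, 1]
```

Running the same loop on XOR would never converge, which motivates the multilayer perceptron and back propagation topics later in the unit.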
UNIT II RADIAL-BASIS FUNCTION NETWORKS AND SUPPORT VECTOR MACHINES
Cover’s Theorem on the Separability of Patterns – Exact Interpolator – Regularization Theory – Generalized Radial Basis Function Networks – Learning in Radial Basis Function Networks – Applications: XOR Problem – Image Classification. Optimal Hyperplane for Linearly Separable Patterns and Nonseparable Patterns – Support Vector Machine for Pattern Recognition – XOR Problem – ε-insensitive Loss Function – Support Vector Machines for Nonlinear Regression.
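The XOR application of radial-basis function networks named in Unit II can be sketched via exact interpolation: place a Gaussian basis function at each training point and solve the linear system Φw = y. The width σ below is an assumed value; this is an illustrative sketch, not a prescribed implementation.

```python
import numpy as np

# Illustrative sketch: exact-interpolation RBF network for XOR.
# One Gaussian unit is centered on each of the four training points,
# and the output weights are found by solving Phi @ w = y.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
sigma = 1.0  # assumed basis-function width

def phi(a, b):
    """Gaussian radial basis function."""
    return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

Phi = np.array([[phi(x, c) for c in X] for x in X])
w = np.linalg.solve(Phi, y)   # interpolation matrix is positive definite

out = Phi @ w
print(np.round(out, 6))  # reproduces [0, 1, 1, 0] exactly
```

Mapping the four points into the hidden (basis-function) space makes XOR separable, in line with Cover’s theorem.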
UNIT III COMMITTEE MACHINES AND NEURODYNAMICAL SYSTEMS
Ensemble Averaging – Boosting – Associative Gaussian Mixture Model – Hierarchical Mixture of Experts Model (HME) – Model Selection using a Standard Decision Tree – A Priori and A Posteriori Probabilities – Maximum Likelihood Estimation – Learning Strategies for the HME Model – EM Algorithm – Applications of EM Algorithm to HME Model. Dynamical Systems – Attractors and Stability – Non-linear Dynamical Systems – Lyapunov Stability – Neurodynamical Systems – The Cohen–Grossberg Theorem.
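The EM algorithm listed in Unit III can be illustrated on the simplest case it is usually taught with, a one-dimensional two-component Gaussian mixture. All data, initial parameters, and iteration counts below are assumed for illustration; the same E-step/M-step structure carries over to learning the HME model.

```python
import numpy as np

# Illustrative sketch: EM for a 1-D mixture of two Gaussians.
# Synthetic data drawn from N(-2, 0.5^2) and N(3, 0.8^2).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 0.8, 200)])

mu = np.array([-1.0, 1.0])    # assumed initial means
var = np.array([1.0, 1.0])    # assumed initial variances
pi = np.array([0.5, 0.5])     # mixing proportions

def gauss(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    r = pi * gauss(data[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters by responsibility-weighted averages
    n = r.sum(axis=0)
    mu = (r * data[:, None]).sum(axis=0) / n
    var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / n
    pi = n / len(data)

print(np.round(np.sort(mu), 2))  # close to the true means (-2, 3)
```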
UNIT IV ATTRACTOR NEURAL NETWORKS AND ADAPTIVE RESONANCE THEORY
Associative Learning – Attractor Neural Network Associative Memory – Linear Associative Memory – Hopfield Network – Content Addressable Memory – Strange Attractors and Chaos – Error Performance of Hopfield Networks – Applications of Hopfield Networks – Simulated Annealing – Boltzmann Machine – Bidirectional Associative Memory – BAM Stability Analysis – Error Correction in BAMs – Memory Annihilation of Structured Maps in BAMs – Continuous BAMs – Adaptive BAMs – Applications. Noise-Saturation Dilemma – Solving the Noise-Saturation Dilemma – Recurrent On-center–Off-surround Networks – Building Blocks of Adaptive Resonance – Adaptive Resonance Theory – Applications.
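The Hopfield network's role as a content-addressable memory (Unit IV) can be sketched in a few lines: store a bipolar pattern by the Hebbian outer-product rule, then recall it from a corrupted probe. Pattern length and noise level are assumed values for illustration.

```python
import numpy as np

# Illustrative sketch: Hopfield network as content-addressable memory.
# One bipolar pattern is stored via the Hebbian outer-product rule.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)  # no self-connections

# Corrupt the probe by flipping one bit, then iterate the dynamics;
# the state falls into the attractor, recovering the stored pattern.
probe = pattern.copy()
probe[0] = -probe[0]

state = probe
for _ in range(5):
    state = np.sign(W @ state)  # synchronous update

print((state == pattern).all())  # → True
```

With a single stored pattern the corrupted probe is corrected in one update; storing too many patterns degrades recall, which is the "error performance" question in the unit.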
UNIT V SELF ORGANISING MAPS AND PULSED NEURON MODELS
Self-organizing Map – Maximal Eigenvector Filtering – Sanger’s Rule – Generalized Learning Law – Competitive Learning – Vector Quantization – Mexican Hat Networks – Self-organizing Feature Maps – Applications. Spiking Neuron Model – Integrate-and-Fire Neurons – Conductance-Based Models – Computing with Spiking Neurons.
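The integrate-and-fire neuron listed in Unit V admits a very short sketch: the membrane potential leaks toward a drive level and emits a spike whenever it crosses threshold. All parameter values below are illustrative, not taken from the syllabus or a specific text.

```python
# Illustrative sketch: leaky integrate-and-fire neuron (Euler integration).
tau, v_th, v_reset = 10.0, 1.0, 0.0  # time constant (ms), threshold, reset
v_inf = 1.5   # assumed steady-state drive; above threshold, so it fires
dt, T = 0.1, 100.0  # step and total simulated time (ms)

v = 0.0
spikes = []
for step in range(int(T / dt)):
    v += dt / tau * (v_inf - v)   # leaky integration toward v_inf
    if v >= v_th:                 # threshold crossing: emit spike, reset
        spikes.append(step * dt)
        v = v_reset

print(len(spikes))  # roughly 9 regular spikes over 100 ms
```

Because the drive is constant and above threshold, the spike train is periodic; a subthreshold drive (v_inf < v_th) would produce no spikes at all.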
TOTAL: 45 PERIODS
COURSE OUTCOMES:
CO1: Deduce the basic computational algorithms.
CO2: Explore mathematically based computational algorithms.
CO3: Apply knowledge of computational and dynamical systems using neural networks.
CO4: Perform algorithmic training of various neural networks and train learning self-organizing systems.
CO5: Understand and use different methods for various applications.
REFERENCES
1. James A. Freeman and David M. Skapura, “Neural Networks: Algorithms, Applications, and Programming Techniques”, Pearson Education (Singapore) Private Limited, Delhi, 2003.
2. Martin T. Hagan, Howard B. Demuth, and Mark Beale, “Neural Network Design”, 2nd ed., Thomson Learning, New Delhi, 2014.
3. Satish Kumar, “Neural Networks: A Classroom Approach”, Tata McGraw-Hill Publishing Company Limited, New Delhi, 2017.
4. Simon Haykin, “Neural Networks: A Comprehensive Foundation”, 2nd ed., Addison Wesley Longman (Singapore) Private Limited, Delhi, 2001.