ET4072 Machine Learning and Deep Learning Syllabus – Anna University PG Syllabus Regulation 2021
COURSE OBJECTIVES:
The course is aimed at
1. Understanding learning problems and learning algorithms.
2. Providing insight into neural networks.
3. Introducing machine learning fundamentals and their significance.
4. Enabling the students to acquire knowledge about pattern recognition.
5. Motivating the students to apply deep learning algorithms to solve real-life problems.
UNIT I LEARNING PROBLEMS AND ALGORITHMS
Various paradigms of learning problems – Supervised, Semi-supervised and Unsupervised learning algorithms.
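A minimal Python sketch of the supervised/unsupervised distinction listed above, assuming scikit-learn and its bundled Iris data are available (the dataset and model choices are illustrative only, not prescribed by the syllabus):

```python
# Supervised vs. unsupervised learning on the same feature matrix (illustrative sketch).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the model is fit on features together with known labels y.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("supervised predictions:", clf.predict(X[:5]))

# Unsupervised: only the features are used; the algorithm discovers cluster structure.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster assignments   :", km.labels_[:5])
```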
UNIT II NEURAL NETWORKS
Differences between Biological and Artificial Neural Networks – Typical Architecture, Common Activation Functions, Multi-layer neural networks, Linear Separability, Hebb Net, Perceptron, Adaline, Standard Backpropagation, Training Algorithms for Pattern Association – Hebb rule and Delta rule, Heteroassociative, Autoassociative, Kohonen Self-Organising Maps, Examples of Feature Maps, Learning Vector Quantization, Gradient Descent, Boltzmann Machine Learning.
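To connect the perceptron and the error-driven (delta-style) update listed above to code, the following NumPy sketch trains a single perceptron on the logical AND function, which is linearly separable; the data, learning rate and epoch count are illustrative choices:

```python
import numpy as np

# Training data for logical AND: linearly separable, so the perceptron rule converges.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(20):
    for x_i, t_i in zip(X, t):
        y_i = 1 if np.dot(w, x_i) + b > 0 else 0   # step activation
        # Error-driven update: move the weights in proportion to (target - output).
        w += lr * (t_i - y_i) * x_i
        b += lr * (t_i - y_i)

print("weights:", w, "bias:", b)
print("outputs:", [1 if np.dot(w, x_i) + b > 0 else 0 for x_i in X])
```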
UNIT III MACHINE LEARNING – FUNDAMENTALS, FEATURE SELECTION & CLASSIFICATION
Classifying Samples: The confusion matrix, Accuracy, Precision, Recall, F1-Score, the curse of dimensionality, training, testing, validation, cross-validation, overfitting, underfitting the data, early stopping, regularization, bias and variance. Feature Selection, normalization, dimensionality reduction, Classifiers: KNN, SVM, Decision trees, Naïve Bayes, Binary classification, multi-class classification, clustering.
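A short scikit-learn sketch of the evaluation ideas above (confusion matrix, accuracy, precision, recall, F1-score, cross-validation) using a KNN classifier; the bundled breast-cancer dataset and the hyper-parameters are illustrative assumptions:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Train a KNN classifier and evaluate it on the held-out test split.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
y_pred = knn.predict(X_test)

print("confusion matrix:\n", confusion_matrix(y_test, y_pred))
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
print("f1-score :", f1_score(y_test, y_pred))

# 5-fold cross-validation gives a less split-dependent estimate than a single train/test split.
print("cv accuracy:", cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5).mean())
```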
UNIT IV DEEP LEARNING: CONVOLUTIONAL NEURAL NETWORKS
Feedforward networks, Activation functions, backpropagation in CNNs, optimizers, batch normalization, convolution layers, pooling layers, fully connected layers, dropout, Examples of CNNs.
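The layer types listed above can be seen together in a small Keras model; this is a minimal sketch assuming TensorFlow/Keras is installed, with an illustrative input shape and illustrative layer sizes:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),               # e.g. 28x28 grayscale images
    layers.Conv2D(32, (3, 3), activation="relu"),  # convolution layer
    layers.BatchNormalization(),                   # batch normalization
    layers.MaxPooling2D((2, 2)),                   # pooling layer
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),          # fully connected layer
    layers.Dropout(0.5),                           # dropout regularization
    layers.Dense(10, activation="softmax"),        # 10-class output
])

# The optimizer and loss chosen here drive backpropagation when model.fit(...) is called.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```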
UNIT V DEEP LEARNING: RNNS, AUTOENCODERS AND GANS
State and structure of the RNN cell, LSTM and GRU, Time-distributed layers, Generating Text, Autoencoders: Convolutional autoencoders, Denoising autoencoders, Variational autoencoders, GANs: The discriminator, the generator, DCGANs.
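As a sketch of the autoencoder idea in this unit (again assuming TensorFlow/Keras; the 784-dimensional input and 32-unit bottleneck are illustrative), the encoder compresses the input and the decoder reconstructs it:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

inputs = layers.Input(shape=(784,))                        # e.g. flattened 28x28 images
encoded = layers.Dense(32, activation="relu")(inputs)      # bottleneck (latent code)
decoded = layers.Dense(784, activation="sigmoid")(encoded)

autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Training uses the same array as input and target: autoencoder.fit(x, x, ...).
# A denoising autoencoder would instead fit(x_noisy, x_clean, ...); a variational
# autoencoder adds a sampling layer and a KL-divergence term to the loss.
autoencoder.summary()
```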
COURSE OUTCOMES (CO):
At the end of the course, the student will be able to
CO1: Illustrate the categorization of machine learning algorithms.
CO2: Compare and contrast the types of neural network architectures and activation functions.
CO3: Become acquainted with pattern association using neural networks.
CO4: Elaborate on the various terminologies related to pattern recognition and the architectures of convolutional neural networks.
CO5: Construct different feature selection and classification techniques and advanced neural network architectures such as RNN, Autoencoders, and GANs.
REFERENCES:
1. J. S. R. Jang, C. T. Sun and E. Mizutani, Neuro-Fuzzy and Soft Computing – A Computational Approach to Learning and Machine Intelligence, PHI Learning, 2012.
2. Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, ISBN: 9780262035613, 2016.
3. Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical Learning, Second Edition, Springer, 2009.
4. Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
5. Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning, Cambridge University Press, 2017.