MC4301 Machine Learning Syllabus – Anna University PG Syllabus Regulation 2021

COURSE OBJECTIVES:

 To gain knowledge on foundations of machine learning and apply suitable dimensionality reduction techniques for an application
 To select the appropriate model and use feature engineering techniques
 To gain knowledge on Probability and Bayesian Learning to solve the given problem
 To design and implement the machine learning techniques for real world problems
 To analyze, learn, and classify complex data even when no predefined model is available

UNIT I INTRODUCTION

Human Learning – Types – Machine Learning – Types – Problems not to be solved – Applications – Languages/Tools – Issues. Preparing to Model: Introduction – Machine Learning Activities – Types of data – Exploring structure of data – Data quality and remediation – Data Pre-processing
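
As an illustration of the pre-processing topics in this unit, the following is a minimal pure-Python sketch (function names are illustrative, not prescribed by the syllabus) of two common remediation steps: mean imputation of missing values and min-max scaling.

```python
def impute_mean(values):
    """Data remediation: replace None (missing) entries with the
    mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def min_max_scale(values):
    """Pre-processing: rescale values linearly into the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

feature = [10.0, None, 20.0, 30.0]
clean = impute_mean(feature)      # [10.0, 20.0, 20.0, 30.0]
scaled = min_max_scale(clean)     # [0.0, 0.5, 0.5, 1.0]
```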

UNIT II MODEL EVALUATION AND FEATURE ENGINEERING

Model Selection – Training Model – Model Representation and Interpretability – Evaluating Performance of a Model – Improving Performance of a Model – Feature Engineering: Feature Transformation – Feature Subset Selection
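
The "Evaluating Performance of a Model" topic in this unit can be sketched in pure Python as the standard confusion-matrix metrics for a binary classifier (the toy labels below are illustrative):

```python
def confusion_counts(y_true, y_pred):
    """Count true/false positives and negatives for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def evaluate(y_true, y_pred):
    """Return (accuracy, precision, recall) from the confusion counts."""
    tp, fp, fn, tn = confusion_counts(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
acc, prec, rec = evaluate(y_true, y_pred)   # 0.667, 0.667, 0.667
```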

UNIT III BAYESIAN LEARNING

Basic Probability Notation – Inference – Independence – Bayes’ Rule. Bayesian Learning: Maximum Likelihood and Least-Squared Error hypotheses – Maximum Likelihood hypotheses for predicting probabilities – Minimum Description Length principle – Bayes optimal classifier – Naïve Bayes classifier – Bayesian Belief networks – EM algorithm
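
The Naïve Bayes classifier from this unit can be sketched in a few lines of pure Python for categorical features. This is a minimal illustration (the toy weather data and function names are invented for the example), using add-one (Laplace) smoothing and log-probabilities to avoid underflow:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(X, y):
    """Estimate class priors P(c) and conditionals P(x_i | c) from data,
    recording each feature's observed value set for Laplace smoothing."""
    priors = Counter(y)
    cond = defaultdict(Counter)     # (feature index, class) -> value counts
    vocab = defaultdict(set)        # feature index -> distinct values seen
    for xs, c in zip(X, y):
        for i, v in enumerate(xs):
            cond[(i, c)][v] += 1
            vocab[i].add(v)
    return priors, cond, vocab, len(y)

def predict(x, priors, cond, vocab, n):
    """Pick the class maximising log P(c) + sum_i log P(x_i | c)."""
    best, best_score = None, float("-inf")
    for c, count in priors.items():
        score = math.log(count / n)
        for i, v in enumerate(x):
            score += math.log((cond[(i, c)][v] + 1) / (count + len(vocab[i])))
        if score > best_score:
            best, best_score = c, score
    return best

# Toy "play tennis"-style data: (outlook, wind) -> play?
X = [("sunny", "weak"), ("sunny", "strong"), ("rain", "weak"), ("rain", "weak")]
y = ["no", "no", "yes", "yes"]
model = train_naive_bayes(X, y)
print(predict(("rain", "weak"), *model))   # yes
```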

UNIT IV PARAMETRIC MACHINE LEARNING

Logistic Regression: Classification and representation – Cost function – Gradient descent – Advanced optimization – Regularization – Solving the problem of overfitting. Perceptron – Neural Networks – Multi-class Classification – Backpropagation – Non-linearity with activation functions (Tanh, Sigmoid, ReLU, PReLU) – Dropout as regularization
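
Logistic regression trained by gradient descent, the first topic of this unit, can be sketched in pure Python for a single feature. This is a minimal illustration (learning rate, epoch count, and the toy data are assumptions for the example, and no regularization term is included):

```python
import math

def sigmoid(z):
    """Logistic function mapping any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit weight w and bias b by batch gradient descent on the
    cross-entropy cost; the gradient per example is (sigmoid(z) - y)."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        dw = db = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y
            dw += err * x / n
            db += err / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Toy separable data: label 1 for x >= 3
xs = [0.0, 1.0, 2.0, 4.0, 5.0, 6.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print(sigmoid(w * 5.0 + b) > 0.5)   # True
```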

UNIT V NON-PARAMETRIC MACHINE LEARNING

k-Nearest Neighbors – Decision Trees – Branching – Greedy Algorithm – Multiple Branches – Continuous attributes – Pruning. Random Forests: Ensemble Learning. Boosting – AdaBoost algorithm. Support Vector Machines – Large Margin Intuition – Loss Function – Hinge Loss – SVM Kernels
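
k-Nearest Neighbors, the first topic of this unit, is the simplest non-parametric method to sketch: classification is deferred to prediction time and decided by a majority vote among the k closest training points. A minimal pure-Python illustration (the two-cluster toy data is invented for the example):

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points; squared Euclidean distance preserves the neighbour ordering."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))
    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((5.2, 4.9), "B"), ((4.8, 5.1), "B")]
print(knn_predict(train, (4.9, 5.0), k=3))   # B
```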

SUGGESTED ACTIVITIES:

1. Explore the significant steps involved in data pre-processing in machine learning.
2. Choose and train a model in machine learning.
3. Explain the application of Bayes’ Theorem and how it is used to predict future outcomes.
4. Differentiate between supervised and unsupervised learning techniques.
5. Differentiate Perceptron, Neural Network, Convolutional Neural Network, and Deep Learning.

TOTAL: 45 PERIODS

COURSE OUTCOMES:

CO1: Understand data pre-processing and dimensionality reduction.
CO2: Apply the proper model for a given problem and use feature engineering techniques.
CO3: Make use of probability techniques to solve a given problem.
CO4: Analyze the working model and features of decision trees.
CO5: Choose and apply the appropriate algorithm to learn and classify the data.

REFERENCES

1. Ethem Alpaydin, “Introduction to Machine Learning (Adaptive Computation and Machine Learning Series)”, Third Edition, MIT Press, 2014.
2. Tom M. Mitchell, “Machine Learning”, India Edition, 1st Edition, McGraw-Hill Education Private Limited, 2013.
3. Saikat Dutt, Subramanian Chandramouli and Amit Kumar Das, “Machine Learning”, 1st Edition, Pearson Education, 2019.
4. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Revised Edition, Springer, 2016.
5. Aurelien Geron, “Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow”, 2nd Edition, O’Reilly, 2019.
6. Stephen Marsland, “Machine Learning – An Algorithmic Perspective”, Second Edition, Chapman and Hall/CRC Machine Learning and Pattern Recognition Series, 2014.