# Machine Learning

Learning Outcomes:
After successful completion of the course, students should be able to:
• Understand a set of well-known supervised, unsupervised and semi-supervised learning algorithms
• Use a tool to implement typical clustering algorithms for different types of applications
• Identify applications suitable for different types of machine learning with suitable justification
• Implement probabilistic discriminative and generative algorithms for an application of your choice and analyze the results
• Design and implement an HMM for a sequence model type of application
• Design a neural network for an application of your choice
Syllabus:
Unit 1: Introduction
Machine Learning - Machine Learning Foundations - Overview - Design of a Learning System - Types of Machine Learning - Applications. Mathematical Foundations of Machine Learning - Random Variables and Probabilities - Probability Theory - Probability Distributions - Decision Theory - Bayes Decision Theory - Information Theory.
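The Bayes decision theory topic can be illustrated with a minimal sketch: Bayes' rule combines a class prior with a likelihood to give a posterior, and the decision rule picks the class with the larger posterior. All numbers below (a hypothetical spam-filter setting) are invented for illustration and are not part of the course materials.

```python
# Bayes' rule for a two-class decision: P(class | x) = P(x | class) P(class) / P(x).
def posterior(prior, likelihood, evidence):
    return prior * likelihood / evidence

p_spam = 0.3                 # prior P(spam), hypothetical
p_word_given_spam = 0.8      # likelihood P(word | spam), hypothetical
p_word_given_ham = 0.1       # likelihood P(word | ham), hypothetical

# Evidence P(word) by the law of total probability.
p_word = p_spam * p_word_given_spam + (1 - p_spam) * p_word_given_ham

p_spam_given_word = posterior(p_spam, p_word_given_spam, p_word)

# Bayes decision rule: choose the class with the larger posterior probability.
decision = "spam" if p_spam_given_word > 0.5 else "ham"
```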

Unit 2: Supervised Learning

Linear Models for Regression - Linear Models for Classification - Naïve Bayes - Discriminant Functions - Probabilistic Generative Models - Probabilistic Discriminative Models - Bayesian Logistic Regression. Decision Trees - Classification Trees - Regression Trees - Pruning. Neural Networks - Feed-forward Network Functions - Backpropagation. Support Vector Machines - Ensemble Methods - Bagging - Boosting.
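As a sketch of a probabilistic discriminative model from this unit, the following toy logistic regression is fit by batch gradient descent on the cross-entropy loss. The 1-D dataset and learning rate are made up for illustration.

```python
import math

# Toy 1-D dataset: points below x = 1.5 are class 0, above are class 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0, 0, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        err = sigmoid(w * x + b) - y   # gradient of cross-entropy w.r.t. logit
        gw += err * x
        gb += err
    w -= lr * gw / len(xs)             # batch gradient descent step
    b -= lr * gb / len(xs)

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
```

After training, the learned decision boundary separates the two classes, so `preds` matches `ys`.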

Unit 3: Unsupervised Learning

Clustering - K-means - EM Algorithm - Mixtures of Gaussians - The Curse of Dimensionality - Dimensionality Reduction - Factor Analysis - Principal Component Analysis - Probabilistic PCA - Independent Component Analysis.
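K-means (Lloyd's algorithm) alternates two steps: assign each point to its nearest centre, then move each centre to the mean of its assigned points. A minimal 1-D sketch with invented data and a deterministic initialisation:

```python
# Toy 1-D data with two obvious groups (values are illustrative only).
data = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
k = 2
centers = [data[0], data[3]]  # simple deterministic initialisation

for _ in range(10):
    # Assignment step: attach each point to its nearest centre.
    clusters = [[] for _ in range(k)]
    for x in data:
        j = min(range(k), key=lambda c: (x - centers[c]) ** 2)
        clusters[j].append(x)
    # Update step: move each centre to the mean of its cluster.
    centers = [sum(c) / len(c) for c in clusters]
```

In practice the initialisation is randomised (e.g. k-means++) and the loop stops when assignments no longer change; this sketch just runs a fixed number of iterations.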

Unit 4: Probabilistic Graphical Models

Graphical Models - Undirected Graphical Models - Markov Random Fields - Directed Graphical Models - Bayesian Networks - Conditional Independence Properties - Inference - Learning - Generalization - Hidden Markov Models - Conditional Random Fields (CRFs).
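For the HMM topic, the forward algorithm computes the probability of an observation sequence by summing over hidden state paths in time linear in the sequence length. The 2-state model below uses invented probabilities purely for illustration.

```python
# Tiny 2-state HMM with binary observations; all probabilities are made up.
states = [0, 1]
start = [0.6, 0.4]                       # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]         # trans[i][j] = P(state j | state i)
emit = [[0.5, 0.5], [0.1, 0.9]]          # emit[state][symbol]

def forward(obs):
    """Return P(obs) via the forward algorithm (sum over hidden paths)."""
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        # alpha[s] = P(observations so far, current state = s)
        alpha = [sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states]
    return sum(alpha)

p = forward([0, 1, 1])
```

Longer sequences would be computed in log space to avoid underflow; that refinement is omitted here.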

Unit 5: Sampling, Reinforcement Learning and Computational Learning Theory

Sampling - Basic Sampling Methods - Monte Carlo. Reinforcement Learning - K-Armed Bandit - Elements - Model-Based Learning - Value Iteration - Policy Iteration. Temporal Difference Learning - Exploration Strategies - Deterministic and Non-deterministic Rewards and Actions. Computational Learning Theory - Mistake Bound Analysis - Sample Complexity Analysis - VC Dimension - Occam Learning - Accuracy and Confidence Boosting.
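Value iteration from this unit repeatedly applies the Bellman optimality update until state values converge. Below is a minimal sketch on an invented 3-state deterministic chain: states 0 and 1 can stay or move right, state 2 is terminal with reward 1.

```python
# Toy deterministic MDP: states 0, 1, 2; state 2 is terminal (reward 1).
rewards = [0.0, 0.0, 1.0]
gamma = 0.9                       # discount factor
V = [0.0, 0.0, 0.0]               # initial value estimates

for _ in range(100):
    newV = V[:]
    for s in (0, 1):
        # Bellman optimality update over two deterministic actions:
        # "stay" (go to s) or "right" (go to s + 1).
        newV[s] = rewards[s] + gamma * max(V[s], V[s + 1])
    newV[2] = rewards[2]          # terminal state keeps its reward
    V = newV
```

The values converge to V = [0.81, 0.9, 1.0]: each step away from the terminal state discounts the reward by another factor of gamma.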

Text Books:
• Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2007.
• Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.
Reference Books:
• Ethem Alpaydin, Introduction to Machine Learning, Third Edition, MIT Press, 2014.
• Tom Mitchell, Machine Learning, McGraw-Hill, 1997.
• Trevor Hastie, Robert Tibshirani, Jerome Friedman, The Elements of Statistical Learning, Second Edition, 2011.
• Stephen Marsland, Machine Learning - An Algorithmic Perspective, Second Edition, Chapman and Hall/CRC Press, 2014.
Branches: CBA, BDA, MA
Course: 2014
Stream: B.Tech