This document outlines the topics covered in an introductory machine learning course. It includes: [1] an introduction to basic machine learning concepts such as classification, regression, and supervised and unsupervised learning; [2] linear regression, including cost functions and gradient descent; [3] logistic regression, including hypothesis representation, cost functions, and regularization; [4] neural networks, including network representation, backpropagation, and training; [5] support vector machines, including optimization objectives and kernels; [6] unsupervised learning algorithms such as k-means clustering; and [7] recommender systems, including content-based and collaborative filtering approaches.

Uploaded by

Suneela Mathe
Copyright
© All Rights Reserved

Course Name: Introduction to Machine Learning

Introduction [5%]

Idea of machines learning from data, Classification of problems into Regression and Classification, Supervised
and Unsupervised learning

Linear Regression [15%]

Model representation for single variable, Single variable Cost Function, Gradient Descent for Linear
Regression, Multivariable model representation, Multivariable cost function, Gradient Descent in practice,
Normal Equation and non-invertibility
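As a quick illustration of the gradient-descent and normal-equation topics above, here is a minimal sketch (function names, learning rate, and the toy data are illustrative, not part of the course material):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for linear regression with an intercept term.

    X: (m, n) feature matrix, y: (m,) targets. Returns theta of shape (n+1,).
    """
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])    # prepend a bias column of ones
    theta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        grad = Xb.T @ (Xb @ theta - y) / m  # gradient of (1/2m) * ||Xb @ theta - y||^2
        theta -= alpha * grad
    return theta

def normal_equation(X, y):
    """Closed-form solution; pinv handles the non-invertible (singular) case."""
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])
    return np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

# Fit y = 1 + 2x; both methods should recover theta ≈ [1, 2].
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(gradient_descent(X, y))   # ≈ [1. 2.]
print(normal_equation(X, y))    # ≈ [1. 2.]
```

Using `pinv` instead of `inv` is the standard way to cope with the non-invertibility mentioned above (redundant features or too few examples).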

Logistic Regression [15%]

Classification, Hypothesis Representation, Decision Boundary, Cost Function, Advanced Optimization, Multi-class Classification (One vs All), Problem of Overfitting, Regularization
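The hypothesis representation and regularized cost function listed above can be sketched as follows (a minimal illustration; the toy data is made up, and with all-zero weights the cost is exactly ln 2):

```python
import numpy as np

def sigmoid(z):
    """Logistic hypothesis: maps any real score into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y, lam=1.0):
    """Regularized cross-entropy cost; the intercept theta[0] is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    return (-(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
            + lam / (2 * m) * np.sum(theta[1:] ** 2))

# Two examples with a bias column; zero weights give h = 0.5 everywhere,
# so the cost equals ln 2 ≈ 0.6931.
X = np.array([[1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0])
print(cost(np.zeros(2), X, y))  # ≈ 0.6931
```

Skipping `theta[0]` in the penalty reflects the usual convention that regularization targets the feature weights, not the intercept.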

Neural Networks [20%]

Non-linear Hypothesis, Biological Neurons, Model representation, Intuition for Neural Networks, Multiclass
classification, Cost Function, Back Propagation Algorithm, Back Propagation Intuition, Weights initialization,
Neural Network Training
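The backpropagation intuition above can be made concrete with a gradient check on a tiny one-hidden-layer network (a sketch only: the weights are random, and squared-error loss is assumed for simplicity):

```python
import numpy as np

rng = np.random.default_rng(0)

# One hidden layer, sigmoid activations, squared-error loss, single example.
W1 = rng.normal(size=(3, 2)); b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3)); b2 = np.zeros(1)
x = np.array([0.5, -1.0]); y = np.array([1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    a1 = sigmoid(W1 @ x + b1)
    a2 = sigmoid(W2 @ a1 + b2)
    return a1, a2

def loss(W1, b1, W2, b2):
    _, a2 = forward(W1, b1, W2, b2)
    return 0.5 * np.sum((a2 - y) ** 2)

# Backpropagation: the chain rule applied layer by layer, output to input.
a1, a2 = forward(W1, b1, W2, b2)
d2 = (a2 - y) * a2 * (1 - a2)        # error signal at the output layer
d1 = (W2.T @ d2) * a1 * (1 - a1)     # error propagated back to the hidden layer
gW2 = np.outer(d2, a1)
gW1 = np.outer(d1, x)

# Verify one weight's gradient against a central finite difference.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
W1m = W1.copy(); W1m[0, 0] -= eps
numeric = (loss(W1p, b1, W2, b2) - loss(W1m, b1, W2, b2)) / (2 * eps)
print(abs(numeric - gW1[0, 0]) < 1e-6)  # True
```

The finite-difference check is also the standard way to debug a hand-written backpropagation implementation, and random (rather than zero) weight initialization is what breaks the symmetry mentioned in the topic list.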

Support Vector Machines [15%]

Optimization Objective, Large Margin Classifiers, Kernels, SVM practical considerations
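For intuition on the optimization objective and kernels listed above, a minimal sketch (the values and names are illustrative):

```python
import numpy as np

def hinge_loss(y, score):
    """Large-margin objective: zero only when the point is on the correct
    side of the boundary with functional margin at least 1 (y in {-1, +1})."""
    return max(0.0, 1.0 - y * score)

def rbf_kernel(x1, x2, sigma=1.0):
    """Gaussian (RBF) kernel: similarity 1 at identical points, decaying with distance."""
    return np.exp(-np.sum((x1 - x2) ** 2) / (2 * sigma ** 2))

a = np.array([1.0, 2.0])
print(hinge_loss(+1, 2.5))                  # 0.0 (comfortably outside the margin)
print(hinge_loss(+1, 0.5))                  # 0.5 (inside the margin, penalized)
print(rbf_kernel(a, a))                     # 1.0
```

The kernel lets the SVM behave as if the data were mapped into a higher-dimensional feature space without computing that mapping explicitly.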

Unsupervised learning [20%]

Unsupervised learning introduction, k-Means Algorithm, Optimization objective, Random Initialization, Choosing number of clusters
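The k-means topics above (random initialization, alternating optimization) can be sketched as follows (a minimal Lloyd's-algorithm illustration with made-up data):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Lloyd's algorithm: alternate point assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]  # random initialization
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)  # (m, k) distances
        labels = d.argmin(axis=1)                                 # assignment step
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])  # update step
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Two obvious clusters around (0, 0.5) and (10, 10.5).
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
centroids, labels = kmeans(X, 2)
print(sorted(centroids[:, 0]))  # [0.0, 10.0]
```

Because random initialization can land in a poor local optimum on harder data, the algorithm is typically restarted several times and the run with the lowest distortion is kept.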

Recommender Systems [10%]

Problem Formulation, Content based recommendations, Collaborative Filtering, Vectorization, Implementation details.

Reference Books:

1. Machine Learning, Tom M. Mitchell

2. Building Machine Learning Systems with Python, Richert & Coelho


Introduction and Basic Concepts, Supervised Learning Setup. Linear Regression.

Linear Algebra. Weighted Least Squares. Logistic Regression. Newton's Method.


Perceptron. Exponential Family. Generalized Linear Models

Probability, Gaussian Discriminant Analysis. Naive Bayes. Laplace Smoothing.

Laplace Smoothing. Support Vector Machines.

Python and Numpy, Support Vector Machines. Kernels.

Evaluation Metrics, Bias-Variance. Regularization. Feature / Model Selection.
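Newton's Method from the outline above can be sketched for the simplest case, a scalar root (the function names and the example equation are illustrative):

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson iteration: x <- x - f(x) / f'(x).

    Converges quadratically near a simple root, which is why the same
    idea (with the Hessian) is used to fit logistic regression quickly.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Solve x^2 - 2 = 0, i.e. compute sqrt(2), starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root)  # ≈ 1.41421356
```

In the logistic-regression setting, `f` becomes the gradient of the log-likelihood and `fprime` its Hessian, giving far fewer iterations than plain gradient descent at the cost of a matrix solve per step.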

S.No. Contents (Hours)
1. Introduction: Objective, scope and outcome of the course (1)
2. Preliminaries: What is machine learning; varieties of machine learning; learning input/output functions; bias; sample applications. Boolean functions and their classes, CNF, DNF, decision lists. Version spaces for learning, version graphs, learning as search of a version space, candidate elimination methods (10)
3. Neural Networks: Threshold logic units, linear machines, networks of threshold logic units, training of feed-forward networks by back propagation, neural networks vs. knowledge-based systems (6)
4. Statistical Learning: Background and general method, learning belief networks, nearest neighbor. Decision trees: supervised learning of univariate decision trees, network equivalents of decision trees, overfitting and evaluation (6)
5. Inductive Logic Programming: Notation and definitions, introducing recursive programs, inductive logic programming vs. decision tree induction (5)
6. Computational Learning Theory: Fundamental theorem, Vapnik-Chervonenkis dimension, linear dichotomies and capacity. Unsupervised learning: clustering methods based on Euclidean distance and probabilities, hierarchical clustering methods. Introduction to reinforcement and explanation-based learning (12)
UNIT-1

The ingredients of machine learning. Tasks: the problems that can be solved with machine learning; Models: the output of machine learning; Features: the workhorses of machine learning. Binary classification and related tasks: Classification, Scoring and ranking, Class probability estimation


UNIT-2

Beyond binary classification: Handling more than two classes, Regression, Unsupervised and descriptive learning. Concept learning: The hypothesis space, Paths through the hypothesis space, Beyond conjunctive concepts


UNIT-3

Tree models: Decision trees, Ranking and probability estimation trees, Tree learning as variance reduction. Rule models: Learning ordered rule lists, Learning unordered rule sets, Descriptive rule learning, First-order rule learning


UNIT-4

Linear models: The least-squares method, The perceptron: a heuristic learning algorithm for linear classifiers, Support vector machines, Obtaining probabilities from linear classifiers, Going beyond linearity with kernel methods. Distance Based Models: Introduction, Neighbours and exemplars, Nearest Neighbours classification, Distance Based Clustering, Hierarchical Clustering.
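The perceptron described above as "a heuristic learning algorithm for linear classifiers" can be sketched in a few lines (the AND-function data below is illustrative):

```python
import numpy as np

def perceptron(X, y, epochs=20):
    """Perceptron learning rule: on each mistake, nudge w toward y_i * x_i."""
    w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([np.ones((len(X), 1)), X])   # bias term as an extra input
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):               # labels must be in {-1, +1}
            if yi * (w @ xi) <= 0:              # misclassified (or on the boundary)
                w += yi * xi
                errors += 1
        if errors == 0:                         # converged: data linearly separated
            break
    return w

# Learn the AND function, which is linearly separable.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])
w = perceptron(X, y)
print(np.sign(np.hstack([np.ones((4, 1)), X]) @ w))  # [-1. -1. -1.  1.]
```

The convergence theorem guarantees termination only for linearly separable data, which is exactly the gap that the kernel methods mentioned above address.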


UNIT-5

Probabilistic models: The normal distribution and its geometric interpretations, Probabilistic models for categorical data, Discriminative learning by optimising conditional likelihood, Probabilistic models with hidden variables. Features: Kinds of feature, Feature transformations, Feature construction and selection. Model ensembles: Bagging and random forests, Boosting



UNIT-6

Dimensionality Reduction: Principal Component Analysis (PCA), Implementation and demonstration. Artificial Neural Networks: Introduction, Neural network representation, Appropriate problems for neural network learning, Multilayer networks and the back propagation algorithm.
