SCCI Digital Library and Forum

Introduction to Machine Learning (Fall 2020), MIT

All lectures below belong to the same course:

Course: Introduction to Machine Learning (Fall 2020)
Institute: MIT
Instructor: Prof. Leslie Kaelbling
Discipline: Applied Sciences

S#  Lecture
1.  Back-propagation through time – backwards pass
2.  Back-propagation through time – forward pass
3.  Back-propagation through time – weight updates
4.  Building a tree – greedy algorithm
5.  Demo example – RNNs
6.  Building a tree – minimum error splits
7.  Evaluating hypotheses – training set error
8.  Building a tree – pruning
9.  Evaluating learning algorithms – validation
10. Classification trees
11. Evaluating predictions – loss functions
12. Classification trees – impurity measures – entropy
13. Example of perceptron algorithm with polynomial basis transformations
14. Classification trees – impurity measures – Gini index
15. Feature representation – polynomial basis
16. CNNs – a specific illustrative example filter
17. Feature representation – transforming through-origin to not-through-origin
18. CNNs – backprop and gradient descent
19. Feature representation – strategies for dealing with varied data
20. CNNs – convolutional neural network layers
21. Gradient descent optimization – algorithm in multiple dimensions
22. CNNs – convolutional neural networks – intro
23. CNNs – max pooling
24. Gradient descent optimization – algorithm in one dimension
25. Gradient descent optimization – local optima