SCCI Digital Library and Forum
Evaluating learning algorithms – validation (M-I-T)
Course: Introduction to Machine Learning (Fall 2020) (M-I-T)
Discipline: Applied Sciences
Institute: MIT
Instructor(s): Prof. Leslie Kaelbling
Level: Undergraduate
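This lecture covers evaluating learning algorithms with a held-out validation set. As a minimal sketch of that idea (not code from the lecture itself), the example below splits a toy dataset into training and validation portions, fits a simple nearest-centroid classifier on the training part only, and reports accuracy on the untouched validation part; the classifier and data here are illustrative assumptions.

```python
import numpy as np

def holdout_split(X, y, frac=0.8, seed=0):
    # Shuffle the indices, then split into training and validation portions.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(frac * len(X))
    return X[idx[:cut]], y[idx[:cut]], X[idx[cut:]], y[idx[cut:]]

def fit_centroids(X, y):
    # Nearest-centroid classifier: one mean vector per class.
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict(classes, centroids, X):
    # Assign each point to the class with the closest centroid.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Toy data: two well-separated Gaussian blobs (purely illustrative).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

Xtr, ytr, Xva, yva = holdout_split(X, y)
classes, centroids = fit_centroids(Xtr, ytr)
# Validation accuracy estimates generalization, since these points
# were never seen during fitting.
val_acc = (predict(classes, centroids, Xva) == yva).mean()
```

The key point, matching the lecture title, is that the error on the validation set, not the training set, is what estimates how the learned hypothesis will perform on new data.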
Introduction to Machine Learning (Fall 2020) (M-I-T)
Back-propagation through time – backwards pass (M-I-T)
Back-propagation through time – forward pass (M-I-T)
Back-propagation through time – weight updates (M-I-T)
Building a tree – greedy algorithm (M-I-T)
Building a tree – minimum error splits (M-I-T)
Building a tree – pruning (M-I-T)
Classification trees (M-I-T)
Classification trees – impurity measures – entropy (M-I-T)
Classification trees – impurity measures – gini index (M-I-T)
CNNs – a specific illustrative example filter (M-I-T)
CNNs – backprop and gradient descent (M-I-T)
CNNs – convolutional neural network layers (M-I-T)
Demo example – RNNs (M-I-T)
Evaluating hypotheses – training set error (M-I-T)
Evaluating learning algorithms – validation (M-I-T)
Evaluating predictions – loss functions (M-I-T)
Example of perceptron algorithm with polynomial basis transformations (M-I-T)
Feature representation – polynomial basis (M-I-T)
Feature representation – transforming through-origin to not-through-origin (M-I-T)
Feature representation strategies for dealing with varied data (M-I-T)