MIT Introduction to Deep Learning 6.S191: Lecture 1
*New 2022 Version*
Foundations of Deep Learning
Lecturer: Alexander Amini
For all lectures, slides, and lab materials: http://introtodeeplearning.com/
Lecture Outline
0:00 – Introduction
6:35 – Course information
9:51 – Why deep learning?
12:30 – The perceptron
14:31 – Activation functions
17:03 – Perceptron example
20:25 – From perceptrons to neural networks
26:37 – Applying neural networks
29:18 – Loss functions
31:19 – Training and gradient descent
35:46 – Backpropagation
38:55 – Setting the learning rate
41:37 – Batched gradient descent
43:45 – Regularization: dropout and early stopping
47:58 – Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us at @MITDeepLearning on Twitter and Instagram to stay fully connected!