In Lecture 4 we progress from linear classifiers to fully-connected neural networks. We introduce the backpropagation algorithm for computing gradients and briefly discuss connections between artificial neural networks and biological neural networks.
Keywords: Neural networks, computational graphs, backpropagation, activation functions, biological neurons
Backpropagation
Backprop is a recursive application of the chain rule. Let's walk through an example.
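A minimal sketch of the idea, using the small expression f(x, y, z) = (x + y) * z (the concrete input values here are illustrative, not from the notes): a forward pass computes the value, then the chain rule is applied recursively from the output back to each input.

```python
# Chain rule on f(x, y, z) = (x + y) * z.
x, y, z = -2.0, 5.0, -4.0

# Forward pass: compute the intermediate value q and the output f.
q = x + y          # q = 3.0
f = q * z          # f = -12.0

# Backward pass: recursively apply the chain rule from f back to the inputs.
df_dq = z            # d(q*z)/dq = z
df_dz = q            # d(q*z)/dz = q
df_dx = df_dq * 1.0  # dq/dx = 1, so df/dx = df/dq * dq/dx
df_dy = df_dq * 1.0  # dq/dy = 1, so df/dy = df/dq * dq/dy

print(df_dx, df_dy, df_dz)  # -4.0 -4.0 3.0
```

Each local derivative is trivial on its own; backprop's job is just to multiply them together along the path from the output to each input.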
Patterns in backward flow
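Common gates have recognizable backward-pass behavior. As a hedged sketch (function names here are illustrative, not a fixed API): an add gate distributes the upstream gradient unchanged, a multiply gate swaps its inputs, and a max gate routes the gradient to the larger input only.

```python
def add_backward(dout):
    # Add gate: a "gradient distributor" -- passes dout to both inputs unchanged.
    return dout, dout

def mul_backward(x, y, dout):
    # Multiply gate: a "gradient switcher" -- each input's gradient is dout
    # times the *other* input.
    return dout * y, dout * x

def max_backward(x, y, dout):
    # Max gate: a "gradient router" -- the full gradient flows to the input
    # that achieved the max; the other input gets zero.
    return (dout, 0.0) if x > y else (0.0, dout)
```

For example, `mul_backward(3.0, -4.0, 1.0)` returns `(-4.0, 3.0)`: the gradient on each input is the other input's value.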
Gradients for vectorized code
Vectorized operations
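For vectorized code the same chain rule applies, but gradients become matrices. A useful sanity check, sketched below with arbitrary shapes: for a matrix multiply `D = W @ X`, the gradient with respect to each matrix must have the same shape as that matrix, which pins down the backward expressions.

```python
import numpy as np

# Backprop through a matrix multiply D = W @ X (shapes chosen arbitrarily).
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 10))
X = rng.standard_normal((10, 3))
D = W @ X                           # shape (5, 3)

dD = rng.standard_normal(D.shape)   # upstream gradient dL/dD, shape (5, 3)
dW = dD @ X.T                       # shape (5, 10), must match W
dX = W.T @ dD                       # shape (10, 3), must match X
```

Matching shapes is often the quickest way to recover these formulas: `dD @ X.T` is the only order of operands that produces something W-shaped.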
To be added: what is Jacobian
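Until that section is filled in, a rough sketch of the key practical point: the Jacobian dy/dx collects every partial derivative of a vector output with respect to a vector input, but for elementwise operations it is diagonal, so backprop never needs to form it explicitly.

```python
import numpy as np

# For an elementwise op like ReLU, the full Jacobian dy/dx is diagonal,
# so multiplying the upstream gradient by it reduces to an elementwise mask.
x = np.array([1.0, -2.0, 3.0])
y = np.maximum(0.0, x)            # forward pass: ReLU

dout = np.array([0.5, 0.5, 0.5])  # some upstream gradient dL/dy
dx = dout * (x > 0)               # dL/dx, no 3x3 Jacobian ever materialized
print(dx)                         # [0.5 0.  0.5]
```

In practice the Jacobian of a layer mapping a batch of large vectors can be enormous, which is why backprop implementations work with these implicit products instead.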
Modularized implementation
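The modular view treats each gate as an object with a forward and a backward method, caching whatever the backward pass will need. A minimal sketch (the class name and structure are illustrative):

```python
class MultiplyGate:
    """One gate as a module: forward computes the output and caches inputs,
    backward applies the chain rule given the upstream gradient."""

    def forward(self, x, y):
        self.x, self.y = x, y     # cache inputs for the backward pass
        return x * y

    def backward(self, dout):
        # Local gradients are (y, x); multiply each by the upstream gradient.
        return dout * self.y, dout * self.x

gate = MultiplyGate()
out = gate.forward(3.0, -4.0)     # -12.0
dx, dy = gate.backward(1.0)       # (-4.0, 3.0)
```

Chaining such modules, each one only needs its local gradient; the full network gradient emerges from calling `backward` in reverse order of the forward passes.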
Summary