
Computer Science/CS231n

09. CNN Architecture Intro

In Lecture 9 we discuss some common architectures for convolutional neural networks. We discuss architectures which performed well in the ImageNet challenges, including AlexNet, VGGNet, GoogLeNet, and ResNet, as well as other interesting models. Keywords: AlexNet, VGGNet, GoogLeNet, ResNet, Network in Network, Wide ResNet, ResNeXt, Stochastic Depth, DenseNet, FractalNet, SqueezeNet [link]
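The core idea ResNet adds is the identity shortcut. Below is a minimal NumPy sketch, not taken from the post itself, of a simplified residual block's forward pass; the fully-connected weights `W1` and `W2` are hypothetical stand-ins for the block's conv layers.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def residual_block_forward(x, W1, W2):
    """Forward pass of a simplified (fully-connected) residual block.

    A real ResNet block uses 3x3 convolutions plus batch norm; here two
    plain matrix multiplies stand in for them so the shortcut is easy to see.
    """
    h = relu(x @ W1)        # first transformation
    f = h @ W2              # second transformation (pre-activation)
    return relu(f + x)      # identity shortcut: output = F(x) + x

# toy usage: dimensions must match so that f + x is valid
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # batch of 4, feature dim 8
W1 = rng.standard_normal((8, 8)) * 0.1
W2 = rng.standard_normal((8, 8)) * 0.1
print(residual_block_forward(x, W1, W2).shape)  # (4, 8)
```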
07. Training Neural Networks(2), CNN Preview

Lecture 7 continues our discussion of practical issues for training neural networks. We discuss different update rules commonly used to optimize neural networks during training, as well as different strategies for regularizing large neural networks including dropout. We also discuss transfer learning and fine-tuning. Keywords: Optimization, momentum, Nesterov momentum, AdaGrad, RMSProp…
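Since this entry centers on parameter update rules, here is a minimal NumPy sketch (my own, not from the post) of the momentum and RMSProp updates; hyperparameter names like `lr`, `mu`, and `rho` are illustrative.

```python
import numpy as np

def sgd_momentum(w, dw, v, lr=1e-2, mu=0.9):
    """SGD with momentum: accumulate a velocity, then step along it."""
    v = mu * v - lr * dw
    return w + v, v

def rmsprop(w, dw, cache, lr=1e-3, rho=0.99, eps=1e-8):
    """RMSProp: scale the step by a moving average of squared gradients."""
    cache = rho * cache + (1 - rho) * dw * dw
    return w - lr * dw / (np.sqrt(cache) + eps), cache

# toy usage on a quadratic loss L(w) = 0.5 * ||w||^2, so dL/dw = w
w = np.array([1.0, -2.0])
v = np.zeros_like(w)
for _ in range(100):
    dw = w                      # gradient of the toy loss
    w, v = sgd_momentum(w, dw, v)
print(w)                        # close to the minimum at the origin
```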
06. Training Neural Networks(1)

In Lecture 6 we discuss many practical issues for training modern neural networks. We discuss different activation functions, the importance of data preprocessing and weight initialization, and batch normalization; we also cover some strategies for monitoring the learning process and choosing hyperparameters. Keywords: Activation functions, data preprocessing, weight initialization, batch normalization…
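As an illustration of batch normalization, one of the entry's keywords, here is a minimal NumPy sketch of the training-time forward pass; the variable names are mine, and the learnable scale and shift follow the usual gamma/beta formulation.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization over an (N, D) mini-batch.

    Normalize each feature to zero mean / unit variance across the batch,
    then apply a learnable scale (gamma) and shift (beta).
    """
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta

# toy usage: after normalization each column has ~zero mean, unit variance
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 4)) * 5 + 3
out = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(6), out.std(axis=0).round(3))
```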
05. Convolutional Neural Networks

In Lecture 5 we move from fully-connected neural networks to convolutional neural networks. We discuss some of the key historical milestones in the development of convolutional networks, including the perceptron, the neocognitron, LeNet, and AlexNet. We introduce convolution, pooling, and fully-connected layers which form the basis for modern convolutional networks. Keywords: Convolutional neural networks…
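Here is a minimal NumPy sketch of the convolution operation the lecture introduces: a naive single-channel, stride-1, no-padding cross-correlation. Real conv layers vectorize this and handle many channels and filters.

```python
import numpy as np

def conv2d_naive(x, w):
    """Naive valid convolution (cross-correlation) of a 2D input with one filter.

    Output size follows the usual formula (H - HH + 1, W - WW + 1)
    for stride 1 and no zero-padding.
    """
    H, W = x.shape
    HH, WW = w.shape
    out = np.zeros((H - HH + 1, W - WW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + HH, j:j + WW] * w)
    return out

# toy usage: a 3x3 edge-style filter slid over a 5x5 input
x = np.arange(25, dtype=float).reshape(5, 5)
w = np.array([[1., 0., -1.]] * 3)
print(conv2d_naive(x, w).shape)  # (3, 3)
```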
04. Introduction to Neural Networks

In Lecture 4 we progress from linear classifiers to fully-connected neural networks. We introduce the backpropagation algorithm for computing gradients and briefly discuss connections between artificial neural networks and biological neural networks. Keywords: Neural networks, computational graphs, backpropagation, activation functions, biological neurons [slides]

Backpropagation: Backprop is a rec…
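To make the chain rule concrete, here is a toy backprop example (mine, not the post's) on the computational graph f(x, y, z) = (x + y) * z: run the forward pass node by node, then multiply local gradients backward.

```python
# Backprop by hand on f(x, y, z) = (x + y) * z.
x, y, z = -2.0, 5.0, -4.0

# forward pass: build the graph node by node
q = x + y          # q = 3
f = q * z          # f = -12

# backward pass: chain rule from the output back to the inputs
df_df = 1.0                # gradient of f w.r.t. itself
df_dq = z * df_df          # multiply gate routes z to q's gradient
df_dz = q * df_df          # ...and q to z's gradient
df_dx = 1.0 * df_dq        # add gate distributes the gradient unchanged
df_dy = 1.0 * df_dq

print(df_dx, df_dy, df_dz)  # -4.0 -4.0 3.0
```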
03. Loss Functions and Optimization

Lecture 3 continues our discussion of linear classifiers. We introduce the idea of a loss function to quantify our unhappiness with a model's predictions, and discuss two commonly used loss functions for image classification: the multiclass SVM loss and the multinomial logistic regression loss. We introduce the idea of regularization as a mechanism to fight overfitting, with weight decay as a co…
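Here is a minimal NumPy sketch of the multiclass SVM loss for a single example with margin delta = 1; the function name and shapes are illustrative, not the post's code.

```python
import numpy as np

def svm_loss_single(scores, y, delta=1.0):
    """Multiclass SVM (hinge) loss for one example.

    scores: class scores, shape (C,); y: index of the correct class.
    Sums max(0, s_j - s_y + delta) over all incorrect classes j.
    """
    margins = np.maximum(0, scores - scores[y] + delta)
    margins[y] = 0                      # the correct class contributes nothing
    return margins.sum()

# toy usage: three classes, correct class is 0
scores = np.array([3.2, 5.1, -1.7])
print(svm_loss_single(scores, y=0))     # max(0, 5.1-3.2+1) + max(0, -1.7-3.2+1) = 2.9
```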
02. Image Classification

![](https://images.velog.io/images/och9854/post/51c265ca-91ad-4845-86fe-b90e6237546a/image.png)

Lecture 2 formalizes the problem of image classification. We discuss the inherent difficulties of image classification, and introduce data-driven approaches. We discuss two simple data-driven image classification algorithms: K-Nearest Neighbors and Linear Classifiers, and introduce the concept…
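As a sketch of the data-driven approach, below is a minimal 1-nearest-neighbor classifier with L1 distance in NumPy; the class layout is my own simplification of what the lecture describes.

```python
import numpy as np

class NearestNeighbor:
    """1-nearest-neighbor classifier with L1 (Manhattan) distance."""

    def train(self, X, y):
        # "training" is just memorizing the data: O(1)
        self.Xtr, self.ytr = X, y

    def predict(self, X):
        # for each test row, find the closest training row: O(N) per query
        preds = np.empty(X.shape[0], dtype=self.ytr.dtype)
        for i, x in enumerate(X):
            distances = np.abs(self.Xtr - x).sum(axis=1)
            preds[i] = self.ytr[np.argmin(distances)]
        return preds

# toy usage with 2-D points and two labels
nn = NearestNeighbor()
nn.train(np.array([[0., 0.], [10., 10.]]), np.array([0, 1]))
print(nn.predict(np.array([[1., 2.], [9., 8.]])))  # [0 1]
```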
01. CS231n OT, Link

[link]