CS231nAssignments post list (4)
개발로 하는 개발
Two Layer Net: biological neuron vs. mathematical model - input & output of a neuron - activation function (or non-linearity): takes a single number and performs a certain fixed mathematical operation on it - fully-connected layer: neurons between two adjacent layers are fully pairwise connected, but neurons within a single layer share no connections - each layer is usually a matrix multiplication followed by an acti..
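A minimal NumPy sketch of the forward pass that excerpt describes, assuming a ReLU non-linearity in the hidden layer; the names `W1, b1, W2, b2` and the shapes below are illustrative, not taken from the assignment code.

```python
import numpy as np

def two_layer_net_forward(X, W1, b1, W2, b2):
    """Forward pass of a two-layer fully-connected net.

    Each layer is a matrix multiplication followed by an activation
    function; here the hidden layer uses ReLU as the non-linearity.
    """
    hidden = np.maximum(0, X.dot(W1) + b1)  # fully-connected layer + ReLU
    scores = hidden.dot(W2) + b2            # raw class scores
    return scores

# Illustrative shapes: 5 samples, 4 input features, 10 hidden units, 3 classes
X = np.random.randn(5, 4)
W1, b1 = np.random.randn(4, 10) * 0.01, np.zeros(10)
W2, b2 = np.random.randn(10, 3) * 0.01, np.zeros(3)
print(two_layer_net_forward(X, W1, b1, W2, b2).shape)  # (5, 3)
```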
Softmax: another popular classifier (like SVM), a generalized version of the binary Logistic Regression classifier - Softmax function: p_k = e^{s_k} / Σ_j e^{s_j} - Loss function: cross-entropy - Numerical stability: exponentials -> very large numbers -> normalize the values - SVM vs Softmax: same score function Wx + b, different loss function. Softmax: calculates probabilities for each class, easier to interpret. Softmax p..
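A small sketch of the numerical-stability trick mentioned there: shifting each row of scores by its maximum before exponentiating leaves the softmax probabilities unchanged but avoids overflow. The function name and shapes are illustrative, not from the assignment code.

```python
import numpy as np

def softmax_cross_entropy(scores, y):
    """Cross-entropy loss for raw class scores.

    scores: (N, C) array of scores f = Wx + b
    y:      (N,) array of correct class indices
    """
    # Numerical stability: exp of large scores overflows, so shift
    # every row by its max; the resulting probabilities are unchanged.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp_scores = np.exp(shifted)
    probs = exp_scores / exp_scores.sum(axis=1, keepdims=True)
    # Average negative log-probability of the correct class.
    return -np.log(probs[np.arange(len(y)), y]).mean()

scores = np.array([[123.0, 456.0, 789.0]])  # large scores, no overflow
print(softmax_cross_entropy(scores, np.array([2])))
```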
KNN - space-inefficient: has to remember all the data in the training set - classifying is expensive: must calculate the distances to every point in the training set -> use SVM instead. SVM Linear Classification - uses a score function and a loss function: minimize the loss function with respect to the parameters of the score function. CIFAR-10: we have a training set of N = 50,000 images, each with D = 32 x 32 x ..
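A vectorized sketch of the linear SVM loss that post is describing, assuming the standard multiclass hinge loss over scores s = XW; the function name and shapes are illustrative, and the bias is folded into W here rather than kept separate.

```python
import numpy as np

def svm_loss(W, X, y, delta=1.0):
    """Multiclass SVM (hinge) loss for a linear classifier s = XW.

    W: (D, C) weights, X: (N, D) data, y: (N,) correct class indices.
    """
    scores = X.dot(W)                                  # (N, C) class scores
    correct = scores[np.arange(len(y)), y][:, None]    # score of the true class
    margins = np.maximum(0, scores - correct + delta)  # hinge margins
    margins[np.arange(len(y)), y] = 0                  # skip the correct class
    return margins.sum() / len(y)

# Illustrative CIFAR-10-like shapes: 5 images, D = 3072 pixels, 10 classes
X = np.random.randn(5, 3072)
W = np.random.randn(3072, 10) * 0.001
y = np.random.randint(0, 10, size=5)
print(svm_loss(W, X, y))
```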
- KNN (k-nearest neighbors) hyperparameters: k, and L1 or L2 (the distance metric). Basically, you are trying to figure out which region a point belongs to, and you determine this by calculating the distance between the test point and the training points. You take the k nearest points and decide by whichever label is the majority. There are two ways to calculate the distance, L1 and L2. In the assig..
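A small sketch of that k-NN decision rule on toy 2-D data, showing both the L1 and L2 distance choices and the majority vote over the k nearest training points; the function name and data are hypothetical.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, k=5, metric="L2"):
    """Predict the label of a single test point with k-NN.

    metric: "L1" (sum of absolute differences) or "L2" (Euclidean).
    """
    diff = X_train - x_test
    if metric == "L1":
        dists = np.abs(diff).sum(axis=1)
    else:
        dists = np.sqrt((diff ** 2).sum(axis=1))
    nearest = np.argsort(dists)[:k]       # indices of the k closest points
    votes = Counter(y_train[nearest])     # majority vote over their labels
    return votes.most_common(1)[0][0]

# Toy data: two clusters of 2-D points with labels 0 and 1
X_train = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5])
y_train = np.array([0] * 20 + [1] * 20)
print(knn_predict(X_train, y_train, np.array([4.5, 5.2]), k=5, metric="L1"))
```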