[CNN] Architecture
CNN( = ConvNet)
- sequence of layers
- each layer of a ConvNet transforms one volume of activations to another through a differentiable function
- one volume of activations = activation map = feature map
ReLU(nonlinear) layer : activates relevant responses
Fully-Connected Layer : each neuron in a layer will be connected to all the numbers in the previous volume
Pooling Layer : downsampling operation layer
Convolutional Layer : specially designed for ConvNet
* Multi-Layer Perceptron: fully-connected layer(s) + an activation function (ReLU, sigmoid, etc.)
* Difference between a Multi-Layer Perceptron and a CNN: convolutional and pooling layers are added
[(Conv-ReLU) * N - Pool ] * M - (FC -ReLU) * K - SoftMax
(N usually ~5, M > 10, 0 <= K <= 2)
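A minimal PyTorch sketch of this pattern with N = 2, M = 2, K = 1; the 32x32 RGB input and 10 output classes are assumptions chosen only for illustration:

```python
# Sketch of the [(Conv-ReLU)*N - Pool]*M - (FC-ReLU)*K - Softmax pattern.
# Input size (3, 32, 32) and 10 classes are illustrative assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(
    # [(Conv - ReLU) * 2 - Pool] * 2
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                        # 32x32 -> 16x16
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                        # 16x16 -> 8x8
    # flatten, then (FC - ReLU) * 1, then the final classifier layer
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
    nn.Linear(128, 10),                     # softmax is applied inside the loss (e.g. CrossEntropyLoss)
)

x = torch.randn(1, 3, 32, 32)               # one dummy image
print(model(x).shape)                        # torch.Size([1, 10])
```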
Convolutional Layer vs. FC Layer
CNN architecture
1. kernel
: filter
output size = i - k + 1 (i: input size, k: kernel size)
2. stride
: controls how far the filter moves at each step
output size = (i - k)/s + 1 (s: stride)
3. padding
output size = (i - k + 2p)/s + 1 (p: padding; see the sketch after this list)
4. pooling
: generalizes the features extracted by the convolutional filters
5. flatten
: turns the 2D feature maps into a single long 1D vector
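The output-size formulas above can be checked with a tiny helper; the function name and the example numbers are illustrative assumptions:

```python
# Sketch of the conv output-size formula: output = (i - k + 2p) / s + 1
def conv_output_size(i, k, s=1, p=0):
    """i: input size, k: kernel size, s: stride, p: padding."""
    return (i - k + 2 * p) // s + 1

print(conv_output_size(32, 3))               # 30 -> i - k + 1
print(conv_output_size(32, 3, s=2))          # 15 -> (i - k)/s + 1 (floored)
print(conv_output_size(32, 3, s=1, p=1))     # 32 -> "same" padding keeps the size
```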
Layers
1. Convolutional layer
: extracts features from an image
2. Pooling layer
: decreases the size of the convolved feature map
: Max pooling / Average pooling (see the pooling sketch after this list)
3. Fully Connected (FC) layer
: weights and biases
: the last few layers of a CNN architecture
: connects neurons between different layers
4. Dropout
: mask
: nullifies the contribution of some neurons towards the next layer
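A small NumPy sketch of 2x2 max vs. average pooling on a 4x4 feature map; the values are made up for illustration:

```python
import numpy as np

# 4x4 feature map with made-up values
fmap = np.array([[1, 3, 2, 0],
                 [4, 6, 1, 2],
                 [7, 2, 9, 4],
                 [1, 5, 3, 8]], dtype=float)

# split into non-overlapping 2x2 windows: shape (2, 2, 2, 2)
windows = fmap.reshape(2, 2, 2, 2).swapaxes(1, 2)

max_pool = windows.max(axis=(2, 3))          # keeps the strongest response per window
avg_pool = windows.mean(axis=(2, 3))         # averages each window

print(max_pool)   # [[6. 2.] [7. 9.]]
print(avg_pool)   # [[3.5  1.25] [3.75 6.  ]]
```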
Activation function
: determines whether a neuron should be activated or not
: Sigmoid, tanh, Softmax, ReLU
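Minimal NumPy versions of the four activation functions listed above, as a sketch rather than a library implementation; the example scores are made up:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

def softmax(x):
    e = np.exp(x - np.max(x))                # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([-1.0, 0.0, 2.0])          # made-up scores for illustration
print(sigmoid(scores), relu(scores))
print(softmax(scores), softmax(scores).sum())  # probabilities sum to 1
```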