29. Derivatives of Activation Functions
Derivative notation: besides dg(z)/dz, the derivative is also written g'(z).
1. Sigmoid g(z) = 1 / (1 + e**(-z))
- Derivative: g'(z) = g(z) * (1 - g(z))
2. tanh g(z) = (e**z - e**(-z)) / (e**z + e**(-z))
- Derivative: g'(z) = 1 - tanh(z)**2; writing a = g(z), this is 1 - a**2
3. ReLU g(z) = max(0, z)
- Derivative: g'(z) = 0 if z < 0, 1 if z >= 0 (strictly undefined at z = 0, but fixing it to either 0 or 1 works in practice); see the sketch after this list.
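
As a quick sanity check on these formulas, here is a minimal NumPy sketch (not from the lecture itself; the function names are my own) implementing the three activations and their derivatives, plus a centered finite-difference test on the sigmoid:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def sigmoid_prime(z):
    a = sigmoid(z)
    return a * (1 - a)              # g'(z) = g(z) * (1 - g(z))

def tanh_prime(z):
    a = np.tanh(z)
    return 1 - a ** 2               # g'(z) = 1 - a**2

def relu(z):
    return np.maximum(0, z)

def relu_prime(z):
    return (z >= 0).astype(float)   # convention: treat g'(0) as 1

# Verify sigmoid_prime against a centered finite difference.
z = np.array([-2.0, -0.5, 0.5, 2.0])
eps = 1e-6
approx = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)
print(np.allclose(sigmoid_prime(z), approx))  # True
```

The same finite-difference check works for tanh_prime; for relu_prime it fails only near z = 0, where the derivative is undefined anyway.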