17. Understanding Mini-batch Gradient Descent

Batch Gradient Descent : all m examples (the entire training set) per update

Stochastic Gradient Descent : 1 example per update

- Stochastic Gradient Descent : picks 1 example at random each step, hence "stochastic"

 

Mini-batch Gradient Descent : a batch size somewhere between 1 and m, chosen appropriately.

- faster than Stochastic (vectorization still works across each mini-batch)

- cheaper per iteration than Batch : it makes progress without processing the whole training set first (see the sketch below)
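
A minimal NumPy sketch of the shared update step. All three variants use the same rule and differ only in how many examples the batch holds; the linear model and names like gradient_descent_step are illustrative choices, not from the lecture:

```python
import numpy as np

def gradient_descent_step(X_batch, y_batch, w, b, lr=0.01):
    """One parameter update on a single batch.
    Batch GD passes all m examples, SGD passes 1,
    mini-batch GD passes something in between."""
    m = X_batch.shape[0]
    preds = X_batch @ w + b        # simple linear model (illustrative)
    error = preds - y_batch
    dw = (X_batch.T @ error) / m   # gradient of (1/2m)*sum(error^2) w.r.t. w
    db = error.mean()              # gradient w.r.t. b
    return w - lr * dw, b - lr * db
```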

 

Tips:

1. small training set : use Batch Gradient Descent

- fewer than 2,000 examples

 

2. typical mini-batch sizes : 64, 128, 256, 512 (powers of 2)

- depends on your compute power

- make sure each mini-batch fits in CPU/GPU memory
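
A sketch of how the shuffle-and-partition step might look in NumPy, assuming examples are stored row-wise; random_mini_batches is an illustrative helper name:

```python
import numpy as np

def random_mini_batches(X, y, batch_size=64, seed=0):
    """Shuffle the training set, then split it into mini-batches
    of the chosen size (the last one may be smaller)."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(X.shape[0])
    X_s, y_s = X[perm], y[perm]
    return [(X_s[k:k + batch_size], y_s[k:k + batch_size])
            for k in range(0, X.shape[0], batch_size)]

# one epoch, reusing gradient_descent_step from the sketch above:
# for X_mb, y_mb in random_mini_batches(X, y, batch_size=64):
#     w, b = gradient_descent_step(X_mb, y_mb, w, b)
```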