What's the right batch size in deep learning training?
06-23-2021 07:57 AM
Using Ganglia you can monitor how busy the GPU(s) are. Increasing the batch size increases that utilization. Larger batches also produce more accurate gradient estimates, which improves each update to the model (up to a point). That in turn can allow training with a higher learning rate, so the model reaches the point where it stops improving more quickly.
https://databricks.com/blog/2019/08/15/how-not-to-scale-deep-learning-in-6-easy-steps.html
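As a minimal sketch of the learning-rate idea above (not from the linked post, and using hypothetical numbers): when you increase the batch size, a common heuristic is to scale the learning rate linearly from a known-good baseline.

```python
import torch
import torch.nn as nn

# Hypothetical baseline: batch size 256 trained well with learning rate 0.1.
base_batch_size = 256
base_lr = 0.1

# Larger batch chosen to keep the GPU busier.
batch_size = 1024

# Linear scaling rule: grow the learning rate in proportion to the batch size.
lr = base_lr * (batch_size / base_batch_size)

model = nn.Linear(100, 10)  # stand-in model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=lr)
print(f"batch size {batch_size} -> learning rate {lr:.3f}")
```

Whether linear scaling holds depends on the model and how far you push the batch size, so treat the scaled value as a starting point to tune, not a guarantee.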
Labels:
- Best practice
- Deep learning
- GPU