Make Stochastic Gradient Descent Fast Again (Ep. 113)

Wednesday Jul 22, 2020


There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out! This approach does not generalize well.
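
As a rough illustration of the general idea hinted at by the references below (my own sketch, not code from the episode): instead of SGD's fixed learning rate, one can use a local second-order Taylor expansion of the loss along the negative gradient to pick the step size at each iteration. The quadratic toy loss and all variable names here are illustrative assumptions.

import numpy as np

# Toy quadratic loss f(w) = 0.5 w^T A w - b^T w (hypothetical example)
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
A = A @ A.T + 5 * np.eye(5)   # symmetric positive definite
b = rng.normal(size=5)

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def grad(w):
    return A @ w - b

w = rng.normal(size=5)
for step in range(20):
    g = grad(w)
    # Second-order Taylor expansion along -g:
    #   f(w - t g) ~= f(w) - t ||g||^2 + 0.5 t^2 g^T H g
    # Minimizing over t gives the step size below (here H = A exactly).
    Hg = A @ g                # Hessian-vector product
    t = (g @ g) / (g @ Hg)
    w = w - t * g
    print(step, float(loss(w)))

On a real model the Hessian-vector product would be estimated (e.g. via automatic differentiation) and the expansion is only a local approximation, which is one reason such step-size rules may not generalize well in practice.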

Join our Discord channel and chat with us.

 

References

  • More descent, less gradient
  • Taylor Series

 
