Activate deep learning neurons faster with Dynamic RELU (ep. 101)

Wednesday Apr 01, 2020


In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU).
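For a quick refresher, ReLU clamps negative inputs to zero and passes positive ones through unchanged, ReLU(x) = max(0, x):

import torch

# ReLU(x) = max(0, x): negatives become zero, positives pass through.
x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(torch.relu(x))  # tensor([0.0000, 0.0000, 0.0000, 1.5000])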
While there are several flavors of ReLU in the literature, in this episode I speak about Dynamic ReLU, a very interesting approach that keeps computational complexity low while improving performance quite consistently. A sketch of the idea follows below.
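To make the idea concrete, here is a minimal PyTorch sketch (not the authors' code) of the channel-wise variant the paper calls DY-ReLU-B: a small "hyper function" pools the input globally and predicts, for every channel, the slopes and intercepts of k linear pieces, and the activation returns the max over those pieces, y = max_k(a_k(x) * x + b_k(x)). The reduction ratio, the residual scaling factors and the initialisation constants follow my reading of the paper; treat the exact details as assumptions.

import torch
import torch.nn as nn

class DyReLUB(nn.Module):
    """Minimal sketch of Dynamic ReLU, variant B (arXiv:2003.10027).

    y = max_k (a_k(x) * x + b_k(x)), with the 2*k coefficients per channel
    predicted from a global summary of the input.
    """

    def __init__(self, channels: int, reduction: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Hyper function: squeeze (global average pool) feeds two FC layers
        # that output 2*k coefficients for each of the C channels.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k * channels),
        )
        # Residual scaling and initial values as described in the paper:
        # the first slope a_1 starts at 1, all other coefficients at 0.
        self.register_buffer("lambdas", torch.tensor([1.0] * k + [0.5] * k))
        self.register_buffer("init_v", torch.tensor([1.0] + [0.0] * (2 * k - 1)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        context = x.mean(dim=(2, 3))                     # (N, C) squeeze
        theta = 2 * torch.sigmoid(self.fc(context)) - 1  # residuals in [-1, 1]
        theta = theta.view(n, c, 2 * self.k)
        coeffs = self.init_v + self.lambdas * theta      # (N, C, 2k)
        a = coeffs[..., : self.k, None, None]            # slopes  (N, C, k, 1, 1)
        b = coeffs[..., self.k :, None, None]            # offsets (N, C, k, 1, 1)
        # Broadcast x to (N, C, 1, H, W) and take the max over the k pieces.
        return (a * x.unsqueeze(2) + b).max(dim=2).values

The intended usage is drop-in: replace nn.ReLU() after, say, a 64-channel convolution with DyReLUB(64). The only extra cost is the two small fully connected layers of the hyper function, which is why the approach keeps computational complexity low.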

This episode is supported by pryml.io. At pryml we help companies share confidential data. Visit our website.

Don't forget to join us on the Discord channel to propose new episodes or discuss previous ones.

References

Chen et al., "Dynamic ReLU", arXiv:2003.10027. https://arxiv.org/abs/2003.10027

