
Data Science at Home

2020-04

Episodes

Sunday Apr 19, 2020

Why average can get your predictions very wrong (ep. 102)

Whenever people reason about the probability of events, they tend to consider an average value between two extremes. In this episode I explain, with a numerical example, why this way of approximating is wrong and dangerous. We are moving our community to Slack. See you there!
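The pitfall described here is sometimes called the "flaw of averages": when the outcome depends nonlinearly on an uncertain input, evaluating the model at the average input is not the same as averaging the outcomes. A minimal sketch of that gap (the quadratic cost function and the two-point demand distribution are illustrative assumptions, not the episode's numbers):

```python
# Flaw of averages: for a nonlinear f, f(mean(x)) != mean(f(x)).
# Illustrative scenario: cost grows quadratically with demand, and
# demand is either 0 or 100 with equal probability.

def cost(demand):
    return demand ** 2  # convex cost


demands = [0, 100]
mean_demand = sum(demands) / len(demands)               # 50
cost_at_mean = cost(mean_demand)                        # 2500
mean_cost = sum(cost(d) for d in demands) / len(demands)  # 5000

# Planning around the "average" scenario underestimates the true
# expected cost by a factor of two here (Jensen's inequality:
# mean_cost >= cost_at_mean for any convex cost).
```

The two extremes are equally likely, yet reasoning from the averaged input gives a cost of 2500 while the true expected cost is 5000.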

Wednesday Apr 01, 2020

Activate deep learning neurons faster with Dynamic RELU (ep. 101)

In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU). While several flavors of ReLU exist in the literature, in this episode I speak about a very interesting approach that keeps computational complexity low while improving performance quite consistently.

This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.

Don't forget to join us on the Discord channel to propose new episodes or discuss previous ones.

References

Dynamic ReLU: https://arxiv.org/abs/2003.10027
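The core idea of Dynamic ReLU (Chen et al., arXiv:2003.10027) is that instead of a fixed max(0, x), the activation computes max over K linear pieces whose coefficients are produced from the input itself by a tiny hyper-function. A minimal NumPy sketch of that mechanism; the shapes, K = 2, and the random weights are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 2  # number of input-dependent linear pieces


def static_relu(x):
    return np.maximum(0.0, x)  # plain ReLU: max(0, x)


def dynamic_relu(x, W1, W2):
    # Global context: average over the feature vector.
    ctx = x.mean(keepdims=True)
    h = np.maximum(0.0, W1 @ ctx)  # small hidden layer of the hyper-function
    theta = np.tanh(W2 @ h)        # 2*K coefficient residuals in [-1, 1]
    # Base coefficients (1, 0) and (0, 0): with zero residuals this
    # reduces exactly to the static ReLU max(x, 0).
    a = np.array([1.0, 0.0]) + 0.5 * theta[:K]
    b = np.array([0.0, 0.0]) + 0.5 * theta[K:]
    # Elementwise max over the K input-dependent linear pieces.
    return np.max(a[:, None] * x[None, :] + b[:, None], axis=0)


x = rng.normal(size=8)
W1 = rng.normal(size=(4, 1))      # context -> hidden
W2 = rng.normal(size=(2 * K, 4))  # hidden -> coefficient residuals
y = dynamic_relu(x, W1, W2)
```

The extra cost is only the tiny hyper-function, and with the residuals at zero the activation collapses to the ordinary ReLU, which is why it can act as a near drop-in replacement in existing networks.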

Read more
  • Download

Copyright 2021 datascienceathome.com All rights reserved.

Podcast Powered By Podbean