
Data Science at Home

2020-05

Episodes

Wednesday May 20, 2020

Compressing deep learning models: distillation (Ep.104)


Running large deep learning models on limited hardware or edge devices is often prohibitive. However, there are methods that compress large models by orders of magnitude while maintaining similar accuracy at inference time. In this episode I explain one of the first such methods: knowledge distillation.

Come join us on Slack!

References
  • Distilling the Knowledge in a Neural Network: https://arxiv.org/abs/1503.02531
  • Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks: https://arxiv.org/abs/2004.05937
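The core idea from Hinton et al. (2015) can be sketched as a loss function: the student is trained against the teacher's temperature-softened output distribution (the "soft targets") in addition to the true labels. Below is a minimal NumPy sketch of that combined loss; the function names, the temperature T, and the mixing weight alpha are illustrative choices, not code from the episode.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's relative confidences across wrong classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between softened teacher and student
    # distributions, scaled by T^2 as in the original paper.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    # Hard-target term: ordinary cross-entropy with the ground-truth labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    # Blend the two terms; alpha weights how much the student imitates
    # the teacher versus fitting the labels directly.
    return np.mean(alpha * (T ** 2) * soft + (1 - alpha) * hard)
```

In practice the student is a much smaller network trained by minimizing this loss with gradient descent; the sketch only shows how the objective is assembled.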


Friday May 08, 2020

Pandemics and the risks of collecting data (Ep. 103)


Covid-19 is an emergency. True. Let's just make sure we are not setting up another emergency, one of privacy violations, for when this one is over.

Join our new Slack channel.

This episode is supported by Proton. You can check them out at protonmail.com or protonvpn.com.


Copyright 2021 datascienceathome.com All rights reserved.
