Data Science at Home
Episodes
Tuesday Aug 06, 2019
Training neural networks faster without GPU (Ep. 71)
Training neural networks faster usually calls for powerful GPUs. In this episode I explain an interesting method from a group of researchers at Google Brain, who train neural networks faster by echoing data through the input pipeline, squeezing more work out of the hardware instead of leaving it idle.
Enjoy the show!
References
Faster Neural Network Training with Data Echoing https://arxiv.org/abs/1907.05550
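As a toy illustration of the idea in the paper above (my own sketch, not the authors' implementation), data echoing repeats each batch produced by the input pipeline, so the accelerator is not left idle waiting for data:

```python
def data_echoing(batches, echo_factor=2):
    """Repeat each upstream batch `echo_factor` times.

    Toy sketch: when reading/decoding/augmentation is the bottleneck,
    echoing already-prepared batches keeps the accelerator busy.
    """
    for batch in batches:
        for _ in range(echo_factor):
            yield batch

# Hypothetical usage with any iterable of training batches:
for batch in data_echoing(["batch-0", "batch-1"], echo_factor=2):
    print(batch)  # batch-0, batch-0, batch-1, batch-1
```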
Tuesday Jul 23, 2019
Validate neural networks without data with Dr. Charles Martin (Ep. 70)
In this episode, I am with Dr. Charles Martin from Calculation Consulting, a machine learning and data science consulting company based in San Francisco. We speak about the nuts and bolts of deep neural networks and some impressive findings about the way they work.
The questions that Charles answers in the show are essentially two:
Why is regularisation in deep learning seemingly quite different from regularisation in other areas of ML?
How can we dominate DNNs in a theoretically principled way?
References
The WeightWatcher tool for predicting the accuracy of Deep Neural Networks https://github.com/CalculatedContent/WeightWatcher
Slack channel https://weightwatcherai.slack.com/
Dr. Charles Martin Blog http://calculatedcontent.com and channel https://www.youtube.com/c/calculationconsulting
Implicit Self-Regularization in Deep Neural Networks: Evidence from Random Matrix Theory and Implications for Learning - Charles H. Martin, Michael W. Mahoney
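For a quick taste of the tool, here is a minimal sketch of how WeightWatcher can be invoked, following the project's README at the time of writing (treat the exact API as an assumption, as it may differ between versions):

```python
# pip install weightwatcher
import torchvision.models as models
import weightwatcher as ww

# Any trained PyTorch (or Keras) model can be analyzed; a torchvision
# model is used here purely as a stand-in.
model = models.vgg11(weights="IMAGENET1K_V1")

watcher = ww.WeightWatcher(model=model)
details = watcher.analyze()             # per-layer spectral metrics (power-law fits)
summary = watcher.get_summary(details)  # aggregate quality estimate, no test data needed
print(summary)
```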
Tuesday Jul 02, 2019
Episode 67: Classic Computer Science Problems in Python
Today I am with David Kopec, author of Classic Computer Science Problems in Python, published by Manning Publications.
His book deepens your knowledge of problem-solving techniques from the realm of computer science by challenging you with interesting and realistic scenarios, exercises, and of course algorithms. There are examples covering the major topics any data scientist should be familiar with, such as search, clustering, graphs, and much more.
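As a flavor of the style, here is a minimal example of one such classic problem, binary search (my own sketch, not code from the book):

```python
from typing import Sequence

def binary_search(items: Sequence[int], key: int) -> int:
    """Return the index of `key` in the sorted `items`, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] < key:
            low = mid + 1       # key lies in the upper half
        elif items[mid] > key:
            high = mid - 1      # key lies in the lower half
        else:
            return mid
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```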
Get the book from https://www.manning.com/books/classic-computer-science-problems-in-python and use coupon code poddatascienceathome19 to get a 40% discount.
References
Twitter https://twitter.com/davekopec
GitHub https://github.com/davecom
classicproblems.com
Tuesday May 21, 2019
Episode 61: The 4 best use cases of entropy in machine learning
It all starts with physics. The entropy of an isolated system never decreases… Everyone, at some point in school, learned this in physics class. What does this have to do with machine learning? To find out, listen to the show.
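As a tiny companion example (mine, not from the episode), here is Shannon entropy, the information-theoretic counterpart used all over machine learning:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                  # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
```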
References
Entropy in machine learning https://amethix.com/entropy-in-machine-learning/
Tuesday May 07, 2019
Episode 59: How to fool a smart camera with deep learning
In this episode I met three crazy researchers from KULeuven (Belgium) who found a method to fool surveillance cameras and stay hidden just by holding up a special t-shirt. We discussed the technique they used and some consequences of their findings.
They published their paper on Arxiv and made their source code available at https://gitlab.com/EAVISE/adversarial-yolo
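For intuition, here is a heavily simplified sketch of the patch-optimization idea, written against an image classifier rather than the authors' YOLO person detector (hypothetical toy code; their real implementation is in the GitLab repo above):

```python
import torch
import torchvision

# Stand-in model: a pretrained classifier instead of a person detector.
model = torchvision.models.resnet18(weights="IMAGENET1K_V1").eval()
for p in model.parameters():
    p.requires_grad_(False)

images = torch.rand(4, 3, 224, 224)                 # frames standing in for camera input
patch = torch.rand(3, 64, 64, requires_grad=True)   # the learnable adversarial patch
optimizer = torch.optim.Adam([patch], lr=0.01)

target_class = 0  # hypothetical index of the class whose score we suppress

for step in range(100):
    x = images.clone()
    x[:, :, 80:144, 80:144] = patch.clamp(0, 1)  # paste the patch onto every frame
    logits = model(x)
    loss = logits[:, target_class].mean()        # drive the target score down
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```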
Enjoy the show!
References
Fooling automated surveillance cameras: adversarial patches to attack person detection - Simen Thys, Wiebe Van Ranst, Toon Goedemé
EAVISE Research Group, KULeuven (Belgium) https://iiw.kuleuven.be/onderzoek/eavise
Tuesday Apr 30, 2019
Episode 58: There is physics in deep learning!
There is a connection between gradient-descent-based optimizers and the dynamics of damped harmonic oscillators. What does that mean? It means we now have a better theory for optimization algorithms. In this episode I explain how all this works.
All the formulas I mention in the episode can be found in the post "The physics of optimization algorithms".
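As a minimal numerical sketch of the connection (mine, far simpler than the post's derivation), heavy-ball gradient descent on a quadratic behaves exactly like a discretized damped harmonic oscillator:

```python
# Momentum ("heavy-ball") gradient descent on f(x) = 0.5 * k * x^2.
# With these underdamped settings the iterates overshoot the minimum
# and oscillate with decaying amplitude before settling at 0.
k, lr, momentum = 1.0, 0.1, 0.9
x, v = 5.0, 0.0
trajectory = []
for _ in range(100):
    grad = k * x                 # gradient of the quadratic potential
    v = momentum * v - lr * grad
    x = x + v
    trajectory.append(x)

print(trajectory[:10])  # oscillates around 0, amplitude decaying over time
```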
Enjoy the show.
Tuesday Apr 23, 2019
Episode 57: Neural networks with infinite layers
How are differential equations related to neural networks? What are the benefits of re-thinking a neural network as a differential equation engine? In this episode we explain all this and provide some material that is worth studying. Enjoy the show!
[Figure: a residual block]
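For a taste of the connection (a toy sketch of my own): stacking residual blocks of the form x <- x + h * f(x) is exactly a forward-Euler integration of the ODE dx/dt = f(x), and letting the step size shrink gives the "infinite layers" view:

```python
import numpy as np

def f(x, t):
    """Hypothetical vector field playing the role of a residual branch."""
    return -0.5 * x

x = np.ones(4)
h = 0.1                  # step size, roughly the inverse of network depth
for step in range(50):   # 50 residual "layers" = 50 Euler steps to t = 5
    x = x + h * f(x, step * h)

print(x)  # close to exp(-0.5 * 5) ~ 0.082, the exact ODE solution
```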
References
[1] K. He, et al., “Deep Residual Learning for Image Recognition”, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pages 770-778, 2016
[2] S. Hochreiter, et al., “Long short-term memory”, Neural Computation 9(8), pages 1735-1780, 1997.
[3] Q. Liao, et al., “Bridging the gaps between residual learning, recurrent neural networks and visual cortex”, arXiv preprint, arXiv:1604.03640, 2016.
[4] Y. Lu, et al., “Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equation”, Proceedings of the 35th International Conference on Machine Learning (ICML), Stockholm, Sweden, 2018.
[5] T. Q. Chen, et al., “Neural Ordinary Differential Equations”, Advances in Neural Information Processing Systems 31, pages 6571-6583, 2018.
Tuesday Sep 04, 2018
Episode 46: why do machine learning models fail? (Part 2)
In this episode I continue the conversation from the previous one, about failing machine learning models.
When data scientists have access to the distributions of the training and testing datasets, it becomes relatively easy to assess whether a model will perform equally well on both. But what happens with private datasets, where no access to the data can be granted?
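For the "easy" case with full data access, a common check (my sketch, unrelated to fitchain's approach) is a two-sample test per feature:

```python
import numpy as np
from scipy.stats import ks_2samp

# Hypothetical feature drawn from the training and testing datasets.
rng = np.random.default_rng(0)
train_feature = rng.normal(loc=0.0, scale=1.0, size=1000)
test_feature = rng.normal(loc=0.3, scale=1.0, size=1000)  # slightly shifted

# Two-sample Kolmogorov-Smirnov test: a small p-value signals that the
# two samples are unlikely to come from the same distribution, so the
# model's test performance may not match its training performance.
statistic, p_value = ks_2samp(train_feature, test_feature)
print(statistic, p_value)
```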
At fitchain we might have an answer to this fundamental problem.
Thursday Jul 19, 2018
Episode 39: What is L1-norm and L2-norm?
In this episode I explain the differences between L1 and L2 regularization, which you will find in the function being minimized by basically any machine learning model.
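A quick refresher in code (my own sketch):

```python
import numpy as np

w = np.array([0.5, -1.2, 3.0])   # model weights

l1 = np.sum(np.abs(w))           # L1 penalty: promotes sparse weights (lasso)
l2 = np.sum(w ** 2)              # squared L2 penalty: promotes small, diffuse weights (ridge)

lam = 0.01                       # regularization strength
data_loss = 0.42                 # placeholder for the data-fit term
print(data_loss + lam * l1)      # L1-regularized objective
print(data_loss + lam * l2)      # L2-regularized objective
```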
Tuesday Nov 21, 2017
Episode 30: Neural networks and genetic evolution: an unfeasible approach
Despite what some researchers claim about genetic evolution, in this episode we give a realistic view of the field.