Data Science at Home
Episodes
Tuesday Aug 27, 2019
[RB] Validate neural networks without data with Dr. Charles Martin (Ep. 74)
In this episode, I am with Dr. Charles Martin from Calculation Consulting, a machine learning and data science consulting company based in San Francisco. We speak about the nuts and bolts of deep neural networks and some impressive findings about the way they work.
The questions that Charles answers in the show are essentially two:
Why does regularisation in deep learning seem quite different from regularisation in other areas of ML?
How can we analyse DNNs in a theoretically principled way?
References
The WeightWatcher tool for predicting the accuracy of Deep Neural Networks https://github.com/CalculatedContent/WeightWatcher
Slack channel https://weightwatcherai.slack.com/
Dr. Charles Martin Blog http://calculatedcontent.com and channel https://www.youtube.com/c/calculationconsulting
Implicit Self-Regularization in Deep Neural Networks: Evidence from Random Matrix Theory and Implications for Learning - Charles H. Martin, Michael W. Mahoney
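For listeners who want to try this at home, here is a minimal sketch of pointing the WeightWatcher tool at a trained model: it estimates generalisation quality from the spectra of the weight matrices alone, so no test data is needed. The calls follow the project README around the time of this episode and may have changed in later releases; the torchvision model is just a stand-in.

```python
import weightwatcher as ww
import torchvision.models as models

# any trained PyTorch (or Keras) model works; VGG is just a stand-in
model = models.vgg19_bn(pretrained=True)

watcher = ww.WeightWatcher(model=model)
details = watcher.analyze()              # per-layer spectral metrics, no data needed
summary = watcher.get_summary(details)   # e.g. the average power-law exponent alpha

# in Martin & Mahoney's work, smaller alpha tends to indicate better generalisation
print(summary)
```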

Wednesday Aug 14, 2019
Waterfall or Agile? The best methodology for AI and machine learning (Ep. 72)
The two most widely adopted software development models in modern project management are, without any doubt, the Waterfall methodology and the Agile methodology. In this episode I compare the two and explain what I believe is the best choice for your machine learning project.
An interesting post to read (mentioned in the episode) is How businesses can scale Artificial Intelligence & Machine Learning https://amethix.com/how-businesses-can-scale-artificial-intelligence-machine-learning/

Tuesday Aug 06, 2019
Training neural networks faster without GPU (Ep. 71)
Training neural networks faster usually means using powerful GPUs. In this episode I explain an interesting method from a group of researchers at Google Brain, who speed up training by tailoring the input pipeline to the hardware: the early pipeline stages (I/O, decoding, augmentation) are often the bottleneck, so their output is reused, or "echoed", to keep the accelerator busy.
Enjoy the show!
References
Faster Neural Network Training with Data Echoing https://arxiv.org/abs/1907.05550
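The core of the method is easy to sketch. Below is a toy, framework-agnostic version of data echoing; the paper works inside TensorFlow's input pipeline and also recommends re-shuffling after the echo stage, which this sketch omits. Each batch produced by the slow upstream stages is simply reused a few times downstream.

```python
def data_echoing(pipeline, echo_factor=2):
    """Yield each batch from a slow upstream pipeline `echo_factor` times,
    trading a little data freshness for fewer expensive upstream steps."""
    for batch in pipeline:
        for _ in range(echo_factor):
            yield batch

# toy usage: pretend each string is an expensively preprocessed batch
slow_pipeline = (f"batch_{i}" for i in range(3))
for batch in data_echoing(slow_pipeline, echo_factor=2):
    print(batch)  # batch_0, batch_0, batch_1, batch_1, batch_2, batch_2
```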

Tuesday Jul 23, 2019
Validate neural networks without data with Dr. Charles Martin (Ep. 70)
In this episode, I am with Dr. Charles Martin from Calculation Consulting, a machine learning and data science consulting company based in San Francisco. We speak about the nuts and bolts of deep neural networks and some impressive findings about the way they work.
The questions that Charles answers in the show are essentially two:
Why does regularisation in deep learning seem quite different from regularisation in other areas of ML?
How can we analyse DNNs in a theoretically principled way?
References
The WeightWatcher tool for predicting the accuracy of Deep Neural Networks https://github.com/CalculatedContent/WeightWatcher
Slack channel https://weightwatcherai.slack.com/
Dr. Charles Martin Blog http://calculatedcontent.com and channel https://www.youtube.com/c/calculationconsulting
Implicit Self-Regularization in Deep Neural Networks: Evidence from Random Matrix Theory and Implications for Learning - Charles H. Martin, Michael W. Mahoney

Tuesday Jul 16, 2019
Complex video analysis made easy with Videoflow (Ep. 69)
In this episode I am with Jadiel de Armas, senior software engineer at Disney and author of Videoflow, a Python framework that facilitates the quick development of complex video analysis applications, and of other stream-processing applications, in a multiprocessing environment.
I have inspected the Videoflow repository on GitHub and some of the capabilities of this framework, and I must say it's really interesting. Jadiel is going to tell us a lot more than what you can read on GitHub.
References
Videoflow official GitHub repository https://github.com/videoflow/videoflow
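To give a flavour of the framework, here is the kind of "hello world" flow the Videoflow README showed around the time of this episode: a producer, a processor and a consumer chained into a flow, each running in its own process. Module paths and signatures are quoted from memory and may differ in current releases.

```python
from videoflow.core import Flow
from videoflow.producers import IntProducer
from videoflow.processors.aggregators import SumAggregator
from videoflow.consumers import CommandlineConsumer

producer = IntProducer(0, 40, 0.1)        # emits the integers 0..39, one every 0.1 s
sum_agg = SumAggregator()(producer)       # keeps a running sum of the stream
printer = CommandlineConsumer()(sum_agg)  # prints each partial sum to the console

flow = Flow([producer], [printer])        # wires the graph and spawns the processes
flow.run()
flow.join()
```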
Tuesday Jul 09, 2019
Episode 68: AI and the future of banking with Chris Skinner [RB]
In this episode I have a wonderful conversation with Chris Skinner.
Chris and I recently got in touch at The Banking Scene 2019, a fintech conference held in Brussels. At that conference he talked as a real troublemaker (that's how he defines himself), saying that “People are not educated with loans, credit, money” and that “Banks are failing at digital”.
After I got my hands on his latest book, Digital Human, I invited him to the show to ask him a few questions about innovation, regulation and technology in finance.

Tuesday Jul 02, 2019
Episode 67: Classic Computer Science Problems in Python
Today I am with David Kopec, author of Classic Computer Science Problems in Python, published by Manning Publications.
His book deepens your knowledge of problem-solving techniques from the realm of computer science by challenging you with interesting and realistic scenarios, exercises and, of course, algorithms. There are examples covering the major topics any data scientist should be familiar with, such as search, clustering, graphs, and much more.
Get the book from https://www.manning.com/books/classic-computer-science-problems-in-python and use coupon code poddatascienceathome19 to get a 40% discount.
References
Twitter https://twitter.com/davekopec
GitHub https://github.com/davecom
classicproblems.com

Tuesday Jun 25, 2019
Episode 66: More intelligent machines with self-supervised learning
In this episode I talk about a new learning paradigm that may at first seem blurry and not that different from the methods we already know, such as supervised and unsupervised learning. The method I introduce here is called self-supervised learning.
Enjoy the show!
Don't forget to subscribe to our Newsletter at amethix.com and get the latest updates in AI and machine learning. We do not spam. Promise!
References
Deep Clustering for Unsupervised Learning of Visual Features
Self-supervised Visual Feature Learning with Deep Neural Networks: A Survey
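To make the idea concrete: a self-supervised method invents labels from the data itself. A classic pretext task, covered in the survey above, is rotation prediction: rotate each unlabeled image by 0, 90, 180 or 270 degrees and train a network to guess the rotation. Here is a toy PyTorch sketch; the one-layer backbone is a stand-in for a real network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def rotation_batch(images):
    """Turn an unlabeled batch (N, C, H, W) into a self-supervised one:
    each image is rotated by 0/90/180/270 degrees, and the rotation index
    becomes a label obtained for free, with no human annotation."""
    rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)])
    labels = torch.arange(4).repeat_interleave(images.size(0))
    return rotated, labels

backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 4))  # stand-in model
images = torch.randn(8, 3, 32, 32)   # pretend these are unlabeled photos
x, y = rotation_batch(images)
loss = F.cross_entropy(backbone(x), y)
loss.backward()                      # features learned this way transfer to real tasks
```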

Sunday Jun 23, 2019
Episode 65: AI knows biology. Or does it?
The successes of deep learning for text analytics, also introduced in a recent post about sentiment analysis published here, are undeniable. Many other tasks in NLP have also benefitted from the superiority of deep learning methods over more traditional approaches. Such extraordinary results have been possible thanks to the ability of neural networks to learn meaningful character and word embeddings, that is, representation spaces in which semantically similar objects are mapped to nearby vectors. All this is strictly related to a field one might initially find disconnected or off-topic: biology.
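As a tiny illustration of what "nearby vectors" means, here is the usual cosine-similarity check, run on hand-picked toy vectors that merely stand in for learned embeddings:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: close to 1 for nearby (semantically similar) vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# hand-picked 3-d toy vectors, purely illustrative, not trained embeddings
cat = np.array([0.9, 0.8, 0.1])
dog = np.array([0.8, 0.7, 0.2])
car = np.array([0.1, 0.2, 0.9])

print(cosine(cat, dog))  # high: "cat" and "dog" sit close together
print(cosine(cat, car))  # low: "cat" and "car" are far apart
```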
Don't forget to subscribe to our Newsletter at amethix.com and get the latest updates in AI and machine learning. We do not spam. Promise!
References
[1] Rives A., et al., “Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences”, biorxiv, doi: https://doi.org/10.1101/622803
[2] Vaswani A., et al., “Attention is all you need”, Advances in neural information processing systems, pp. 5998–6008, 2017.
[3] Bahdanau D., et al., “Neural machine translation by jointly learning to align and translate”, arXiv, http://arxiv.org/abs/1409.0473.

Friday Jun 14, 2019
Episode 64: Get the best shot at NLP sentiment analysis
The rapid diffusion of social media like Facebook and Twitter, and the massive use of forums like Reddit, Quora, etc., are producing an impressive amount of text data every day.
There is one specific activity that many business owners have been contemplating over the last five years: identifying the social sentiment of their brand by analysing their users' conversations.
In this episode I explain how one can get the best shot at classifying sentences with deep learning and word embedding.
Additional material
Figure: schematic representation of how to learn a word embedding matrix E by training a neural network that, given the previous M words, predicts the next word in a sentence.
Word2Vec example source code: https://gist.github.com/rlangone/ded90673f65e932fd14ae53a26e89eee#file-word2vec_example-py
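For readers who prefer code to diagrams, here is a minimal PyTorch sketch of the setup in the schematic above: an embedding matrix E feeding a linear layer that predicts the next word from the previous M words. The vocabulary size, context size and embedding dimension are toy assumptions.

```python
import torch
import torch.nn as nn

M, V, D = 3, 5000, 100   # context size, vocabulary size, embedding dim (toy values)

class NextWordModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.E = nn.Embedding(V, D)      # the embedding matrix E being learned
        self.out = nn.Linear(M * D, V)   # scores for every candidate next word

    def forward(self, context):          # context: (batch, M) word ids
        e = self.E(context).view(context.size(0), -1)  # concatenate the M embeddings
        return self.out(e)               # (batch, V) logits for the next word

model = NextWordModel()
context = torch.randint(0, V, (2, M))    # two toy contexts of M word ids each
logits = model(context)                  # after training, the rows of E are the embeddings
```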
References
[1] Mikolov, T. et al., "Distributed Representations of Words and Phrases and their Compositionality", Advances in Neural Information Processing Systems 26, pages 3111-3119, 2013.
[2] The Best Embedding Method for Sentiment Classification, https://medium.com/@bramblexu/blog-md-34c5d082a8c5
[3] The state of sentiment analysis: word, sub-word and character embedding https://amethix.com/state-of-sentiment-analysis-embedding/