Data Science at Home
Episodes

Tuesday Jul 16, 2019
Complex video analysis made easy with Videoflow (Ep. 69)
In this episode I am with Jadiel de Armas, senior software engineer at Disney and author of Videoflow, a Python framework that facilitates the quick development of complex video analysis applications and other series-processing based applications in a multiprocessing environment.
I have inspected the Videoflow repository on GitHub and some of the capabilities of this framework, and I must say it's really interesting. Jadiel is going to tell us a lot more than what you can read on GitHub.
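Videoflow's own API is best learned from the repository below. As a rough, hypothetical sketch of the producer -> processor -> consumer idea that such a framework automates (none of these names are Videoflow's, just plain Python multiprocessing):

# Minimal producer -> processor -> consumer pipeline with multiprocessing.
# This is NOT Videoflow's API; it only illustrates the kind of flow the
# framework builds for you.
import multiprocessing as mp

def producer(out_q, n_frames=10):
    # Pretend each integer is a video frame read from a source.
    for i in range(n_frames):
        out_q.put(i)
    out_q.put(None)  # sentinel: end of stream

def processor(in_q, out_q):
    # Stand-in for a heavy per-frame task (e.g. object detection).
    while True:
        frame = in_q.get()
        if frame is None:
            out_q.put(None)
            break
        out_q.put(frame * frame)

def consumer(in_q):
    # Collects results, e.g. writing annotated frames to disk.
    while True:
        result = in_q.get()
        if result is None:
            break
        print("processed frame ->", result)

if __name__ == "__main__":
    q1, q2 = mp.Queue(), mp.Queue()
    procs = [
        mp.Process(target=producer, args=(q1,)),
        mp.Process(target=processor, args=(q1, q2)),
        mp.Process(target=consumer, args=(q2,)),
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()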
References
Videoflow official GitHub repository https://github.com/videoflow/videoflow

Tuesday Jul 02, 2019
Episode 67: Classic Computer Science Problems in Python
Today I am with David Kopec, author of Classic Computer Science Problems in Python, published by Manning Publications.
His book deepens your knowledge of problem-solving techniques from the realm of computer science by challenging you with interesting and realistic scenarios, exercises, and of course algorithms. There are examples covering the major topics any data scientist should be familiar with, such as search, clustering, graphs, and much more.
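As a small taste of that kind of material (my own minimal sketch, not code from the book), here is a classic graph-search problem solved with breadth-first search:

# A minimal breadth-first search, in the spirit of the "search" and "graph"
# topics mentioned above.
from collections import deque

def bfs_path(graph, start, goal):
    # graph: dict mapping a node to a list of neighbouring nodes
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                frontier.append(path + [neighbour])
    return None  # no path found

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(bfs_path(graph, "A", "E"))  # ['A', 'B', 'D', 'E']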
Get the book from https://www.manning.com/books/classic-computer-science-problems-in-python and use coupon code poddatascienceathome19 to get a 40% discount.
References
Twitter https://twitter.com/davekopec
GitHub https://github.com/davecom
classicproblems.com

Tuesday Jun 25, 2019
Episode 66: More intelligent machines with self-supervised learning
In this episode I talk about a new paradigm of learning, which can seem a bit blurry and not obviously different from the other methods we know of, such as supervised and unsupervised learning. The method I introduce here is called self-supervised learning.
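To make the idea concrete, here is a minimal sketch of my own (not taken from the papers below) of a common self-supervised pretext task: the training labels are generated from the data itself by rotating each image and asking the network to predict the rotation, so no human annotation is needed.

# Self-supervised pretext task: predict the rotation applied to an image.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 4)  # 4 classes: 0, 90, 180, 270 degrees

    def forward(self, x):
        return self.head(self.conv(x).flatten(1))

model = SmallEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(8, 3, 32, 32)           # unlabeled images (random here)
k = torch.randint(0, 4, (images.size(0),))   # self-generated "labels"
rotated = torch.stack([torch.rot90(img, int(r), dims=(1, 2))
                       for img, r in zip(images, k)])

logits = model(rotated)
loss = F.cross_entropy(logits, k)
loss.backward()
opt.step()
# After pretraining this way, the encoder's features can be reused for a
# downstream supervised task with far fewer labels.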
Enjoy the show!
Don't forget to subscribe to our Newsletter at amethix.com and get the latest updates in AI and machine learning. We do not spam. Promise!
References
Deep Clustering for Unsupervised Learning of Visual Features
Self-supervised Visual Feature Learning with Deep Neural Networks: A Survey

Sunday Jun 23, 2019
Episode 65: AI knows biology. Or does it?
The successes of deep learning for text analytics, also introduced in a recent post about sentiment analysis published here, are undeniable. Many other tasks in NLP have also benefited from the superiority of deep learning methods over more traditional approaches. Such extraordinary results have also been possible thanks to the neural network approach to learning meaningful character and word embeddings, that is, the representation space in which semantically similar objects are mapped to nearby vectors. All this is strictly related to a field one might initially find disconnected or off-topic: biology.
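As a toy illustration of what "nearby vectors" means (my own sketch with made-up numbers, not the output of a trained model), cosine similarity can be used to compare embeddings; the same idea applies whether the objects are words or, as in reference [1], protein sequences.

# Semantically similar items should end up as nearby vectors. The vectors
# here are invented; in practice they would come from a trained model
# (word2vec, a Transformer, or a protein language model as in [1]).
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.75, 0.20]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.30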
Don't forget to subscribe to our Newsletter at amethix.com and get the latest updates in AI and machine learning. We do not spam. Promise!
References
[1] Rives A., et al., “Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences”, biorxiv, doi: https://doi.org/10.1101/622803
[2] Vaswani A., et al., “Attention is all you need”, Advances in neural information processing systems, pp. 5998–6008, 2017.
[3] Bahdanau D., et al., “Neural machine translation by jointly learning to align and translate”, arXiv, http://arxiv.org/abs/1409.0473.

Tuesday Jun 04, 2019
Episode 63: Financial time series and machine learning
In this episode I speak to Alexandr Honchar, data scientist and owner of the blog https://medium.com/@alexrachnog. Alexandr has written very interesting posts about time series analysis for financial data, and his blog is on my personal list of best tutorial blogs. We discuss financial time series and machine learning, what makes predicting the price of stocks a very challenging task, and why machine learning might not be enough. As usual, I ask Alexandr how he sees machine learning in the next 10 years. His answer - in my opinion quite futuristic - makes perfect sense.
You can contact Alexandr on
Twitter https://twitter.com/AlexRachnog
Facebook https://www.facebook.com/rachnog
Medium https://medium.com/@alexrachnog
Enjoy the show!

Tuesday May 28, 2019
Episode 62: AI and the future of banking with Chris Skinner
In this episode I have a wonderful conversation with Chris Skinner.
Chris and I got in touch at The Banking Scene 2019, a fintech conference recently held in Brussels. During that conference he talked as a real troublemaker - that's how he defines himself - saying that "People are not educated with loans, credit, money" and that "Banks are failing at digital".
After I got my hands on his latest book, Digital Human, I invited him to the show to ask him a few questions about innovation, regulation and technology in finance.

Tuesday May 21, 2019
Episode 61: The 4 best use cases of entropy in machine learning
It all starts from physics. The entropy of an isolated system never decreases... Everyone at school, at some point in their life, learned this in physics class. What does this have to do with machine learning? To find out, listen to the show.
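As a small preview (my own sketch, not taken from the episode), the quantity in question is Shannon entropy, which measures the uncertainty of a distribution and shows up in decision-tree splits, cross-entropy losses and KL divergence.

# Shannon entropy of a discrete distribution, in bits.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: maximally uncertain coin flip
print(entropy([0.9, 0.1]))   # ~0.47 bits: much more predictable
print(entropy([1.0, 0.0]))   # 0.0 bits: no uncertainty at all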
References
Entropy in machine learning https://amethix.com/entropy-in-machine-learning/

Thursday May 16, 2019
Episode 60: Predicting your mouse click (and a crash course in deep learning)
Deep learning is the future. Get a crash course on deep learning. Now! In this episode I speak to Oliver Zeigermann, author of Deep Learning Crash Course published by Manning Publications at https://www.manning.com/livevideo/deep-learning-crash-course
Oliver (Twitter: @DJCordhose) is a veteran of neural networks and machine learning. In addition to the course - that teaches you concepts from prototype to production - he's working on a really cool project that predicts something people do every day... clicking their mouse.
If you use promo code poddatascienceathome19 you get a 40% discount on all products on the Manning platform.
Enjoy the show!
References
Deep Learning Crash Course (Manning Publications)
https://www.manning.com/livevideo/deep-learning-crash-course?a_aid=djcordhose&a_bid=e8e77cbf
Companion notebooks for the code samples of the video course "Deep Learning Crash Course"
https://github.com/DJCordhose/deep-learning-crash-course-notebooks/blob/master/README.md
Next-button-to-click predictor source code
https://github.com/DJCordhose/ux-by-tfjs

Tuesday May 07, 2019
Episode 59: How to fool a smart camera with deep learning
In this episode I met three crazy researchers from KU Leuven (Belgium) who found a method to fool surveillance cameras and stay hidden just by holding a special t-shirt. We discussed the technique they used and some consequences of their findings.
They published their paper on Arxiv and made their source code available at https://gitlab.com/EAVISE/adversarial-yolo
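As a rough, much-simplified illustration of the underlying idea (not the authors' code, which is linked above), the patch pixels can be treated as parameters and optimised by gradient descent to lower a model's confidence; here a tiny, untrained CNN stands in for the real person detector.

# Optimise a small patch so the model's "person" score goes down.
import torch
import torch.nn as nn
import torch.nn.functional as F

classifier = nn.Sequential(           # stand-in for a person detector
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
)
for p in classifier.parameters():
    p.requires_grad_(False)           # the model is fixed; only the patch moves

image = torch.rand(1, 3, 64, 64)                        # scene with a person
patch = torch.rand(1, 3, 16, 16, requires_grad=True)    # the adversarial patch
opt = torch.optim.Adam([patch], lr=0.05)

for step in range(100):
    patched = image.clone()
    patched[:, :, 24:40, 24:40] = patch.clamp(0, 1)     # "hold" the patch
    person_prob = F.softmax(classifier(patched), dim=1)[:, 1]
    loss = person_prob.mean()                           # push the score down
    opt.zero_grad()
    loss.backward()
    opt.step()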
Enjoy the show!
References
Fooling automated surveillance cameras: adversarial patches to attack person detection, Simen Thys, Wiebe Van Ranst, Toon Goedemé
EAVISE Research Group, KU Leuven (Belgium) https://iiw.kuleuven.be/onderzoek/eavise

Tuesday Apr 30, 2019
Episode 58: There is physics in deep learning!
There is a connection between gradient descent based optimizers and the dynamics of damped harmonic oscillators. What does that mean? We now have a better theory for optimization algorithms. In this episode I explain how all this works.
All the formulas I mention in the episode can be found in the post The physics of optimization algorithms
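As a small numerical illustration (my own sketch, under the assumption that momentum/heavy-ball gradient descent is the optimizer in question), the momentum update on a quadratic loss behaves like a discretised damped harmonic oscillator: the iterate overshoots, oscillates around the minimum and decays, just like a damped mass on a spring.

# Heavy-ball gradient descent on L(x) = 0.5 * k * x**2.
k = 1.0            # curvature of the loss (the "spring constant")
lr = 0.1           # learning rate (related to the squared time step)
beta = 0.9         # momentum coefficient (related to the damping)

x, v = 5.0, 0.0    # initial position and "velocity"
trajectory = []
for step in range(50):
    grad = k * x                 # dL/dx, i.e. the "spring force"
    v = beta * v - lr * grad     # velocity update (momentum / inertia)
    x = x + v                    # position update
    trajectory.append(x)

print(trajectory[:5])   # the iterate oscillates around 0 and decays,
print(trajectory[-1])   # like a damped oscillator settling at equilibrium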
Enjoy the show.