As a machine learning professional you might feel that there are so many algorithms to learn that it is hard to keep up. This can be very demotivating. This talk is not about downplaying that feeling; it is about demonstrating a lovely hack: understanding a mother algorithm.
It turns out that if you appreciate what the Gaussian distribution can do, then lots of algorithms become much easier to grasp. This talk is an attempt at explaining the power of the Gaussian[tm] by stepping up a ladder of increasingly complex algorithms:
- Naive Bayes
- Mixture Naive Bayes
- Gaussian Mixture Models
- Outlier Detectors
- Neural Mixture Models
- Gaussian Auto Embeddings
- Gaussian Processes
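To give a taste of the first rung of the ladder, here is a minimal sketch (an assumed implementation, not code from the talk itself) of Gaussian Naive Bayes: each class is modelled with one independent Gaussian per feature, and prediction picks the class with the highest log-likelihood plus log-prior.

```python
import numpy as np

def fit(X, y):
    """Estimate per-class log-priors, means and variances."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (
            np.log(len(Xc) / len(X)),  # log prior
            Xc.mean(axis=0),           # per-feature mean
            Xc.var(axis=0) + 1e-9,     # per-feature variance (smoothed)
        )
    return params

def predict(params, X):
    """Pick the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        log_prior, mu, var = params[c]
        # log N(x | mu, var), summed over independent features
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(axis=1)
        scores.append(log_prior + ll)
    return np.array(classes)[np.argmax(scores, axis=0)]

# Two well-separated blobs of synthetic data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = fit(X, y)
print(predict(model, np.array([[0.0, 0.0], [5.0, 5.0]])))  # → [0 1]
```

The later rungs of the ladder generalise exactly this idea: swap the single Gaussian per class for a mixture, a neural density, or a process over functions.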
The talk will contain maths, but it will all be (more than) compensated for with xkcd-style images. The goal is to appreciate the intuition, not the details.
Vincent is an algorithm designer, gym leader and senior person at GoDataDriven. Vincent taught himself data science, and a few years later he co-founded PyData Amsterdam. Things can happen fast if you set your mind to it. He blogs about less obvious aspects of the world of data science over at koaning.io, and he's known for giving free lectures on data science around Europe.
Vincent solves data problems. AskHimAnything[tm].