A Gaussian process can be seen as a generalization of the Gaussian probability distribution to function spaces. This non-parametric Bayesian approach is very effective for modeling arbitrarily complex functions, as one does not need to specify the functional form explicitly. Instead, one controls the complexity of the model by means of a covariance function, which encodes the interaction between neighboring points. The space of covariance functions has a rich structure and is closed under various types of operations, such as addition, product, and convolution. Combining different types of covariance functions allows us to model independent components, like the trend and seasonal components in the context of time series analysis.
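The closure under addition mentioned above can be sketched with scikit-learn's kernel objects, which overload `+` and `*`. Here a linear kernel (a stand-in for a trend component) is added to a periodic kernel (a stand-in for a seasonal component); the specific kernels and hyperparameter values are illustrative choices, not ones fixed by the text:

```python
import numpy as np
from sklearn.gaussian_process.kernels import DotProduct, ExpSineSquared

# A DotProduct (linear) kernel can capture a trend, while an
# ExpSineSquared kernel captures a periodic (seasonal) component.
trend = DotProduct(sigma_0=1.0)
seasonal = ExpSineSquared(length_scale=1.0, periodicity=1.0)

# The sum of two covariance functions is again a valid covariance function.
kernel = trend + seasonal

# Evaluating the kernel on a set of inputs yields a covariance matrix.
X = np.linspace(0.0, 2.0, 5).reshape(-1, 1)
K = kernel(X)
print(K.shape)  # (5, 5)
```

The resulting matrix is symmetric and positive semidefinite, as required of a covariance matrix.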
The objective of this talk is to discuss the ideas and concepts behind this approach through concrete examples, rather than focusing on the mathematical formalism. In particular, we show how to generate predictions using scikit-learn's GaussianProcessRegressor.
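As a minimal sketch of the workflow the talk covers, the snippet below fits a GaussianProcessRegressor to noisy synthetic data and generates predictions with uncertainty estimates; the toy data and kernel choice are assumptions for illustration only:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Noisy samples from a sine function (synthetic data for illustration).
rng = np.random.default_rng(42)
X = np.sort(rng.uniform(0.0, 5.0, size=30)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)

# An RBF kernel for the smooth signal plus a WhiteKernel for the noise.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0)
gpr.fit(X, y)

# Predict the posterior mean and standard deviation on a fine grid.
X_new = np.linspace(0.0, 5.0, 100).reshape(-1, 1)
y_mean, y_std = gpr.predict(X_new, return_std=True)
```

With `return_std=True` the model returns not only point predictions but also pointwise uncertainty, which is one of the main practical benefits of the Bayesian approach.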
- Bayesian Linear Regression
- The Kernel Trick
- Gaussian Processes Regression
- Covariance Functions
- Gaussian Processes for Machine Learning, Carl Edward Rasmussen and Christopher K. I. Williams.
- Gaussian Processes for Timeseries Modeling, S. Roberts, M. Osborne, M. Ebden, S. Reece, N. Gibson & S. Aigrain.
- Bayesian Data Analysis, Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin.
Dr. Juan Orduz
I have a PhD in Mathematics from Humboldt-Universität zu Berlin, where I was a member of the Berlin Mathematical School. Currently I work on topics in data analysis, statistics, and machine learning. I am also interested in education and knowledge sharing.