Statistical underpinnings of the machine learning models we know and love. A walk through random variables, entropy, exponential family distributions, generalized linear models, maximum likelihood estimation, cross entropy, KL-divergence, maximum a posteriori estimation and going "fully Bayesian."
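As a small taste of the ingredients listed above, here is a minimal sketch (with illustrative, made-up distributions) of the identity tying cross entropy, entropy, and KL divergence together:

```python
import numpy as np

p = np.array([0.7, 0.2, 0.1])   # "true" distribution (illustrative)
q = np.array([0.5, 0.3, 0.2])   # model distribution (illustrative)

entropy = -np.sum(p * np.log(p))        # H(p)
cross_entropy = -np.sum(p * np.log(q))  # H(p, q)
kl = np.sum(p * np.log(p / q))          # KL(p || q)

# Identity: cross entropy decomposes into entropy plus KL divergence,
# which is why minimizing cross entropy minimizes KL to the true distribution.
print(np.isclose(cross_entropy, entropy + kl))
```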
Autoencoding airports via variational autoencoders to improve flight delay prediction. Additionally, a principled look at variational inference itself and its connections to machine learning.
Deriving the softmax from first principles of conditional probability, and how this framework extends naturally to softmax regression, conditional random fields, naive Bayes, and hidden Markov models.
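The function at the center of that derivation can be sketched in a few lines; this is the standard numerically stable form (subtracting the max before exponentiating), not the post's specific derivation:

```python
import numpy as np

def softmax(z):
    """Map a vector of real-valued scores to a probability distribution."""
    shifted = z - np.max(z)  # shift for numerical stability; result is unchanged
    exp = np.exp(shifted)
    return exp / exp.sum()

logits = np.array([2.0, 1.0, 0.1])  # illustrative scores
probs = softmax(logits)
print(probs.sum())  # probabilities sum to 1
```

The max-subtraction works because softmax is invariant to adding a constant to every score, which falls directly out of the exponential form.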
In this post, we attempt to beat the performance of Implicit Matrix Factorization on a recommendation task using five different neural network architectures.
A follow-up to Erik Bernhardsson's post "More MCMC – Analyzing a small dataset with 1-5 ratings" using ordered categorical generalized linear models.
Simple intercausal reasoning on a 3-node Bayesian network.
A toy, hand-rolled Bayesian model, optimized via simulated annealing.
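A minimal sketch of the optimization idea, assuming a toy one-dimensional un-normalized log posterior (not the post's model): simulated annealing proposes random moves, always accepts uphill ones, and accepts downhill ones with a probability that shrinks as the temperature cools.

```python
import math
import random

random.seed(0)

def log_posterior(theta):
    # Toy un-normalized log posterior: a broad normal prior times a
    # likelihood centered at 3 (all choices here are illustrative).
    return -0.5 * theta**2 / 10.0 - 0.5 * (theta - 3.0)**2

def anneal(steps=20000, temp=5.0, cooling=0.9995):
    theta = 0.0
    best, best_lp = theta, log_posterior(theta)
    for _ in range(steps):
        proposal = theta + random.gauss(0, 0.5)
        delta = log_posterior(proposal) - log_posterior(theta)
        # Accept uphill moves always; downhill moves with prob exp(delta / temp).
        if delta > 0 or random.random() < math.exp(delta / temp):
            theta = proposal
        if log_posterior(theta) > best_lp:
            best, best_lp = theta, log_posterior(theta)
        temp *= cooling  # cool the temperature, making downhill moves rarer
    return best

print(anneal())  # should land near the posterior mode
```

Early on, the high temperature lets the chain escape local optima; as it cools, the procedure behaves more and more like greedy hill climbing on the log posterior.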
Modeling a typical week of RescueTime data via an alternative take on the Dirichlet distribution.
Hand-rolled sparse autoencoders to generate novel world flags.
An introduction to what Docker is, and why and how to use it for Kaggle.