I'm beginning to write about the intersection of artificial intelligence and geopolitics.
A detailed derivation of Mean-Field Variational Bayes, its connection to Expectation-Maximization, and how it implicitly motivates the "black-box variational inference" methods that have emerged in recent years.
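For orientation, the standard decomposition at the heart of mean-field variational Bayes, written here in generic notation for data $x$ and latents $z$ (the post's own symbols may differ): the log evidence splits into the ELBO and a KL term, and mean-field VB maximizes the former over a fully factorized $q$.

$$
\log p(x) = \underbrace{\mathbb{E}_{q(z)}\bigl[\log p(x, z) - \log q(z)\bigr]}_{\text{ELBO}} + \mathrm{KL}\bigl(q(z)\,\|\,p(z \mid x)\bigr), \qquad q(z) = \prod_i q_i(z_i).
$$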
A derivation of the expectation-maximization algorithm, and the beginnings of its application to LDA. Once the derivation is finished, its intimate connection to variational inference becomes apparent.
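In the same generic notation, the two steps of the algorithm are (a standard statement, not necessarily the post's exact formulation):

$$
\text{E-step:}\quad q^{(t)}(z) = p\bigl(z \mid x, \theta^{(t)}\bigr), \qquad
\text{M-step:}\quad \theta^{(t+1)} = \arg\max_{\theta}\; \mathbb{E}_{q^{(t)}(z)}\bigl[\log p(x, z \mid \theta)\bigr].
$$

Since the E-step drives the KL term in the decomposition above to zero, both steps amount to coordinate ascent on the same ELBO, which is the source of the connection to variational inference.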
Stochastic maximum likelihood, contrastive divergence, noise contrastive estimation and negative sampling: techniques for approximating or avoiding the computation of the gradient of the log-partition function. (Oof, that's a mouthful.)
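The quantity all four methods circle around is the gradient of the log-partition function of an unnormalized model $p_\theta(x) = \tilde{p}_\theta(x) / Z(\theta)$; stated as a standard identity in generic notation:

$$
\nabla_\theta \log Z(\theta) = \mathbb{E}_{x \sim p_\theta}\bigl[\nabla_\theta \log \tilde{p}_\theta(x)\bigr],
$$

an expectation under the model itself, which stochastic maximum likelihood and contrastive divergence approximate with MCMC samples, and which noise contrastive estimation and negative sampling sidestep by recasting estimation as classification against noise samples.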
A pedantic walk through Boltzmann machines, with a focus on the computational thorn in their side: the partition function.
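Concretely, for a Boltzmann machine over $d$ binary units with energy $E_\theta(x)$ (generic notation), the partition function sums over every configuration:

$$
p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z(\theta)}, \qquad Z(\theta) = \sum_{x \in \{0,1\}^d} e^{-E_\theta(x)},
$$

a sum with $2^d$ terms, which is what makes exact maximum likelihood intractable for all but the smallest models.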
Introducing the RBF kernel, and motivating its ubiquitous use in Gaussian processes.
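A minimal NumPy sketch of the kernel itself (the function and hyperparameter names are illustrative, not the post's):

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of points.

    X1: (n, d) array; X2: (m, d) array. Returns an (n, m) covariance matrix.
    `length_scale` and `variance` are illustrative hyperparameter names.
    """
    sq_dists = (
        np.sum(X1 ** 2, axis=1)[:, None]
        + np.sum(X2 ** 2, axis=1)[None, :]
        - 2 * X1 @ X2.T
    )
    return variance * np.exp(-0.5 * sq_dists / length_scale ** 2)
```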
A thorough, straightforward, un-intimidating introduction to Gaussian processes in NumPy.
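As a taste of how little code a GP prior requires, here is a self-contained sketch that draws three functions from a zero-mean GP with an RBF kernel (the length scale and jitter values are illustrative; the post's own code may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 100)                        # 1-D test inputs
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)  # RBF kernel matrix, length scale 1
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))  # jitter for numerical stability
samples = L @ rng.standard_normal((len(x), 3))     # three prior draws, one function per column
```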
Motivation, logistics and strategic insight re: designing the Open-Source "Master's" for yourself.