Deriving Expectation-Maximization

 — 

Deriving the expectation-maximization algorithm, and the beginnings of its application to LDA. Once the derivation is complete, its intimate connection to variational inference becomes apparent.


Additional Strategies for Confronting the Partition Function

 — 

Stochastic maximum likelihood, contrastive divergence, noise contrastive estimation and negative sampling for improving or avoiding the computation of the gradient of the log-partition function. (Oof, that's a mouthful.)


A Thorough Introduction to Boltzmann Machines

 — 

A pedantic walk through Boltzmann machines, with a focus on the computational thorn in the side that is the partition function.


From Gaussian Algebra to Gaussian Processes, Part 2

 — 

Introducing the RBF kernel, and motivating its ubiquitous use in Gaussian processes.


From Gaussian Algebra to Gaussian Processes, Part 1

 — 

A thorough, straightforward, un-intimidating introduction to Gaussian processes in NumPy.


A Practical Guide to the Open-Source ML "Master's"

 — 

Motivation, logistics and strategic insight re: designing the Open-Source "Master's" for yourself.


Joining ASAPP

 — 

I'm joining ASAPP as a Machine Learning Engineer.


My Next Role

 — 

Beginning the search for an impossibly awesome next role.


Neurally Embedded Emojis

 — 

Convolutional variational autoencoders for emoji generation and Siamese text-question-emoji-answer models. Keras, bidirectional LSTMs and snarky tweets @united within.


Random Effects Neural Networks in Edward and Keras

 — 

Coupling nimble probabilistic models with neural architectures in Edward and Keras: "what worked and what didn't," a conceptual overview of random effects, and directions for further exploration.

