Halford Mackinder on Artificial Intelligence

 — 

What would preeminent 20th-century geographer Halford J. Mackinder say about the coming revolution in artificial intelligence and its impact on our current ideological war?


Soft Power in the Age of Deepfakes

 — 

What happens when a post-Trump, reputationally bruised United States and improved generative models (the technology behind "deepfakes") collide head-on?


On Saudi Drone Strikes and Adversarial AI

 — 

In a world of weaponized drones piloted by algorithms, what new strategic opportunities arise?


Artificial Intelligence and Geopolitics

 — 

I'm beginning to write about the intersection of artificial intelligence and geopolitics.


Deriving Mean-Field Variational Bayes

 — 

A detailed derivation of Mean-Field Variational Bayes, its connection to Expectation-Maximization, and how it implicitly motivates the "black-box variational inference" methods born in recent years.
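
For orientation, a sketch of the standard mean-field setup this post works through (standard notation, not necessarily the post's exact symbols): the approximating distribution factorizes across latent variables, and coordinate ascent updates one factor at a time using expectations under the others.

$$
q(z) = \prod_j q_j(z_j),
\qquad
q_j^*(z_j) \propto \exp\left\{ \mathbb{E}_{q_{-j}}\left[ \log p(x, z) \right] \right\}
$$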


Deriving Expectation-Maximization

 — 

Deriving the expectation-maximization algorithm, and the beginnings of its application to LDA. Once the derivation is finished, EM's intimate connection to variational inference becomes apparent.
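
As a reminder of the algorithm being derived (written in standard notation, assumed rather than quoted from the post), EM alternates an expectation step over the latent variables with a maximization step over the parameters:

$$
\text{E-step:}\quad Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \sim p(Z \mid X, \theta^{(t)})}\left[ \log p(X, Z \mid \theta) \right],
\qquad
\text{M-step:}\quad \theta^{(t+1)} = \arg\max_{\theta} Q(\theta \mid \theta^{(t)})
$$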


Additional Strategies for Confronting the Partition Function

 — 

Stochastic maximum likelihood, contrastive divergence, noise contrastive estimation, and negative sampling for improving or avoiding the computation of the gradient of the log-partition function. (Oof, that's a mouthful.)
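
For context, the quantity all of these methods grapple with (standard notation, assumed here): for an unnormalized model $p_\theta(x) = \tilde{p}_\theta(x) / Z(\theta)$, the log-likelihood gradient splits into a tractable "positive phase" and an expectation under the model itself, the "negative phase", which sampling-based strategies approximate and estimation-based strategies sidestep.

$$
\nabla_\theta \log p_\theta(x)
= \nabla_\theta \log \tilde{p}_\theta(x)
- \mathbb{E}_{x' \sim p_\theta}\left[ \nabla_\theta \log \tilde{p}_\theta(x') \right]
$$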


A Thorough Introduction to Boltzmann Machines

 — 

A pedantic walk through Boltzmann machines, with a focus on the computational thorn in their side: the partition function.
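
As a sketch of why that thorn exists (a generic fully visible Boltzmann machine over binary units, in assumed notation): the probability of any state requires the partition function $Z$, a sum over all $2^n$ configurations.

$$
E(x) = -x^\top W x - b^\top x,
\qquad
p(x) = \frac{\exp\{-E(x)\}}{Z},
\qquad
Z = \sum_{x \in \{0, 1\}^n} \exp\{-E(x)\}
$$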


From Gaussian Algebra to Gaussian Processes, Part 2

 — 

Introducing the RBF kernel, and motivating its ubiquitous use in Gaussian processes.
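
The kernel in question, for reference (with signal variance $\sigma^2$ and lengthscale $\ell$ as the usual, assumed, parameter names):

$$
k(x, x') = \sigma^2 \exp\left( -\frac{\lVert x - x' \rVert^2}{2 \ell^2} \right)
$$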


From Gaussian Algebra to Gaussian Processes, Part 1

 — 

A thorough, straightforward, un-intimidating introduction to Gaussian processes in NumPy.
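
A minimal NumPy sketch in the spirit of that post (function and variable names are mine, not necessarily the post's): build an RBF covariance over a grid of inputs and draw a few functions from the zero-mean GP prior.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel: k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    sq_dists = (
        np.sum(X1**2, axis=1)[:, None]
        + np.sum(X2**2, axis=1)[None, :]
        - 2 * X1 @ X2.T
    )
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

X = np.linspace(-5, 5, 100)[:, None]            # 100 one-dimensional test inputs
K = rbf_kernel(X, X) + 1e-8 * np.eye(len(X))    # jitter keeps the Cholesky factorization stable
L = np.linalg.cholesky(K)                       # K = L @ L.T
samples = L @ np.random.randn(len(X), 3)        # three draws from N(0, K), one per column
```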


© Will Wolf 2020
