A Derivation of the Sequential Minimal Optimization Algorithm for Support Vector Machines
I wrote this paper on the famed SMO algorithm for support vector machines for my nonlinear optimization class in graduate school at North Carolina State University. In it, I derive the Lagrangian dual of the classic SVM optimization model and show how that dual can be solved with the sequential minimal optimization algorithm, which repeatedly optimizes a pair of Lagrange multipliers in closed form while holding the rest fixed.
Also included is an implementation of the algorithm in Mathematica, my language of choice at the time.
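For readers who don't have Mathematica handy, here is a minimal sketch of the same idea in Python. It implements the simplified SMO variant for a linear-kernel SVM dual (maximize Σαᵢ − ½ ΣᵢΣⱼ αᵢαⱼyᵢyⱼ⟨xᵢ,xⱼ⟩ subject to 0 ≤ αᵢ ≤ C and Σαᵢyᵢ = 0): pick a pair of multipliers, solve the two-variable subproblem analytically, clip to the box constraints, and repeat. This is an illustration under my own simplifications (random second-multiplier choice, linear kernel only), not the paper's original implementation.

```python
import numpy as np

def smo_linear(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO for the linear-kernel SVM dual.

    Labels y must be +1/-1. Returns the primal weight vector w and bias b.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                      # linear-kernel Gram matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Error of the current decision function on example i
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # Proceed only if example i violates the KKT conditions
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                if j >= i:
                    j += 1           # choose a second index j != i at random
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box bounds on alpha[j] implied by 0 <= alpha <= C and sum(alpha*y)=0
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]   # second derivative of the subproblem
                if eta >= 0:
                    continue
                # Closed-form optimum of the two-variable subproblem, clipped to the box
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                # Keep the equality constraint sum(alpha*y) = 0 satisfied
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Recompute the bias so a non-bound multiplier has zero error
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                            - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                            - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # recover the primal weights from the dual solution
    return w, b

# Toy linearly separable data
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = smo_linear(X, y)
pred = np.sign(X @ w + b)
```

On this toy set the learned hyperplane separates the two classes; a production implementation would add the heuristic second-multiplier selection and error caching described in Platt's original paper.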