- Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization (Jaggi, 2013)
- On the Global Linear Convergence of Frank-Wolfe Optimization Variants (Lacoste-Julien, Jaggi, 2015)
- An Equivalence between the Lasso and Support Vector Machines (Jaggi, 2014)
In this notebook we will implement:
- Frank-Wolfe Algorithm
- Away-Step Frank-Wolfe
- Pairwise Frank-Wolfe
- Fully-Corrective Frank-Wolfe
to train a soft-margin SVM on the following datasets:
At the end we will compare the performance of the algorithms against each other.
To train the soft-margin SVM we formulate the optimization problem as follows:
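In the standard notation (which may differ slightly from the symbols used later in this notebook), with weight vector $w$, bias $b$, slack variables $\xi_i$, and regularization parameter $C$:

$$\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i \quad \text{s.t.}\quad y_i\left(w^\top x_i + b\right) \ge 1 - \xi_i,\quad \xi_i \ge 0,\quad i = 1,\dots,n$$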
This is known as the Primal problem.
We can transform the primal problem into its dual by constructing the Lagrangian and applying the KKT conditions. After doing so we find:
The Lagrangian dual problem:
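In the usual notation, with Lagrange multipliers $\alpha_i$ (one per training point), the dual reads:

$$\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\, y_i y_j\, x_i^\top x_j \quad \text{s.t.}\quad 0 \le \alpha_i \le C,\quad \sum_{i=1}^{n}\alpha_i y_i = 0$$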
Which is equivalent to:
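In matrix form, defining $Q_{ij} = y_i y_j\, x_i^\top x_j$, maximizing the dual is equivalent to the minimization problem:

$$\min_{\alpha}\ \frac{1}{2}\alpha^\top Q\,\alpha - \mathbf{1}^\top\alpha \quad \text{s.t.}\quad 0 \le \alpha \le C\mathbf{1},\quad y^\top\alpha = 0$$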
We will use Frank-Wolfe and variations of Frank-Wolfe to solve this minimization problem.
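As a warm-up, here is a minimal sketch of vanilla Frank-Wolfe on the bias-free variant of the dual, where the equality constraint $y^\top\alpha = 0$ is dropped so the feasible set is the box $[0, C]^n$ and the linear minimization oracle decomposes coordinate-wise. The function name and details are ours for illustration, not the notebook's exact implementation:

```python
import numpy as np

def frank_wolfe_svm_dual(X, y, C=1.0, n_iters=200):
    """Vanilla Frank-Wolfe on the bias-free SVM dual:
        min_a  0.5 * a^T Q a - 1^T a   s.t.  0 <= a_i <= C,
    where Q_ij = y_i y_j <x_i, x_j>.  Illustrative sketch only."""
    n = X.shape[0]
    Z = y[:, None] * X              # rows are y_i * x_i
    Q = Z @ Z.T                     # Q_ij = y_i y_j <x_i, x_j>
    a = np.zeros(n)                 # feasible starting point
    for t in range(n_iters):
        grad = Q @ a - 1.0          # gradient of the dual objective
        # Linear minimization oracle over the box [0, C]^n:
        # pick the corner minimizing <grad, s> coordinate-wise.
        s = np.where(grad < 0, C, 0.0)
        gamma = 2.0 / (t + 2.0)     # standard open-loop FW step size
        a += gamma * (s - a)        # convex combination stays feasible
    return a

# The primal weight vector is then recovered as w = sum_i a_i y_i x_i.
```

Each iterate is a convex combination of box corners, so feasibility is maintained without any projection; this is the defining feature of Frank-Wolfe that the variants above (away-step, pairwise, fully-corrective) refine.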
- Gita Rayung Andadari [email protected]
- Endrit Sveçla [email protected]
- Santiago Viquez [email protected]