*Structured Primal Sparsity for Kernel-Based Support Vector Machines*
*(Mathematical Programming–Driven Machine Learning)*
*Basic Information*
This internship is primarily centered on *mathematical programming and
sparse optimization*, with applications to kernel-based machine learning
models. It will take place at LIPN (Laboratoire d'Informatique de
Paris-Nord), University of Paris 13. The internship lasts six months, with a
flexible starting date: it can begin immediately or as soon as the candidate
is available.
*Supervision*
The student will be supervised by Roberto Wolfler Calvo (LIPN), Diego delle
Donne (ESSEC Business School), and Emiliano Traversi (ESSEC Business
School).
*Project Overview*
Support Vector Machines (SVMs) with kernel methods are a cornerstone of
machine learning, enabling nonlinear decision boundaries through implicit
mappings to high-dimensional feature spaces. A classical example is the
polynomial kernel, which corresponds to introducing all monomials of the
input variables up to a given degree. While expressive, such models are
difficult to interpret and prone to overfitting.
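As a minimal illustration of the correspondence mentioned above (the function and variable names below are purely illustrative, not part of any library), the degree-2 polynomial kernel K(x, z) = (1 + ⟨x, z⟩)² equals an ordinary inner product between explicit vectors of monomials up to degree 2:

```python
# Sketch: the implicit polynomial kernel agrees with an explicit
# monomial feature map. Names (poly_kernel, phi) are illustrative.
from math import sqrt

def poly_kernel(x, z, degree=2):
    """Polynomial kernel (1 + <x, z>)^degree, computed implicitly."""
    return (1.0 + sum(a * b for a, b in zip(x, z))) ** degree

def phi(x):
    """Explicit degree-2 feature map for 2-D input: all monomials up to
    degree 2, weighted so that <phi(x), phi(z)> = (1 + <x, z>)^2."""
    x1, x2 = x
    return [1.0,
            sqrt(2) * x1, sqrt(2) * x2,          # degree-1 monomials
            x1 * x1, sqrt(2) * x1 * x2, x2 * x2]  # degree-2 monomials

x, z = (0.5, -1.0), (2.0, 0.25)
k_implicit = poly_kernel(x, z)
k_explicit = sum(a * b for a, b in zip(phi(x), phi(z)))
assert abs(k_implicit - k_explicit) < 1e-12  # the two computations agree
```

The kernel trick avoids ever materializing phi(x); enforcing sparsity over the monomials requires working with this explicit representation, which is precisely what makes the primal problem hard.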
From an optimization perspective, sparsity is a natural way to control
model complexity and improve interpretability. However, in kernel-based
SVMs, sparsity has almost exclusively been studied in the *dual formulation*,
typically by limiting the number of support vectors. In contrast, enforcing
sparsity directly in the *primal feature space*—that is, selecting which
monomials or interactions are active in the model—has received very limited
attention. The main reason is that primal sparsity breaks the classical
kernel trick and leads to challenging combinatorial and continuous
optimization problems.
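To fix ideas (this is an illustrative sketch, not the project's actual formulation), a primal hinge-loss SVM over an explicit monomial feature map phi, with a cardinality constraint selecting at most k active monomials, could take the form

```latex
\min_{w,\,b,\,\xi}\; \frac{1}{2}\,\|w\|_2^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.}\quad
y_i \bigl( w^\top \phi(x_i) + b \bigr) \ge 1 - \xi_i,\;\;
\xi_i \ge 0,\;\; i = 1, \dots, n,
\qquad \|w\|_0 \le k,
```

where \(\|w\|_0\) counts the nonzero coordinates of \(w\). The cardinality constraint makes the problem combinatorial and prevents the usual dual kernelization, which is why tools from mathematical programming become relevant.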
Recent advances in *mathematical programming for structured sparse
optimization* now make it possible to revisit this problem. The project
aims to develop *exact or provably well-founded optimization formulations
and algorithms* that combine structured sparsity constraints with SVM
training. The goal is to design kernel-based models that are interpretable,
computationally tractable, and grounded in rigorous optimization theory,
bridging machine learning and modern sparse optimization techniques.
*Candidate Profile*
We are looking for a motivated Master's student with a strong
background in *applied
mathematics, optimization, or mathematical programming*, as well as solid
programming skills (Python, C/C++, Julia, or similar). A good understanding
of optimization models and algorithms is essential. Prior exposure to
machine learning or kernel methods is a plus but not mandatory; the core
emphasis is on modeling and algorithmic aspects rather than empirical ML
alone.
*Research Environment and Perspectives*
The internship is expected to lead to a scientific publication. For a
strong and motivated candidate, the project may naturally evolve into a PhD
thesis at the interface of mathematical programming and machine learning.
*Application*
Interested candidates should send a CV and academic transcripts to:
Roberto Wolfler Calvo – wolfler@lipn.univ-paris13.fr
Diego delle Donne – delledonne@essec.edu
Emiliano Traversi – traversi@essec.edu
Please include *"SPARK-SVM"* as the subject of the email.
**********************************************************
*
* Contributions to be spread via DMANET are submitted to
*
* DMANET@zpr.uni-koeln.de
*
* Replies to a message carried on DMANET should NOT be
* addressed to DMANET but to the original sender. The
* original sender, however, is invited to prepare an
* update of the replies received and to communicate it
* via DMANET.
*
* DISCRETE MATHEMATICS AND ALGORITHMS NETWORK (DMANET)
* http://www.zaik.uni-koeln.de/AFS/publications/dmanet/
*
**********************************************************