I hold a PhD in Applied Mathematics. My doctoral thesis focused on the design and analysis of faster and safer optimization algorithms for variable selection and hyperparameter calibration in high-dimensional settings. I was supervised by Olivier Fercoq and Joseph Salmon at Télécom ParisTech, EDMH, Université Paris-Saclay.

Email: eugene.ndiaye (~~.) riken.jp

Thesis

  • Safe optimization algorithms for variable selection and hyperparameter tuning.
    E. Ndiaye.
    Université Paris-Saclay, October 4th, 2018.
    Manuscript, slides.

Publications

  • Root-finding Approaches for Computing Conformal Prediction Set.
    E. Ndiaye, I. Takeuchi.
    arXiv, 2021.
    paper, code.

  • Screening Rules and its Complexity for Active Set Identification.
    E. Ndiaye, O. Fercoq, J. Salmon.
    Journal of Convex Analysis, 2020.
    paper, code.

  • Computing Full Conformal Prediction Set with Approximate Homotopy.
    E. Ndiaye, I. Takeuchi.
    Advances in Neural Information Processing Systems, 2019.
    paper, code.

  • Safe Grid Search with Optimal Complexity.
    E. Ndiaye, T. Le, O. Fercoq, J. Salmon, I. Takeuchi.
    International Conference on Machine Learning, 2019.
    paper, code.

  • Gap Safe screening rules for sparsity enforcing penalties.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Journal of Machine Learning Research, 2017.
    paper, code.

  • Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression.
    E. Ndiaye, O. Fercoq, V. Leclère, A. Gramfort, J. Salmon.
    Journal of Physics: Conference Series, 2017.
    paper, code.

  • GAP Safe Screening Rules for Sparse-Group-Lasso.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Advances in Neural Information Processing Systems, 2016.
    paper, code.

  • GAP Safe screening rules for sparse multi-task and multi-class models.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Advances in Neural Information Processing Systems, 2015.
    paper, code.

Teaching

  • Teaching assistant at Nagoya Institute of Technology.
    Led a book-reading group in machine learning (with both graduate and undergraduate students):
    - Computer Age Statistical Inference (B. Efron and T. Hastie, 2016).
    - Reinforcement Learning and Optimal Control (D. P. Bertsekas, 2019).
    - Understanding Machine Learning: From Theory to Algorithms (S. Shalev-Shwartz and S. Ben-David, 2014).
    - Statistical Learning with Sparsity: The Lasso and Generalizations (T. Hastie, R. Tibshirani, and M. Wainwright, 2015).

  • Teaching assistant at Télécom ParisTech.
    Master's courses in optimization and machine learning:
    linear models, clustering, the bootstrap, ensemble methods, first-order optimization, stochastic algorithms, etc.