I am currently a researcher in the Machine Learning Group @Apple in Paris. Previously, I was a Tennenbaum President’s Postdoctoral Fellow in the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Institute of Technology, hosted by Xiaoming Huo and Pascal Van Hentenryck. I was also a Postdoctoral Researcher at the RIKEN Center for Advanced Intelligence Project in the Data-Driven Biomedical Science Team headed by Ichiro Takeuchi. I hold a PhD in Applied Mathematics. My doctoral thesis focused on the design and analysis of faster and safer optimization algorithms for variable selection and hyperparameter calibration in high dimension. I was supervised by Olivier Fercoq and Joseph Salmon at Télécom ParisTech, EDMH, Université Paris-Saclay.

Please find more details in my CV.

Email: e_ndiaye@apple.com

Thesis

  • Safe optimization algorithms for variable selection and hyperparameter tuning.
    E. Ndiaye.
    Université Paris-Saclay, October 4th, 2018.
    Manuscript, slides.

Publications

  • Exact and Approximate Conformal Inference in Multiple Dimensions.
    C. Johnstone, E. Ndiaye.
    arXiv, 2022.
    paper, code in preparation.

  • A Confidence Machine for Sparse High-Order Interaction Model.
    D. Das, E. Ndiaye, I. Takeuchi.
    arXiv, 2022.
    paper, code in preparation.

  • Stable Conformal Prediction Sets.
    E. Ndiaye.
    International Conference on Machine Learning, 2022.
    paper, code.

  • Continuation Path with Linear Convergence Rate.
    E. Ndiaye, I. Takeuchi.
    arXiv, 2021.
    paper, code in preparation.

  • Root-finding Approaches for Computing Conformal Prediction Set.
    E. Ndiaye, I. Takeuchi.
    Accepted to Machine Learning, 2021.
    paper, code.

  • Screening Rules and its Complexity for Active Set Identification.
    E. Ndiaye, O. Fercoq, J. Salmon.
    Journal of Convex Analysis, 2020.
    paper, code.

  • Computing Full Conformal Prediction Set with Approximate Homotopy.
    E. Ndiaye, I. Takeuchi.
    Advances in Neural Information Processing Systems, 2019.
    paper, code.

  • Safe Grid Search with Optimal Complexity.
    E. Ndiaye, T. Le, O. Fercoq, J. Salmon, I. Takeuchi.
    International Conference on Machine Learning, 2019.
    paper, code.

  • Gap Safe screening rules for sparsity enforcing penalties.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Journal of Machine Learning Research, 2017.
    paper, code.

  • Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression.
    E. Ndiaye, O. Fercoq, V. Leclère, A. Gramfort, J. Salmon.
    Journal of Physics: Conference Series, 2017.
    paper, code.

  • GAP Safe Screening Rules for Sparse-Group-Lasso.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Advances in Neural Information Processing Systems, 2016.
    paper, code.

  • GAP Safe screening rules for sparse multi-task and multi-class models.
    E. Ndiaye, O. Fercoq, A. Gramfort, J. Salmon.
    Advances in Neural Information Processing Systems, 2015.
    paper, code.

Teaching

  • Instructor for the class ISyE 6740, Computational Data Science.
    Spring 2022, graduate level.

  • Teaching assistant at Nagoya Institute of Technology.
    Machine learning book-reading seminar with graduate and undergraduate students.

  • Teaching assistant at Télécom ParisTech.
    Master's courses in optimization and machine learning:
    Linear Models, Clustering, Bootstrap, Ensemble Methods, First-Order Optimization and Stochastic Algorithms, etc.