Rahul Parhi

rahul [at] ucsd [dot] edu
Assistant Professor of ECE at UCSD

Jacobs Hall, Room 6406
9736 Engineers Ln
La Jolla, CA 92093

About

I am an Assistant Professor of Electrical and Computer Engineering (ECE) at the University of California, San Diego (UCSD), which I joined in 2024. From 2022 to 2024, I was a Postdoctoral Researcher at the École Polytechnique Fédérale de Lausanne (EPFL), where I worked with Michael Unser. I obtained my Ph.D. in 2022 at the University of Wisconsin–Madison (UW–Madison), where I was supervised by Robert D. Nowak. My dissertation received the Harold Peterson Outstanding Dissertation Award. I completed my undergraduate studies at the University of Minnesota, Twin Cities (UMN) in 2018.

My research lies at the interface of functional analysis, signal processing, machine learning, nonparametric statistics, and optimization. My primary area of investigation is the mathematics of data science, with a focus on developing the foundations of neural networks and deep learning. Some questions my research aims to answer include:

  1. What is the effect of regularization in deep learning?
  2. What kinds of functions do neural networks learn?
  3. Why do neural networks seemingly break the curse of dimensionality?
  4. Why are neural networks able to adapt to low-dimensional structures?

Other topics/keywords that catch my attention include compressed sensing, computed tomography, the geometry of Banach spaces, inverse problems, minimax rates, optimal recovery, sparsity, splines, time–frequency analysis, and wavelets. For more detail about my research, take a look at my papers.


Talks

Invited Seminars and Colloquia

  • Deep Learning Meets Sparse Regularization [slides]
    • (March 2024) Massachusetts Institute of Technology (MIT), EECS Seminar
    • (February 2024) University of California, San Diego (UCSD), ECE Seminar
    • (February 2024) University of Colorado, Boulder (CU Boulder), Applied Mathematics Seminar
    • (February 2024) University of Alberta, CS Seminar
    • (January 2024) Rutgers University, Mathematics Seminar
    • (January 2024) Chinese University of Hong Kong, Shenzhen (CUHK–Shenzhen), SEE Seminar
    • (December 2023) Washington University in St. Louis, ESE + SDS Seminar
    • (December 2023) ETH Zürich, MINS Seminar
    • (November 2023) Université Catholique de Louvain (UCLouvain), Statistics Seminar
    • (September 2023) MPI MiS + UCLA, Math Machine Learning Seminar
  • Regularizing Neural Networks via Radon-Domain Total Variation
    • (November 2022) Johns Hopkins University, Mathematical Institute for Data Science (MINDS) Seminar
  • What Kinds of Functions Do Neural Networks Learn?
    • (November 2021) Working Group on Mean Field Neural Networks, Simons Institute for the Theory of Computing
  • On BV Spaces, Splines, and Neural Networks
    • (November 2021) University of Wisconsin–Madison, Analysis Seminar, Department of Mathematics
  • A Representer Theorem for Single-Hidden Layer Neural Networks
    • (July 2020) University of Wisconsin–Madison, Institute for Foundations of Data Science (IFDS) Seminar
  • Neural Networks Learn Splines
    • (October 2019) University of Wisconsin–Madison, HAMLET Seminar
  • Minimum “Norm” Neural Networks and Splines
    • (September 2019) University of Wisconsin–Madison, Institute for Foundations of Data Science (IFDS) Seminar

Invited Talks at Conference Sessions and Minisymposia

  • Deep Learning Meets Sparse Regularization
    • (November 2024) Canadian Mathematical Society (CMS) Winter Meeting, Mathematics of Machine Learning Session
  • The Role of Sparsity in Learning With Overparameterized Deep Neural Networks
    • (October 2024) SIAM Conference on Mathematics of Data Science (MDS), Learning Functions with Low-Dimensional Structure Using Neural Networks Minisymposium
  • A Banach-Space View of Neural Network Training
    • (July 2024) International Symposium on Mathematical Programming (ISMP), Nonsmooth and Hierarchical Optimization in Machine Learning Session
  • On the Sparsity-Promoting Effect of Weight Decay in Deep Learning
    • (January 2024) Conference on Parsimony and Learning (CPAL), Rising Stars Session
  • A Banach Space Representer Theorem for Single-Hidden Layer Neural Networks
    • (November 2020) SLowDNN Workshop, Young Researchers Spotlight Session

Contributed Talks

  • Modulation Spaces and the Curse of Dimensionality
    • (July 2023) International Conference on Sampling Theory and Applications (SampTA)
  • On Continuous-Domain Inverse Problems with Sparse Superpositions of Decaying Sinusoids as Solutions
    • (May 2022) IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)

Service and Professional Activities

Journal Reviewer

  • Annals of Statistics
  • Applied and Computational Harmonic Analysis
  • Biometrika
  • IEEE Open Journal of Signal Processing
  • IEEE Signal Processing Magazine
  • Journal of Computational and Applied Mathematics
  • Journal of Machine Learning Research
  • Neural Networks
  • SIAM Journal on Mathematical Analysis
  • SIAM Journal on Mathematics of Data Science
  • SIAM Journal on Scientific Computing

Conference Reviewer

  • 2020 Conference on Machine Learning and Systems (MLSys)
  • 2023, 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
  • 2025 International Conference on Artificial Intelligence and Statistics (AISTATS)

Session Organizer/Chair

  • International Symposium on Mathematical Programming, Montréal, Québec, Canada, 2024 (ISMP 2024)
  • International Conference on Continuous Optimization, Los Angeles, California, 2025 (ICCOPT 2025)
    • Organizer and chair for the session on relaxations of optimization problems and extreme point results in infinite dimensions

Internal Service at UCSD

  • Teaching Innovations and Undergraduate Affairs Committee, Department of Electrical and Computer Engineering (2024 – Present)

Other