Colloquium - Xiaowu Dai - September 18, 2025
Speaker: Xiaowu Dai, UCLA
Date/Time: Thursday, September 18, 2025, 10:00 AM - 11:00 AM ET
Title: Statistical Learning via Partial Derivatives
Abstract: Traditional nonparametric estimation methods often lead to a slow convergence rate in large dimensions and require unrealistically large dataset sizes for reliable conclusions. We develop an approach based on partial derivatives, either observed or estimated, to effectively estimate the function at near-parametric convergence rates. This novel approach and computational algorithm could lead to methods useful to practitioners in many areas of science and engineering. Our theoretical results reveal behavior universal to this class of nonparametric estimation problems. We explore a general setting involving tensor product spaces and build upon the smoothing spline analysis of variance framework. For d-dimensional models under full interaction, the optimal rates with gradient information on p covariates are identical to those for the (d-p)-interaction models without gradients; the models are therefore immune to the curse of interaction. For additive models, the optimal rates using gradient information surprisingly achieve the parametric rate.
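To illustrate the core idea of the abstract in the simplest possible setting, the toy sketch below fits a one-dimensional function from noisy evaluations alone versus from evaluations plus noisy derivative observations. This is only a minimal illustration of using gradient data in estimation; it uses an ordinary polynomial least-squares fit, not the smoothing spline ANOVA framework from the talk, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, degree = 30, 6
x = np.sort(rng.uniform(0.0, 1.0, n))

# Toy target function and its derivative (assumed for illustration only)
f = lambda t: np.sin(2 * np.pi * t)
fp = lambda t: 2 * np.pi * np.cos(2 * np.pi * t)

y = f(x) + 0.05 * rng.standard_normal(n)   # noisy function values
g = fp(x) + 0.05 * rng.standard_normal(n)  # noisy derivative observations

# Polynomial basis phi_k(t) = t^k and its derivative k * t^(k-1)
Phi = np.vander(x, degree + 1, increasing=True)
dPhi = np.zeros_like(Phi)
for k in range(1, degree + 1):
    dPhi[:, k] = k * x ** (k - 1)

# Fit 1: least squares on function values only
coef_vals, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Fit 2: joint least squares that also matches the derivative data
A = np.vstack([Phi, dPhi])
b = np.concatenate([y, g])
coef_grad, *_ = np.linalg.lstsq(A, b, rcond=None)

# Compare estimation error on a fine grid
grid = np.linspace(0.0, 1.0, 200)
G = np.vander(grid, degree + 1, increasing=True)
rmse_vals = np.sqrt(np.mean((G @ coef_vals - f(grid)) ** 2))
rmse_grad = np.sqrt(np.mean((G @ coef_grad - f(grid)) ** 2))
print(f"values-only RMSE: {rmse_vals:.4f}, with gradients: {rmse_grad:.4f}")
```

The derivative rows effectively double the information available to the fit at no extra evaluation points, which is the intuition (in a much simpler form) behind the faster rates described in the abstract.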
Bio: Xiaowu Dai is an Assistant Professor of Statistics and Data Science, and of Biostatistics, at UCLA. Previously, he was a postdoc at UC Berkeley from 2019 to 2022, advised by Michael I. Jordan. Before that, he received a Ph.D. in Statistics from UW-Madison, advised by Grace Wahba. His research focuses on developing statistical theory and methodology to address real-world problems involving computational, inferential, and economic considerations.
Website: https://www.xiaowudai.org/