You can also find my articles on my Google Scholar profile.
Decision Trees and Ensemble Learning
- M. D. Cattaneo, J. M. Klusowski, and W. Underwood, “Inference with Mondrian random forests,” Submitted, 2023 [preprint] [software]
- X. Chen, J. M. Klusowski, and Y. S. Tan, “Error reduction from stacked regressions,” Submitted, 2023 [slides] [preprint]
- M. D. Cattaneo, J. M. Klusowski, and P. M. Tian, “On the pointwise behavior of recursive partitioning and its implications for heterogeneous causal effect estimation,” Submitted, 2022 [preprint]
- M. D. Cattaneo, R. Chandak, and J. M. Klusowski, “Convergence rates of oblique regression trees for flexible function libraries,” Forthcoming in Annals of Statistics, 2023 [updated] [preprint]
- J. M. Klusowski and P. M. Tian, “Large scale prediction with decision trees,” Journal of the American Statistical Association, 2022 [journal] [preprint]
- J. M. Klusowski and P. M. Tian, “Nonparametric variable screening with optimal decision stumps,” AISTATS, 2021 [proceedings] [extended version]
- J. M. Klusowski, “Sparse learning with CART” [preprint]
  - NeurIPS, 2020 [proceedings] [poster]
  - Longer version with the title “Sparse learning with CART for noiseless regression models,” Forthcoming in IEEE Transactions on Information Theory, 2022
- J. M. Klusowski, “Analyzing CART,” Technical report, 2019 [preprint] (Some content in this paper appeared in “Sparse learning with CART”)
- J. M. Klusowski, “Sharp analysis of a simple model for random forests,” AISTATS, 2021 [proceedings]
Gradient-based Optimization
- X. Chen and J. M. Klusowski, “Stochastic gradient descent for additive nonparametric regression,” 2023 [preprint]
- M. D. Cattaneo, J. M. Klusowski, and B. Shigida, “On the implicit bias of Adam,” Forthcoming in ICML, 2024 [preprint]
Neural Networks
- J. M. Klusowski and J. W. Siegel, “Sharp convergence rates for matching pursuit,” Revise and resubmit at IEEE Transactions on Information Theory, 2023 [preprint]
- J. M. Klusowski, “Total path variation for deep nets with general activation functions,” Technical report, 2019 [preprint]
- A. R. Barron and J. M. Klusowski, “Approximation and estimation for high-dimensional deep learning networks,” Reject and resubmit at IEEE Transactions on Information Theory, 2023 [preprint]
- J. M. Klusowski and A. R. Barron, “Approximation by combinations of ReLU and squared ReLU ridge functions with ℓ1 and ℓ0 controls,” IEEE Transactions on Information Theory, 2018 [preprint] [journal]
- J. M. Klusowski and A. R. Barron, “Risk bounds for high-dimensional ridge function combinations including neural networks,” Technical report, 2018 [preprint]
- J. M. Klusowski and A. R. Barron, “Minimax lower bounds for ridge combinations including neural nets,” Proceedings of the IEEE International Symposium on Information Theory, Aachen, Germany, 2017 [preprint] [proceedings]
High-dimensional Linear Models
- R. Theisen, J. M. Klusowski, and M. W. Mahoney, “Good classifiers are abundant in the interpolating regime,” AISTATS, 2021 [preprint] [proceedings]
- Z. Bu, J. M. Klusowski, C. Rush, and W. J. Su, “Algorithmic analysis and statistical estimation of SLOPE via approximate message passing”
  - NeurIPS, 2019 [proceedings] [poster]
  - Longer version in IEEE Transactions on Information Theory, 2020 [preprint] [journal]
- Z. Bu, J. M. Klusowski, C. Rush, and W. J. Su, “Characterizing the SLOPE trade-off: a variational perspective and the Donoho–Tanner limit,” Annals of Statistics, 2022 [preprint] [journal]
Transfer Learning
- J. Fan, C. Gao, and J. M. Klusowski, “Robust transfer learning with unreliable source data,” Major revision at Annals of Statistics [preprint]
Mixture Models
- J. M. Klusowski, D. Yang, and W. D. Brinda, “Estimating the coefficients of a mixture of two linear regressions by expectation maximization,” IEEE Transactions on Information Theory, 2019 [preprint] [journal]
- J. M. Klusowski and W. D. Brinda, “Statistical guarantees for estimating the centers of a two-component Gaussian mixture by EM,” Technical report, 2016 [preprint]
Network Analysis
- J. M. Klusowski and Y. Wu, “Estimating the number of connected components in a graph via subgraph sampling,” Bernoulli, 2020 [preprint] [journal]
- J. M. Klusowski and Y. Wu, “Counting motifs with graph sampling,” Conference on Learning Theory (COLT), 2018 [preprint] [proceedings] [poster]
Shape-Constrained Estimation
- V. E. Brunel, J. M. Klusowski, and D. Yang, “Estimation of convex supports from noisy measurements,” Bernoulli, 2021 [preprint] [journal]
Miscellaneous
- W. D. Brinda, J. M. Klusowski, and D. Yang, “Hölder’s identity,” Statistics and Probability Letters, 2019 [journal]
- W. D. Brinda and J. M. Klusowski, “Finite-sample risk bounds for maximum likelihood estimation with arbitrary penalties,” IEEE Transactions on Information Theory, 2018 [preprint] [journal]