Boxiang Wang
Address: 261 Schaeffer Hall, Iowa City, IA 52242
Phone: 319-335-2294
E-mail: boxiang-wang at uiowa.edu
Publications
Tang, Q., Gu, Y., and Wang, B. (2025) "fastkqr: A fast algorithm for kernel quantile regression", Journal of Computational and Graphical Statistics, in press.
Wang, T., Yang, J., Li, Y., and Wang, B. (2025) "PIE: Partially interpretable estimators with refinement", INFORMS Journal on Computing, in press.
Moon, C., Zhang, M., Wang, B., Gardner, S., Geerling, J., and Hoth, K. (2025) "Multiple chronic conditions and polypharmacy in cognitively unimpaired older adults are associated with subsequent cognitive decline: results from the National Alzheimer’s Coordinating Center Data", Archives of Gerontology and Geriatrics, 134, 105846.
Tang, Q., Zhang, Y., and Wang, B. (2024) "Finite smoothing algorithm for high-dimensional support vector machines and quantile regression", International Conference on Machine Learning (ICML), 235, 47865–47884.
Wang, B., Zhou, L., Yang, J., and Mai, Q. (2024) "Density-convoluted tensor support vector machines", Statistics and Its Interface, 17(2), 231-247.
Zhou, L., Wang, B., and Zou, H. (2024) "Sparse convoluted rank regression in high dimensions", Journal of the American Statistical Association, 119(546), 1500-1512.
Wang, B., Wu, Y., and Ye, C. (2023) "The ART of transfer learning: an adaptive and robust pipeline", Stat, 12(1), e582.
Lian, Y., Yang, A.Y., Wang, B., Shi, P., and Platt, R. W. (2023) "A Tweedie compound Poisson model in reproducing kernel Hilbert space", Technometrics, 65(2), 281-295.
Wang, B., Zhou, L., Gu, Y., and Zou, H. (2023) "Density-convoluted support vector machines for high-dimensional classification", IEEE Transactions on Information Theory, 69(4), 2523-2536.
Wang, B. and Yang, A.Y. (2022) "A consolidated cross-validation algorithm for support vector machines via data reduction", Advances in Neural Information Processing Systems (NeurIPS), 35, 1-12.
Wang, B. and Zou, H. (2022) "Fast and exact cross-validation theory for large-margin classification", Technometrics, 64(3), 291-298.
Wang, B. and Zou, H. (2021) "Honest leave-one-out cross-validation for estimating post-tuning generalization error", Stat, 10(1), e413.
Hao, B., Wang, B., Zhang, J., Wang, P., Yang, J., and Sun, W.W. (2021) "Sparse tensor additive regression", Journal of Machine Learning Research, 22(64), 1-43.
Wang, B. and Zou, H. (2019) "A multicategory kernel distance weighted discrimination method for multiclass classification", Technometrics, 61(3), 396-408.
Wang, B. and Zou, H. (2018) "Another look at distance-weighted discrimination", Journal of the Royal Statistical Society, Series B, 80(1), 177-198.
Koerner, T.K., Zhang, Y., Nelson, P., Wang, B., and Zou, H. (2017) "Neural indices of phonemic discrimination and sentence-level speech intelligibility in quiet and noise: A P3 study", Hearing Research, 350, 58-67.
Wang, B. and Zou, H. (2016) "Sparse distance weighted discrimination", Journal of Computational and Graphical Statistics, 25(3), 826-838.
Koerner, T.K., Zhang, Y., Nelson, P., Wang, B., and Zou, H. (2016) "Neural indices of phonemic discrimination and sentence-level speech intelligibility in quiet and noise: A mismatch negativity study", Hearing Research, 339, 40-49.
Ning, W., Yeh, A., Wu, X., and Wang, B. (2015) "A nonparametric phase I control chart for individual observations based on empirical likelihood ratio", Quality and Reliability Engineering International, 31(1), 37-55.
Software
R package: hdqr 
The R package implements a finite smoothing algorithm that efficiently obtains exact solutions for penalized quantile regression models.
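For reference, the canonical objective behind such models is the lasso-penalized quantile regression problem (a sketch; the penalties actually supported by hdqr may differ):

    \min_{\beta_0,\,\beta}\ \frac{1}{n}\sum_{i=1}^{n}\rho_\tau\!\left(y_i-\beta_0-x_i^{\top}\beta\right)+\lambda\|\beta\|_1,
    \qquad \rho_\tau(u)=u\left(\tau-\mathbb{1}\{u<0\}\right).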
R package: hdsvm 
The R package implements an efficient algorithm for sparse penalized support vector machine models using the generalized coordinate descent algorithm. It is designed to handle high-dimensional datasets effectively, with an emphasis on precision and computational efficiency.
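As a point of reference, the generic sparse-penalized SVM objective (written here with an elastic-net penalty; the exact penalty options in hdsvm may differ) is

    \min_{\beta_0,\,\beta}\ \frac{1}{n}\sum_{i=1}^{n}\left[1-y_i\left(\beta_0+x_i^{\top}\beta\right)\right]_{+}+\lambda_1\|\beta\|_1+\frac{\lambda_2}{2}\|\beta\|_2^2,
    \qquad y_i\in\{-1,+1\}.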
R package: PIE 
The R package implements a novel predictive model, Partially Interpretable Estimators (PIE), which jointly trains an interpretable model and a black-box model to achieve high predictive performance as well as partial model interpretability.
R package: fastkqr 
The R package efficiently fits and tunes kernel quantile regression models based on the majorization-minimization method. It can also fit multiple quantile curves simultaneously with a novel non-crossing penalty.
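For context, the basic kernel quantile regression problem (a sketch; fastkqr's formulation with the non-crossing penalty adds further terms) is

    \min_{f\in\mathcal{H}_K}\ \frac{1}{n}\sum_{i=1}^{n}\rho_\tau\!\left(y_i-f(x_i)\right)+\frac{\lambda}{2}\|f\|_{\mathcal{H}_K}^{2},

where \rho_\tau is the quantile check loss and \mathcal{H}_K is the reproducing kernel Hilbert space induced by the chosen kernel.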
R package: dcsvm 
The R package implements an efficient algorithm for solving sparse-penalized support vector machines with kernel density convolution. This package is designed for high-dimensional classification tasks, supporting lasso (L1) and elastic-net penalties for sparse feature selection and providing options for tuning kernel bandwidth and penalty weights. The package is applicable to fields such as bioinformatics, image analysis, and text classification, where high-dimensional data commonly arise.
R package: ARTtransfer 
The R package implements a flexible framework for transfer learning that integrates information from auxiliary data sources to improve model performance on primary tasks. It is designed to be robust against negative transfer by including the non-transfer model in the candidate pool, ensuring stable performance even when auxiliary datasets are less informative.
R package: ktweedie 
The R package fits kernel-based Tweedie compound Poisson gamma models using high-dimensional predictors for the analysis of zero-inflated response variables. The package features built-in estimation, prediction, and cross-validation tools and supports a choice of kernel functions.
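For context, a sketch of the usual kernel Tweedie setup (the exact formulation in ktweedie may differ):

    Y \mid x \sim \mathrm{Tweedie}\left(\mu(x), \phi, \rho\right), \qquad \mathrm{Var}(Y\mid x)=\phi\,\mu(x)^{\rho}, \quad 1<\rho<2, \qquad \log\mu(x)=f(x),\ f\in\mathcal{H}_K,

so the response has positive probability mass at zero while the predictor enters through an RKHS function f.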
R package: kerndwd 
The R package kerndwd uses the majorization-minimization principle to solve the linear DWD. It also formulates the kernel DWD in a reproducing kernel Hilbert space and solves it with the same algorithm. The package includes very fast tuning procedures and delivers prediction accuracy highly comparable to that of the kernel SVM.
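A minimal usage sketch, assuming the kerndwd(), cv.kerndwd(), rbfdot(), and predict() interfaces as documented on CRAN (argument names and the lambda.min component are assumptions that may differ across versions):

    library(kerndwd)
    set.seed(1)
    n <- 100; p <- 5
    x <- matrix(rnorm(n * p), n, p)
    y <- sign(x[, 1] + 0.5 * rnorm(n))            # class labels coded as -1 / +1
    lambda <- 10^seq(3, -3, length.out = 50)      # tuning path
    kern <- rbfdot(sigma = 1)                     # Gaussian (RBF) kernel
    cv <- cv.kerndwd(x, y, kern, qval = 1, lambda = lambda)      # cross-validate lambda
    fit <- kerndwd(x, y, kern, qval = 1, lambda = cv$lambda.min)
    pred <- predict(fit, kern, x, newx = x)       # in-sample class predictions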
R package: sdwd 
The R package sdwd uses coordinate descent to solve sparse distance weighted discrimination for high-dimensional classification. The package computes the entire solution path for the lasso, elastic-net, and adaptive lasso/elastic-net penalties. The implementation is efficient, employing computational techniques such as the strong rule, warm starts, and active sets, and is extremely fast compared with some sparse support vector machine implementations.
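A minimal usage sketch, assuming the sdwd() and cv.sdwd() interfaces as documented on CRAN (the glmnet-style s = "lambda.min" argument is an assumption that may differ across versions):

    library(sdwd)
    set.seed(1)
    n <- 100; p <- 400                            # high-dimensional setting (p > n)
    x <- matrix(rnorm(n * p), n, p)
    y <- sign(x[, 1] + x[, 2] + 0.5 * rnorm(n))   # class labels coded as -1 / +1
    fit <- sdwd(x, y, lambda2 = 1)                # elastic-net penalized DWD path
    cv <- cv.sdwd(x, y, lambda2 = 1, nfolds = 5)  # cross-validated lambda path
    beta <- coef(cv, s = "lambda.min")            # sparse coefficient estimates
    pred <- predict(cv, newx = x[1:5, ], s = "lambda.min")      # predicted labels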
R package: CUSUMdesign 
The R package CUSUMdesign employs a Markov chain algorithm to compute the average run length and the decision interval when designing CUSUM charts. The CUSUM chart is widely used in statistical process control for detecting small but persistent shifts.
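For context, the one-sided CUSUM statistic underlying such designs is the standard recursion (k is the reference value, h the decision interval):

    S_0 = 0, \qquad S_t = \max\!\left(0,\ S_{t-1} + x_t - k\right), \qquad \text{signal an alarm when } S_t > h.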
Honors & Awards
Deans' Micro Research Grants, University of Iowa, 2020
Travel Support for Symposium on Data Science and Statistics, 2019
ICSA China Conference Junior Researcher Award, 2019
IMS New Researcher Travel Award, 2019
Student Paper Award, Section of SLDM, Joint Statistical Meetings, 2016
Doctoral Dissertation Fellowship, University of Minnesota, 2016
Google Spotlight Presentation 1st Prize, 2016
Bernard W Lindgren Teaching Award, University of Minnesota, 2016
Martin-Buehler Fellowship, University of Minnesota, 2015
Machine Learning Summer School Fellowship, 2014
Alumni Fellowship, University of Minnesota, 2014
First Year Scholarship, University of Minnesota, 2012
James A. Sullivan Outstanding Graduate Award, BGSU, 2012
Robert A. Pattern Book Scholarship, BGSU, 2011