Bleakley, K., Giudicelli, V., Wu, Y., Lefranc, M.-P. and Biau, G. (2006).
IMGT standardization for statistical analyses of T cell receptor junctions: The TRAV-TRAJ example,
In Silico Biology,
Vol. 6, pp. 573-588.
Biau, G., Chazal, F., Cohen-Steiner, D., Devroye, L. and Rodríguez, C. (2011).
A weighted k-nearest neighbor density estimate for geometric inference,
Electronic Journal of Statistics, Vol. 5, pp. 204-237.
Kruppa, J., Liu, Y., Biau, G., Kohler, M., König, I.R., Malley, J.D. and Ziegler, A. (2014).
Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory,
Biometrical Journal, Vol. 56, pp. 534-563.
Biau, G. and Mason, D.M. (2015).
High-dimensional p-norms,
in Mathematical Statistics and Limit Theorems: Festschrift in Honour of Paul Deheuvels, eds. Hallin, M., Mason, D.M., Pfeifer, D. and Steinebach, J.G., pp. 21-40,
Springer, Cham.
Tanielian, U., Sangnier, M. and Biau, G. (2021).
Approximating Lipschitz continuous functions with GroupSort neural networks,
in Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, eds. Banerjee, A. and Fukumizu, K., Proceedings of Machine Learning Research,
Vol. 130, pp. 442-450, PMLR.
Bénard, C., Biau, G., Da Veiga, S. and Scornet, E. (2021).
Interpretable random forests via rule extraction,
in Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, eds. Banerjee, A. and Fukumizu, K., Proceedings of Machine Learning Research,
Vol. 130, pp. 937-945, PMLR.
Du, Q., Biau, G., Petit, F. and Porcher, R. (2021).
Wasserstein random forests and applications in heterogeneous treatment effects,
in Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, eds. Banerjee, A. and Fukumizu, K., Proceedings of Machine Learning Research,
Vol. 130, pp. 1729-1737, PMLR.
Biau, G. and Cadre, B. (2021).
Optimization by gradient boosting,
in Advances in Contemporary Statistics and Econometrics: Festschrift in Honor of Christine Thomas-Agnan, eds. Daouia, A. and Ruiz-Gazen, A., pp. 23-44,
Springer, Cham.
Fermanian, A., Marion, P., Vert, J.-P. and Biau, G. (2021).
Framing RNN as a kernel method: A neural ODE approach,
in Advances in Neural Information Processing Systems, eds. Ranzato, M., Beygelzimer, A., Dauphin, Y., Liang, P.S. and Wortman Vaughan, J.,
Vol. 34, pp. 3121-3134, Curran Associates, Inc.
Bénard, C., Biau, G., Da Veiga, S. and Scornet, E. (2022).
SHAFF: Fast and consistent SHApley eFfect estimates via random Forests,
in Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, eds. Camps-Valls, G., Ruiz, F.J.R. and Valera, I., Proceedings of Machine Learning Research,
Vol. 151, pp. 5563-5582, PMLR.
Doumèche, N., Bach, F., Biau, G. and Boyer, C. (2024).
Physics-informed machine learning as a kernel method,
in Proceedings of Thirty Seventh Conference on Learning Theory, eds. Agrawal, S. and Roth, A., Proceedings of Machine Learning Research,
Vol. 247, pp. 1399-1450, PMLR.
Stéphanovitch, A., Tanielian, U., Cadre, B., Klutchnikoff, N. and Biau, G. (2024).
Optimal 1-Wasserstein distance for WGANs,
Bernoulli, Vol. 30, pp. 2955-2978.
Fermanian, A., Chang, J., Lyons, T. and Biau, G. (2024).
The insertion method to invert the signature of a path,
in Recent Advances in Econometrics and Statistics: Festschrift in Honour of Marc Hallin, eds. Barigozzi, M., Hörmann, S. and Paindaveine, D., pp. 575-595,
Springer, Cham.
Wu, Y.-H., Marion, P., Biau, G. and Boyer, C. (2025).
Taking a big step: Large learning rates in denoising score matching prevent memorization,
in Proceedings of Thirty Eighth Conference on Learning Theory, eds. Haghtalab, N. and Moitra, A., Proceedings of Machine Learning Research,
Vol. 291, pp. 5718-5756, PMLR.
Doumèche, N., Bach, F., Biau, G. and Boyer, C. (2026).
Fast kernel methods: Sobolev, physics-informed, and additive models,
ICML 2026, in press.
Wu, Y.-H., Berthet, Q., Biau, G., Boyer, C., Elie, R. and Marion, P. (2026).
Optimal stopping in latent diffusion models,
ICML 2026, in press.