
Jose Blanchet receives INFORMS APS Best Publication in Applied Probability Award

The biennial award is given by the Applied Probability Society for an outstanding contribution to the field of applied probability.
From left: Yang Kang, Karthyek Murthy, and Jose Blanchet receive their awards at INFORMS 2023 | Photo courtesy of INFORMS

Congratulations to MS&E professor Jose Blanchet!

Professor Blanchet received the 2023 Best Publication in Applied Probability Award from the Applied Probability Society (APS) of the Institute for Operations Research and the Management Sciences (INFORMS). Two of his papers, co-authored with his former postdoctoral researcher Karthyek Murthy and former student Yang Kang (both at Columbia University, where Professor Blanchet previously taught), received the award, which is given every two years.

Links to the winning papers and their abstracts appear below: 

Quantifying distributional model risk via optimal transport (Blanchet and Murthy)

This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality; in particular, we provide examples involving path-dependent expectations of stochastic processes. Our approach consists of computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms of a flexible class of distances from a suitable baseline model. These distances, based on optimal transportation between probability measures, include Wasserstein’s distances as particular cases. The proposed methodology is well suited for risk analysis and distributionally robust optimization, as we demonstrate with applications. We also discuss how to estimate the tolerance region nonparametrically using Skorokhod-type embeddings in some of these applications.
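To give a flavor of the kind of bound the abstract describes, here is a minimal sketch (not the paper's actual algorithm) of a classical special case: for a 1-Lipschitz loss, Wasserstein-1 duality yields a simple worst-case expectation bound over all distributions within transport distance delta of a baseline model. The sample distribution and radius below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=1.0, scale=2.0, size=10_000)  # baseline model samples

def robust_upper_bound(f_values, lipschitz_const, delta):
    """Upper bound on E_Q[f] over every Q within Wasserstein-1 distance
    delta of the baseline, valid when f is Lipschitz:
        sup_Q E_Q[f] <= E_baseline[f] + Lip(f) * delta.
    """
    return f_values.mean() + lipschitz_const * delta

delta = 0.5
# For f(x) = x (Lipschitz constant 1), evaluated on the baseline samples:
bound = robust_upper_bound(samples, lipschitz_const=1.0, delta=delta)

# Sanity check: shifting every sample by delta is an adversarial perturbation
# whose transport cost is exactly delta, and it attains the bound for f(x) = x.
shifted_mean = (samples + delta).mean()
assert abs(shifted_mean - bound) < 1e-9
```

The bound holds for any measure in the Wasserstein ball, which is what makes this style of analysis useful for model-misspecification and distributionally robust settings.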

Robust Wasserstein profile inference and applications to machine learning (Blanchet, Kang, and Murthy)

We show that several machine learning estimators, including the square-root LASSO (Least Absolute Shrinkage and Selection Operator) and regularized logistic regression, can be represented as solutions to distributionally robust optimization (DRO) problems. The associated uncertainty regions are based on suitably defined Wasserstein distances. Hence, our representations allow us to view regularization as a result of introducing an artificial adversary that perturbs the empirical distribution to account for out-of-sample effects in loss estimation. In addition, we introduce RWPI (Robust Wasserstein Profile Inference), a novel inference methodology that extends methods inspired by Empirical Likelihood to the setting of optimal transport costs (of which Wasserstein distances are a particular case). We use RWPI to show how to optimally select the size of uncertainty regions, and as a consequence, we are able to choose regularization parameters for these machine learning estimators without the use of cross-validation. Numerical experiments are also given to validate our theoretical findings.
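For readers unfamiliar with the square-root LASSO mentioned in the abstract, the sketch below fits its objective with a basic proximal-gradient loop in plain NumPy. This is only an illustration of the estimator being reinterpreted, not the paper's DRO construction; the penalty level `lam`, step size, and synthetic data are arbitrary choices, whereas the paper's contribution is to calibrate the penalty via RWPI rather than cross-validation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.0])  # sparse ground truth
y = X @ beta_true + 0.1 * rng.normal(size=n)

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sqrt_lasso(X, y, lam, step=0.01, iters=5000):
    """Proximal-gradient (ISTA) solver for the square-root LASSO:
        min_beta ||y - X beta||_2 / sqrt(n) + lam * ||beta||_1.
    """
    n = len(y)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        resid = y - X @ beta
        # Gradient of the (non-squared) root-mean-square loss term.
        grad = -X.T @ resid / (np.sqrt(n) * np.linalg.norm(resid))
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

beta_hat = sqrt_lasso(X, y, lam=0.1)
```

On this well-conditioned synthetic problem the fitted coefficients land close to the sparse ground truth, with the L1 penalty shrinking the noise coordinates toward zero.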