First fits a predictive model of covariates on outcome \(Y\) to construct fitted values \(g(X)\). Then, it runs an ML-based version of the Lin (2013) regression with predictions \(g(X)\): \(Y = \beta_0 + \beta_1 a + \beta_2 g(X) + \beta_3 \, a \times (g(X) - \overline{g(X)})\), where the treatment is interacted with the centered predictions \(g(\cdot)\), a dimension-reduced version of the \(X\)s, instead of the full covariate matrix. The treatment effect is \(\beta_1\), with a heteroskedasticity-robust variance estimate that is asymptotically bounded from above by the naive Neyman variance and is typically lower when the covariates explain substantial variation in \(Y\).

Usage

mlRate(
  y,
  a,
  X,
  nuisMod = c("rlm", "rf"),
  glmnet_lamchoice = "lambda.min",
  glmnet_alpha = 1,
  glmnet_mu_family = "gaussian",
  glmnet_parl = FALSE,
  tuneRf = "none",
  noi = FALSE
)

Arguments

y

outcome vector

a

treatment dummy vector

X

covariate matrix

nuisMod

ML algorithm to fit the nuisance function. "rlm" (the default) fits a regularized linear model via glmnet; "rf" fits a generalized random forest.

glmnet_lamchoice

Choice of lambda (shrinkage parameter) for the regularized linear regression. Only relevant when nuisMod == "rlm".

glmnet_alpha

In [0, 1], choice of alpha in glmnet. 1 (the default) corresponds to L1 regularization (LASSO) and 0 corresponds to L2 regularization (ridge), while intermediate values correspond to a mix of the two (elastic net). Only relevant when nuisMod == "rlm".

glmnet_mu_family

GLM family for the outcome model; "gaussian" by default. Only relevant when nuisMod == "rlm".

glmnet_parl

Boolean for parallelization in glmnet. A parallel cluster must be registered beforehand.

tuneRf

Tuning method for the random forest; "none" (the default) skips tuning. Only relevant when nuisMod == "rf".

noi

Boolean for verbose output.

Value

Treatment effect estimate and its standard error.

References

Guo, Y., Coey, D., Konutgan, M., Li, W., Schoener, C., & Goldman, M. (2021). Machine learning for variance reduction in online experiments. Advances in Neural Information Processing Systems, 34, 8637-8648.

Lin, W. (2013). Agnostic notes on regression adjustments to experimental data: Reexamining Freedman's critique. The Annals of Applied Statistics, 7(1), 295-318.
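
Examples

The two-step procedure described above can be sketched in base R. This is an illustrative toy with simulated data, plain OLS standing in for the ML nuisance model, and a hand-rolled HC0 robust variance; it is not the package's implementation:

```r
## Sketch of the mlRate idea, NOT the package implementation:
## simulated data, OLS in place of glmnet/random forest.
set.seed(1)
n <- 500
X <- matrix(rnorm(n * 5), n, 5)              # covariate matrix
a <- rbinom(n, 1, 0.5)                       # randomized treatment dummy
y <- drop(1 + 2 * a + X %*% c(1, -1, 0.5, 0, 0) + rnorm(n))

## Step 1: predictive model of y on X yields fitted values g(X)
## (mlRate would use glmnet when nuisMod = "rlm" or a forest when "rf")
gX <- fitted(lm(y ~ X))

## Step 2: Lin (2013)-style regression; treatment interacted with the
## centered predictions, so the coefficient on `a` is the effect estimate
gX_c <- gX - mean(gX)
fit <- lm(y ~ a + gX + a:gX_c)
est <- coef(fit)[["a"]]

## HC0 heteroskedasticity-robust standard error for the effect
Z <- model.matrix(fit)
u <- resid(fit)
bread <- solve(crossprod(Z))
V <- bread %*% crossprod(Z * u) %*% bread
se <- sqrt(V["a", "a"])
c(estimate = est, se = se)
```

Because the regression adjusts for \(g(X)\), the robust standard error here is smaller than the unadjusted difference-in-means standard error whenever the predictions capture real outcome variation.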