The Scalable Highly Adaptive Lasso

A scalable implementation of the highly adaptive lasso algorithm, including routines for constructing sparse matrices of basis functions of the observed data, as well as a custom implementation of lasso regression tailored to enhance efficiency when the matrix of predictors is composed exclusively of indicator functions. For ease of use and increased flexibility, the lasso fitting routines invoke code from the 'glmnet' package by default. The highly adaptive lasso was first formulated and described by MJ van der Laan (2017), with practical demonstrations of its performance given by Benkeser and van der Laan (2016). This implementation of the highly adaptive lasso algorithm was described by Hejazi, Coyle, and van der Laan (2020).
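As a brief illustrative sketch of typical usage (assuming the package's main fitting function `fit_hal()` and its `predict` method, as documented in the package manual), a call might look like:

```r
library(hal9001)

# Simulate a small dataset (illustrative only)
set.seed(1)
n <- 200
x <- matrix(rnorm(n * 3), ncol = 3)
y <- sin(x[, 1]) + x[, 2]^2 + rnorm(n, sd = 0.1)

# Fit the highly adaptive lasso: this constructs the sparse matrix of
# indicator basis functions and fits the lasso (via glmnet by default)
hal_fit <- fit_hal(X = x, Y = y)

# Predict on new observations
preds <- predict(hal_fit, new_data = x)
```

This is a minimal sketch; see the reference manual for the full set of tuning arguments.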




Version 0.4.1 by Jeremy Coyle


Authors: Jeremy Coyle [aut, cre], Nima Hejazi [aut], Rachael Phillips [aut], Lars van der Laan [aut], David Benkeser [ctb], Oleg Sofrygin [ctb], Weixin Cai [ctb], Mark van der Laan [aut, cph, ths]

Documentation: PDF manual

GPL-3 license

Imports Matrix, stats, utils, methods, assertthat, origami, glmnet, data.table, stringr

Depends on Rcpp

Suggests testthat, knitr, rmarkdown, microbenchmark, future, ggplot2, dplyr, tidyr, survival, SuperLearner

Linking to Rcpp, RcppEigen

Imported by haldensify, txshift.

Available on CRAN