bartMachine — by Adam Kapelner, 10 months ago

Bayesian Additive Regression Trees

An advanced implementation of Bayesian Additive Regression Trees with expanded features for data analysis and visualization.
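A minimal usage sketch (data and variable names below are illustrative; the package requires Java via 'rJava'):

```r
# Illustrative sketch of fitting bartMachine to a toy regression problem.
library(bartMachine)

set.seed(1)
X <- data.frame(x1 = runif(100), x2 = runif(100))
y <- sin(2 * pi * X$x1) + rnorm(100, sd = 0.1)

fit <- bartMachine(X = X, y = y)   # continuous response => regression BART
summary(fit)
preds <- predict(fit, X)           # posterior mean predictions
```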

BART — by Rodney Sparapani, 17 days ago

Bayesian Additive Regression Trees

Bayesian Additive Regression Trees (BART) provide flexible nonparametric modeling of covariates for continuous, binary, categorical, and time-to-event outcomes. For more information, see Sparapani, Spanbauer, and McCulloch.
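As a sketch of typical usage for a continuous outcome (toy data; names are illustrative):

```r
# Illustrative sketch: continuous-outcome BART via wbart() from the 'BART' package.
library(BART)

set.seed(1)
n <- 200
x <- matrix(runif(2 * n), n, 2)
y <- x[, 1]^2 + rnorm(n, sd = 0.1)

fit <- wbart(x.train = x, y.train = y)   # continuous outcomes
yhat <- fit$yhat.train.mean              # posterior mean of f(x) at training points
```

Analogous functions cover other outcome types (e.g. probit BART for binary responses).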

bark — by Merlise Clyde, a year ago

Bayesian Additive Regression Kernels

Bayesian Additive Regression Kernels (BARK) provides an implementation of nonparametric function estimation using Lévy random field priors for functions that may be represented as a sum of additive multivariate kernels. Kernels are located at every data point, as in Support Vector Machines; however, coefficients may be heavily shrunk toward zero under the Cauchy process prior, or even set exactly to zero. The number of active features is controlled by priors on precision parameters within the kernels, permitting feature selection. For more details, see Ouyang, Z. (2008), "Bayesian Additive Regression Kernels", PhD dissertation, Duke University, Chapter 3, and Wolpert, R. L., Clyde, M. A., and Tu, C. (2011), "Stochastic Expansions with Continuous Dictionaries: Lévy Adaptive Regression Kernels", Annals of Statistics, 39, 1916-1962.

BayesTree — by Robert McCulloch, 3 months ago

Bayesian Additive Regression Trees

This is an implementation of BART: Bayesian Additive Regression Trees, by Chipman, George, and McCulloch (2010).
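A minimal sketch of the classic interface (toy data; names are illustrative):

```r
# Illustrative sketch: the bart() interface from 'BayesTree'.
library(BayesTree)

set.seed(1)
x <- matrix(runif(200), 100, 2)
y <- x[, 1] + rnorm(100, sd = 0.1)

fit <- bart(x.train = x, y.train = y, x.test = x)
plot(fit)   # sigma draws and fitted-vs-observed diagnostics
```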

bartcs — by Yeonghoon Yoo, 3 months ago

Bayesian Additive Regression Trees for Confounder Selection

Fit Bayesian Additive Regression Trees (BART) models to select true confounders from a large set of potential confounders and to estimate the average treatment effect. For more information, see Kim et al. (2023).

dbarts — by Vincent Dorie, 3 months ago

Discrete Bayesian Additive Regression Trees Sampler

Fits Bayesian additive regression trees (BART; Chipman, George, and McCulloch (2010)) while allowing the updating of predictors or response, so that BART can be incorporated as a conditional model in a Gibbs/Metropolis-Hastings sampler. Also serves as a drop-in replacement for package 'BayesTree'.
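A sketch of both modes of use, assuming toy data (variable names are illustrative):

```r
# Illustrative sketch: dbarts as a drop-in replacement and as a reusable sampler.
library(dbarts)

set.seed(1)
df <- data.frame(x = runif(100))
df$y <- 2 * df$x + rnorm(100, sd = 0.1)

fit <- bart(df$x, df$y, keeptrees = TRUE)   # BayesTree-style interface

# For embedding BART in a larger Gibbs/MH scheme, build a sampler object
# whose predictors/response can be updated between runs:
sampler <- dbarts(y ~ x, data = df)
samples <- sampler$run()
```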

BartMixVs — by Chuji Luo, 2 years ago

Variable Selection Using Bayesian Additive Regression Trees

Bayesian additive regression trees (BART) provide flexible nonparametric modeling of mixed-type predictors for continuous and binary responses. This package is built upon the CRAN R package 'BART', version 2.7 (<https://github.com/cran/BART>). It implements the three variable selection approaches proposed in Luo, C. and Daniels, M. J. (2021), "Variable Selection Using Bayesian Additive Regression Trees", as well as three other existing BART-based variable selection approaches.

bartCause — by Vincent Dorie, a year ago

Causal Inference using Bayesian Additive Regression Trees

Contains a variety of methods to generate typical causal inference estimates using Bayesian Additive Regression Trees (BART) as the underlying regression model (Hill (2012)).
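A minimal sketch of estimating an average treatment effect (simulated data; names are illustrative):

```r
# Illustrative sketch: ATE estimation with bartc() from 'bartCause'.
library(bartCause)

set.seed(1)
n <- 200
x <- matrix(rnorm(n * 3), n, 3)
z <- rbinom(n, 1, plogis(x[, 1]))   # treatment assignment
y <- 0.5 * z + x[, 1] + rnorm(n)    # outcome; simulated true ATE = 0.5

fit <- bartc(response = y, treatment = z, confounders = x,
             estimand = "ate")
summary(fit)   # posterior summary of the average treatment effect
```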

nftbart — by Rodney Sparapani, 5 months ago

Nonparametric Failure Time Bayesian Additive Regression Trees

Nonparametric Failure Time (NFT) Bayesian Additive Regression Trees (BART): time-to-event machine learning with Heteroskedastic Bayesian Additive Regression Trees (HBART) and Low Information Omnibus (LIO) Dirichlet Process Mixtures (DPM). An NFT BART model is of the form Y = mu + f(x) + sd(x) E, where the functions f and sd have BART and HBART priors, respectively, while E is a nonparametric error distribution arising from a DPM LIO prior hierarchy. See the cited reference for a complete description of the model.

bamlss — by Nikolaus Umlauf, a month ago

Bayesian Additive Models for Location, Scale, and Shape (and Beyond)

Infrastructure for estimating probabilistic distributional regression models in a Bayesian framework. The distribution parameters may capture location, scale, shape, etc., and every parameter may depend on complex additive terms (fixed, random, smooth, spatial, etc.), similar to a generalized additive model. The conceptual and computational framework is introduced in Umlauf, Klein, and Zeileis (2019), and the R package in Umlauf, Klein, Simon, and Zeileis (2021).
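A sketch of a Gaussian location-scale model, with smooth terms for both the mean and the standard deviation (simulated data; names are illustrative):

```r
# Illustrative sketch: distributional regression with 'bamlss',
# specifying one formula per distribution parameter.
library(bamlss)

set.seed(1)
d <- data.frame(x = runif(300))
d$y <- sin(2 * pi * d$x) + rnorm(300, sd = exp(-1 + d$x))

f <- list(y ~ s(x), sigma ~ s(x))   # formulas for mu and sigma
fit <- bamlss(f, family = "gaussian", data = d)
summary(fit)
```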