Found 196 packages in 0.01 seconds
Recommended Learners for 'mlr3'
Recommended Learners for 'mlr3'. Extends 'mlr3' with interfaces to essential machine learning packages on CRAN. This includes, but is not limited to: (penalized) linear and logistic regression, linear and quadratic discriminant analysis, k-nearest neighbors, naive Bayes, support vector machines, and gradient boosting.
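For context, a minimal sketch of fitting one of these learners through the 'mlr3' interface (assuming 'mlr3', 'mlr3learners', and the underlying 'e1071' package are installed; the task and split below are illustrative):

  library(mlr3)
  library(mlr3learners)

  # Built-in classification task and the naive Bayes learner provided by mlr3learners
  task    <- tsk("sonar")
  learner <- lrn("classif.naive_bayes")

  # Hold-out split, train, predict, and score
  split <- partition(task, ratio = 0.7)
  learner$train(task, row_ids = split$train)
  prediction <- learner$predict(task, row_ids = split$test)
  prediction$score(msr("classif.acc"))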
Empirical Bayes Methods for Pharmacovigilance
A suite of empirical Bayes methods for use in pharmacovigilance. Contains various model fitting and post-processing functions. For more details see Tan et al. (2025).
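As background, a standard empirical Bayes device in pharmacovigilance is gamma-Poisson shrinkage of observed-to-expected reporting ratios. The base-R sketch below illustrates that generic idea only; it is not this package's interface, and the simulated counts and Gamma prior are assumptions:

  set.seed(1)
  expected <- runif(50, 1, 20)                         # expected reports per drug-event pair
  observed <- rpois(50, expected * rgamma(50, 2, 2))   # observed reports with overdispersion

  # Marginal (negative binomial) log-likelihood when the rate ratio has a Gamma(a, b) prior
  negll <- function(par) {
    a <- exp(par[1]); b <- exp(par[2])
    -sum(dnbinom(observed, size = a, prob = b / (b + expected), log = TRUE))
  }
  fit <- optim(c(0, 0), negll)
  a <- exp(fit$par[1]); b <- exp(fit$par[2])

  # Posterior mean of each reporting ratio, shrunk toward the prior mean a / b
  shrunk <- (a + observed) / (b + expected)
  head(cbind(raw = observed / expected, shrunk = shrunk))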
Naive Bayes Transmission Analysis
Estimates the relative transmission probabilities between cases in an infectious disease outbreak or cluster using naive Bayes. Included are various functions that use these probabilities to estimate transmission parameters such as the generation/serial interval and the reproductive number, to quantify the contribution of covariates to the probabilities, and to visualize the results. The ideal use case is an infectious disease dataset with metadata on the majority of cases but more informative data, such as contact tracing or pathogen whole genome sequencing, on only a subset of cases. For a detailed description of the methods see Leavitt et al. (2020).
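To make the core idea concrete, the sketch below scores case pairs as linked or unlinked with a naive Bayes model on pair-level covariates, using the 'e1071' package rather than this package's own functions; the simulated data and variable names are assumptions:

  library(e1071)  # provides naiveBayes()

  set.seed(42)
  n <- 500
  linked <- rbinom(n, 1, 0.2)
  pairs <- data.frame(
    linked     = factor(linked, levels = c(0, 1), labels = c("no", "yes")),
    same_ward  = factor(rbinom(n, 1, ifelse(linked == 1, 0.8, 0.4))),
    days_apart = cut(rexp(n, 1 / 20), breaks = c(0, 7, 30, Inf))
  )

  fit <- naiveBayes(linked ~ same_ward + days_apart, data = pairs)
  # Relative probability that each case pair is a transmission pair
  probs <- predict(fit, pairs, type = "raw")[, "yes"]
  head(probs)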
Empirical Bayes Matrix Factorization
Methods for matrix factorization based on Wang and Stephens (2021) <https://jmlr.org/papers/v22/20-589.html>.
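As a rough illustration of the model behind empirical Bayes matrix factorization (not the cited paper's software interface), the sketch below fits a rank-one factorization Y ~ l f' by alternating empirical Bayes shrinkage updates; the simulated data, known residual variance, and normal prior family are all simplifying assumptions:

  set.seed(1)
  n <- 100; p <- 40
  l_true <- rnorm(n); f_true <- rnorm(p)
  Y <- tcrossprod(l_true, f_true) + matrix(rnorm(n * p, sd = 0.5), n, p)

  # Empirical Bayes normal-means shrinkage: estimate the prior variance from the
  # estimates themselves, then return posterior means under a N(0, s2_prior) prior
  eb_shrink <- function(x, s2) {
    s2_prior <- max(mean(x^2) - s2, 0)
    x * s2_prior / (s2_prior + s2)
  }

  tau <- 1 / 0.5^2   # residual precision, treated as known for simplicity
  f <- rnorm(p)      # random start
  for (iter in 1:50) {
    l <- eb_shrink(Y %*% f / sum(f^2), 1 / (tau * sum(f^2)))
    f <- eb_shrink(crossprod(Y, l) / sum(l^2), 1 / (tau * sum(l^2)))
  }
  # Agreement between the fitted rank-one approximation and the noiseless truth
  cor(as.vector(tcrossprod(l, f)), as.vector(tcrossprod(l_true, f_true)))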
Geostatistical Modelling with Likelihood and Bayes
Geostatistical modelling facilities using 'SpatRaster' and 'SpatVector' objects are provided. Non-Gaussian models are fit using 'INLA', and Gaussian geostatistical models use Maximum Likelihood Estimation. For details see Brown (2015).
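To show the maximum likelihood piece in plain R (a generic sketch with an exponential covariance on simulated points, not this package's 'SpatRaster'/'SpatVector' interface):

  set.seed(1)
  n <- 80
  coords <- cbind(runif(n), runif(n))
  d <- as.matrix(dist(coords))

  # Simulate a Gaussian field with covariance sigma2 * exp(-d / range) plus a nugget
  sigma2 <- 2; range_par <- 0.3; nugget <- 0.1
  Sigma <- sigma2 * exp(-d / range_par) + diag(nugget, n)
  y <- drop(crossprod(chol(Sigma), rnorm(n)))

  # Negative log-likelihood over log-parameters (constant term omitted)
  negll <- function(par) {
    s2 <- exp(par[1]); rg <- exp(par[2]); ng <- exp(par[3])
    cS <- chol(s2 * exp(-d / rg) + diag(ng, n))
    sum(log(diag(cS))) + 0.5 * sum(backsolve(cS, y, transpose = TRUE)^2)
  }
  fit <- optim(log(c(1, 0.1, 0.1)), negll)
  exp(fit$par)   # estimated sigma2, range, nugget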
Group Sequential Bayes Design
Group sequential operating characteristics for clinical, Bayesian two-arm trials with known sigma and normal endpoints, as described in Gerber and Gsponer (2016).
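For intuition, the sketch below estimates operating characteristics of a two-stage, two-arm Bayesian trial with known sigma, normal endpoints, and a flat prior on the treatment difference by Monte Carlo; it is a generic illustration rather than this package's interface, and the sample sizes and decision threshold are assumptions:

  oc <- function(delta, n_per_stage = 30, sigma = 1, go_prob = 0.975, n_sim = 2000) {
    set.seed(1)
    success <- replicate(n_sim, {
      diff_sum <- 0; n_total <- 0; hit <- FALSE
      for (stage in 1:2) {
        x <- rnorm(n_per_stage, delta, sigma)   # treatment arm
        y <- rnorm(n_per_stage, 0,     sigma)   # control arm
        diff_sum <- diff_sum + sum(x) - sum(y)
        n_total  <- n_total + n_per_stage
        # Flat prior + known sigma: posterior of the difference is
        # Normal(diff_sum / n_total, sigma * sqrt(2 / n_total))
        post_prob <- 1 - pnorm(0, diff_sum / n_total, sigma * sqrt(2 / n_total))
        if (post_prob > go_prob) { hit <- TRUE; break }   # stop for success
      }
      hit
    })
    mean(success)
  }
  oc(delta = 0)    # false-positive rate under no treatment effect
  oc(delta = 0.5)  # power under a true effect of 0.5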
Implementation of the Empirical Bayes Method
Implements a Bayesian-like approach to the high-dimensional sparse linear regression problem based on an empirical or data-dependent prior distribution, which can be used for estimation of and inference on the model parameters, variable selection, and prediction of a future response. The method was first presented in Martin, Mess, and Walker (2017).
Project Code - Nonparametric Bayes
Basic implementation of a Gibbs sampler for a Chinese Restaurant Process, along with some visual aids to help understand how the sampling works. It was developed as part of a postgraduate project for an Advanced Bayesian Nonparametrics course and is inspired by Tamara Broderick's presentation on nonparametric Bayesian statistics given at the Simons Institute.
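For reference, a Chinese Restaurant Process prior over partitions can be drawn in a few lines of base R; the sketch below is a generic customer-by-customer seating simulation, not this project's code, and the concentration parameter is chosen arbitrarily:

  # Customer i joins an existing table with probability proportional to its size,
  # or opens a new table with probability proportional to alpha.
  rcrp <- function(n, alpha = 1) {
    tables <- integer(n)
    tables[1] <- 1
    counts <- 1
    for (i in 2:n) {
      probs <- c(counts, alpha) / (i - 1 + alpha)
      k <- sample(length(probs), 1, prob = probs)
      if (k > length(counts)) counts <- c(counts, 1) else counts[k] <- counts[k] + 1
      tables[i] <- k
    }
    tables
  }

  set.seed(7)
  table(rcrp(100, alpha = 2))   # table (cluster) sizes from one draw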
Understand and Describe Bayesian Models and Posterior Distributions
Provides utilities to describe posterior distributions and Bayesian models. It includes point estimates such as the Maximum A Posteriori (MAP) estimate and measures of dispersion such as the Highest Density Interval (HDI; Kruschke, 2015).
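A minimal example of these utilities applied to a vector of posterior draws (they also accept fitted model objects from packages such as 'rstanarm' or 'brms'; the simulated draws are illustrative):

  library(bayestestR)

  set.seed(123)
  posterior <- rnorm(4000, mean = 0.4, sd = 0.2)   # stand-in for MCMC draws of one parameter

  point_estimate(posterior)       # median, mean, and MAP point estimates
  map_estimate(posterior)         # Maximum A Posteriori estimate
  hdi(posterior, ci = 0.89)       # 89% Highest Density Interval
  describe_posterior(posterior)   # combined summary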
Bayesian Network Structure Learning, Parameter Learning and Inference
Bayesian network structure learning, parameter learning and inference. This package implements constraint-based (PC, GS, IAMB, Inter-IAMB, Fast-IAMB, MMPC, Hiton-PC, HPC), pairwise (ARACNE and Chow-Liu), score-based (Hill-Climbing and Tabu Search) and hybrid (MMHC, RSMAX2, H2PC) structure learning algorithms for discrete, Gaussian and conditional Gaussian networks, along with many score functions and conditional independence tests. The Naive Bayes and the Tree-Augmented Naive Bayes (TAN) classifiers are also implemented. Some utility functions (model comparison and manipulation, random data generation, arc orientation testing, simple and advanced plots) are included, as well as support for parameter estimation (maximum likelihood and Bayesian) and inference, conditional probability queries, cross-validation, bootstrap and model averaging. Development snapshots with the latest bugfixes are available from <https://www.bnlearn.com/>.
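A short example of the typical workflow with the bundled 'learning.test' data, covering score-based structure learning, maximum likelihood parameter fitting, an approximate conditional probability query, and the naive Bayes classifier (the particular query is illustrative):

  library(bnlearn)

  data(learning.test)                     # small discrete dataset shipped with the package
  dag <- hc(learning.test)                # hill-climbing structure learning
  fitted <- bn.fit(dag, learning.test)    # maximum likelihood parameter estimation

  # Approximate conditional probability query: P(A = "a" | B = "b")
  set.seed(1)
  cpquery(fitted, event = (A == "a"), evidence = (B == "b"))

  # Naive Bayes classifier with A as the class variable (in-sample prediction)
  nb <- naive.bayes(learning.test, training = "A")
  pred <- predict(nb, learning.test)
  table(pred, learning.test$A)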