Compute FAB (Frequentist and Bayes) Conformal Prediction Intervals
Computes and plots prediction intervals for numerical
data or prediction sets for categorical data using prior information.
Empirical Bayes procedures to estimate the prior information from
multi-group data are included. See, e.g., Bersson and Hoff (2022).
Bayes Factors for Hierarchical Linear Models with Continuous Predictors
Runs hierarchical linear Bayesian models. Samples from the posterior
distributions of model parameters in JAGS (Just Another Gibbs Sampler;
Plummer, 2017, <http://mcmc-jags.sourceforge.net>). Computes Bayes factors for group
parameters of interest with the Savage-Dickey density ratio (Wetzels,
Raaijmakers, Jakab, Wagenmakers, 2009).
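The Savage-Dickey density ratio mentioned above estimates the Bayes factor for a point null by comparing the posterior and prior densities at the null value. A minimal sketch in Python (illustrative only; the package itself runs in R via JAGS, and all names here are stand-ins):

```python
import numpy as np
from scipy import stats

# Savage-Dickey density ratio for H0: delta = 0 against a normal prior.
# BF01 = posterior density at 0 / prior density at 0.
rng = np.random.default_rng(0)

# Stand-in for MCMC draws of a group parameter (e.g., sampled in JAGS);
# here simulated as a posterior concentrated away from zero.
posterior_draws = rng.normal(loc=0.4, scale=0.2, size=10_000)

prior = stats.norm(loc=0.0, scale=1.0)              # prior on the parameter
post_density_at_0 = stats.gaussian_kde(posterior_draws)(0.0)[0]
prior_density_at_0 = prior.pdf(0.0)

bf01 = post_density_at_0 / prior_density_at_0       # evidence for H0 over H1
bf10 = 1.0 / bf01
print(f"BF01 = {bf01:.3f}, BF10 = {bf10:.3f}")
```

Because the simulated posterior mass sits away from zero, BF10 exceeds 1, favoring the alternative; a posterior piled up at zero would instead yield BF01 > 1.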
Score Test Integrated with Empirical Bayes for Association Study
Performs association tests within a linear mixed model framework using a score test integrated with empirical Bayes for genome-wide association studies. First, a score test is conducted for each marker under the linear mixed model framework, accounting for genetic relatedness and population structure. Then all potentially associated markers are selected with a less stringent criterion. Finally, the selected markers are placed into a multi-locus model to identify the true quantitative trait nucleotides.
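The three-stage pipeline above can be sketched in simplified form. This Python illustration (names and thresholds are hypothetical; the real method works within a linear mixed model with a kinship matrix, which is omitted here for brevity, and uses empirical Bayes rather than OLS in the final stage):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

n, m = 200, 500                                      # individuals, markers
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)  # genotype matrix (0/1/2)
beta = np.zeros(m)
beta[[10, 250]] = 1.0                                # two true causal markers
y = G @ beta + rng.normal(0, 1, n)                   # simulated phenotype

# Stage 1: marginal score-type test for each marker.
Gc = G - G.mean(axis=0)
yc = y - y.mean()
score = Gc.T @ yc
var = (yc @ yc / n) * np.sum(Gc ** 2, axis=0)
z = score / np.sqrt(var)
pvals = 2 * stats.norm.sf(np.abs(z))

# Stage 2: keep potentially associated markers under a lenient threshold.
selected = np.where(pvals < 0.01)[0]

# Stage 3: fit all selected markers jointly in one multi-locus model
# (plain least squares stands in for the empirical Bayes step).
Xs = np.column_stack([np.ones(n), G[:, selected]])
coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
print("selected markers:", selected)
print("joint effects:", coef[1:].round(2))
```

The lenient stage-2 threshold deliberately admits some false positives; the joint stage-3 fit is what separates true quantitative trait nucleotides from the noise.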
Computation of P Values and Bayes Factors for Conditioning Data
Set of functions for easy analysis of conditioning data.
Hierarchical Bayes Twofold Subarea Level Model SAE
Provides several functions for the area and subarea levels of small area estimation under the Twofold Subarea Level Model using the hierarchical Bayesian (HB) method with a univariate normal distribution for the variables of interest. Simulated datasets produced by a data-generation function are also provided. The 'rjags' package is employed to obtain parameter estimates via the Gibbs sampling algorithm. Model-based estimators involve the HB estimators, which include the mean, the variance of the mean, and the quantile. For reference, see Rao and Molina (2015).
Spike-and-Slab Variational Bayes for Linear and Logistic Regression
Implements variational Bayesian algorithms to perform scalable variable selection for sparse, high-dimensional linear and logistic regression models. Features include a novel prioritized updating scheme, which uses a preliminary estimator of the variational means during initialization to generate an updating order prioritizing larger, more relevant coefficients. Sparsity is induced via spike-and-slab priors with either Laplace or Gaussian slabs. By default, the heavier-tailed Laplace density is used. Formal derivations of the algorithms and asymptotic consistency results may be found in Kolyan Ray and Botond Szabo (JASA 2020) and Kolyan Ray, Botond Szabo, and Gabriel Clara (NeurIPS 2020).
Variational Bayes for Fast and Accurate Empirical Likelihood Inference
Computes the Gaussian variational approximation of the Bayesian
empirical likelihood posterior. This is an implementation of the method
proposed in Yu, W., & Bondell, H. D. (2023).
Small Area Estimation Hierarchical Bayes For Spatial Model
Provides several functions and datasets for the area level of Small Area Estimation under a spatial model using the hierarchical Bayesian (HB) method. Model-based estimators include the HB estimators based on a spatial Fay-Herriot model with a univariate normal distribution for the variable of interest. The 'rjags' package is employed to obtain parameter estimates. For reference, see Rao and Molina (2015).
Bayesian Bayes Factor Design for Two-Arm Binomial Trials
Design and analysis of two-arm binomial clinical (phase II) trials using Bayes factors. Implements Bayes factors for point-null and directional hypotheses, predictive densities under different hypotheses, and power and sample size calibration, optionally with frequentist type-I error and power constraints.
Randomized Feature and Bootstrap-Enhanced Gaussian Naive Bayes Classifier
Provides an accessible and efficient implementation of a randomized
feature and bootstrap-enhanced Gaussian naive Bayes classifier. The method
combines stratified bootstrap resampling with random feature subsampling and
aggregates predictions via posterior averaging. Support is provided for
mixed-type predictors and parallel computation. Methods are described in
Srisuradetchai (2025).
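The ensemble scheme described above can be sketched compactly. A minimal Python illustration (the package itself is in R, so every name here is illustrative, not its API): each member fits Gaussian naive Bayes on a stratified bootstrap sample restricted to a random feature subset, and class posteriors are averaged across members.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_gnb(X, y, classes):
    """Per-class feature means, variances, and log-priors."""
    params = {}
    for c in classes:
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9,
                     np.log(len(Xc) / len(y)))
    return params

def gnb_posterior(params, X, classes):
    """Class posterior probabilities under the Gaussian NB model."""
    logp = np.column_stack([
        params[c][2] - 0.5 * np.sum(
            np.log(2 * np.pi * params[c][1])
            + (X - params[c][0]) ** 2 / params[c][1], axis=1)
        for c in classes])
    logp -= logp.max(axis=1, keepdims=True)     # stabilize before exp
    p = np.exp(logp)
    return p / p.sum(axis=1, keepdims=True)

def ensemble_gnb(X, y, X_new, n_members=25, feat_frac=0.7):
    classes = np.unique(y)
    n_feat = max(1, int(feat_frac * X.shape[1]))
    post = np.zeros((len(X_new), len(classes)))
    for _ in range(n_members):
        # Stratified bootstrap: resample with replacement within each class.
        idx = np.concatenate([rng.choice(np.where(y == c)[0],
                                         size=(y == c).sum(), replace=True)
                              for c in classes])
        # Random feature subsampling for this ensemble member.
        feats = rng.choice(X.shape[1], size=n_feat, replace=False)
        params = fit_gnb(X[idx][:, feats], y[idx], classes)
        post += gnb_posterior(params, X_new[:, feats], classes)
    return classes[np.argmax(post / n_members, axis=1)]  # posterior averaging

# Toy two-class data with well-separated Gaussian clusters.
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)
pred = ensemble_gnb(X, y, X)
print("training accuracy:", (pred == y).mean())
```

Averaging posteriors rather than majority-voting hard labels keeps each member's confidence information, which is the design choice the description highlights.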