Computation of P Values and Bayes Factors for Conditioning Data
Set of functions for the easy analysis of conditioning data.
Small Area Estimation Hierarchical Bayes For Spatial Model
Provides several functions and datasets for area-level small area estimation under a spatial model using the Hierarchical Bayesian (HB) method. Model-based estimators include HB estimators based on a spatial Fay-Herriot model with a univariate normal distribution for the variable of interest. The 'rjags' package is employed to obtain parameter estimates. For details, see Rao and Molina (2015).
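For context, the spatial Fay-Herriot model referred to above is typically specified as follows (a standard formulation; the notation is assumed here rather than taken from the package):

    y_i = \theta_i + e_i, \qquad \theta_i = x_i^\top \beta + u_i, \qquad u = \rho W u + v, \quad v \sim N(0, \sigma_u^2 I),

where y_i is the direct survey estimate for area i, e_i \sim N(0, \psi_i) is sampling error with known variance \psi_i, W is a row-standardized proximity matrix, and \rho captures spatial autocorrelation among the area effects.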
Spike-and-Slab Variational Bayes for Linear and Logistic Regression
Implements variational Bayesian algorithms to perform scalable variable selection for sparse, high-dimensional linear and logistic regression models. Features include a novel prioritized updating scheme, which uses a preliminary estimator of the variational means during initialization to generate an updating order that prioritizes larger, more relevant coefficients. Sparsity is induced via spike-and-slab priors with either Laplace or Gaussian slabs. By default, the heavier-tailed Laplace density is used. Formal derivations of the algorithms and asymptotic consistency results may be found in Kolyan Ray and Botond Szabo (JASA 2020) and Kolyan Ray, Botond Szabo, and Gabriel Clara (NeurIPS 2020).
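For reference, a spike-and-slab prior of the kind described above places on each coefficient (standard notation, assumed here):

    \theta_j \mid \gamma_j \sim \gamma_j \, \mathrm{Laplace}(0, \lambda) + (1 - \gamma_j) \, \delta_0, \qquad \gamma_j \sim \mathrm{Bernoulli}(w),

where \delta_0 is a point mass at zero; the variational algorithm then fits a mean-field approximation to the posterior over (\theta, \gamma), and the Gaussian-slab variant replaces the Laplace density with a normal one.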
Variational Bayes for Fast and Accurate Empirical Likelihood Inference
Computes the Gaussian variational approximation of the Bayesian empirical likelihood posterior. This is an implementation of the method described in Yu, W., & Bondell, H. D. (2023).
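As a sketch of the idea (generic variational-Bayes notation, assumed here): the Gaussian variational approximation restricts the approximating family to multivariate normals and maximizes the evidence lower bound,

    q^*(\theta) = \arg\max_{q = N(\mu, \Sigma)} \; E_q[\log p(\theta) + \log L_{EL}(\theta) - \log q(\theta)],

where p(\theta) is the prior and L_{EL}(\theta) the empirical likelihood.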
Classification and Visualization
Miscellaneous functions for classification and visualization, e.g., regularized discriminant analysis; sknn() kernel-density naive Bayes; an interface to 'svmlight'; stepclass() wrapper variable selection for supervised classification; partimat() visualization of classification rules; shardsplot() of cluster results; kmodes() clustering for categorical data; corclust() variable clustering; variable extraction from different variable clustering models; and weight-of-evidence preprocessing.
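A minimal usage sketch with two of the functions named above (assumes the iris data shipped with R; treat the argument details as assumptions):

    library(klaR)
    # Draw pairwise LDA classification regions for the iris species
    partimat(Species ~ ., data = iris, method = "lda")
    # k-modes clustering needs categorical data, so discretize iris first
    iris_cat <- data.frame(lapply(iris[, 1:4], cut, breaks = 3))
    cl <- kmodes(iris_cat, modes = 3)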
Objective Bayes Intrinsic Conditional Autoregressive Model for Areal Data
Implements an objective Bayes intrinsic conditional autoregressive (ICAR) prior, providing an objective Bayesian approach to modeling spatially correlated areal data via an ICAR prior on a vector of spatial random effects.
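For background, an ICAR prior on spatial effects u specifies each full conditional in terms of the neighboring areas (standard notation, assumed here):

    u_i \mid u_{-i} \sim N\Big( \tfrac{1}{n_i} \sum_{j \sim i} u_j, \; \tau^2 / n_i \Big),

where j \sim i indexes the neighbors of area i and n_i is their count, so each effect is shrunk toward the average of its neighbors.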
Empirical Bayes Variable Selection via ICM/M Algorithm
Empirical Bayes variable selection via the ICM/M algorithm for normal, binary logistic, and Cox regression. The basic problem is to fit a high-dimensional regression with sparse coefficients. This package allows incorporation of an Ising prior to capture the structure of predictors in the modeling process. More information can be found in the papers listed in the URL below.
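As context, an Ising prior on the inclusion indicators \gamma \in \{0,1\}^p takes the form (standard notation, assumed here):

    p(\gamma) \propto \exp\Big( a \sum_j \gamma_j + b \sum_{j \sim k} \gamma_j \gamma_k \Big),

where j \sim k runs over edges of a graph on the predictors, so that structurally related predictors tend to be selected together.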
Set Alpha Based on Sample Size Using Bayes Factors
Sets the alpha level for coefficients in a regression model as a decreasing function of the sample size through the use of Jeffreys' Approximate Bayes factor. You tell alphaN() your sample size, and it tells you the value to which you must lower alpha to avoid Lindley's Paradox. For details, see Wulff and Taylor (2023).
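A hypothetical usage sketch based only on the description above (the exact signature of alphaN() is an assumption, not taken from the package documentation):

    # Planned regression with n = 250 observations: ask which alpha
    # avoids Lindley's Paradox at this sample size (hypothetical call)
    alpha <- alphaN(n = 250)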
Derive Polygenic Risk Score Based on Empirical Bayes Theory
EB-PRS is a novel method that leverages information on effect sizes across all markers to improve prediction accuracy. No parameter tuning is needed in the method, and no external information is needed. This R package provides the calculation of polygenic risk scores from given training summary statistics and testing data. We can use EB-PRS to extract main information, estimate empirical Bayes parameters, derive polygenic risk scores for each individual in the testing data, and evaluate the PRS according to AUC and predictive r2. See Song et al. (2020).
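For context, a polygenic risk score for individual i is a weighted allele count (generic definition; in this method the weights w_j would be the empirical-Bayes shrunken effect sizes):

    PRS_i = \sum_{j=1}^{m} w_j \, G_{ij},

where G_{ij} \in \{0, 1, 2\} counts the risk alleles of marker j carried by individual i.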
Fitting Shared Atoms Nested Models via Variational Bayes
An efficient tool for fitting the nested common and shared atoms models using variational Bayes approximate inference for fast computation. Specifically, the package implements the common atoms model (Denti et al., 2023), its finite version (D'Angelo et al., 2023), and a hybrid finite-infinite model.
All models use Gaussian mixtures with a normal-inverse-gamma prior distribution on the parameters. Additional functions are provided to help analyze the results of the fitting procedure.
References:
Denti, Camerlenghi, Guindani, Mira (2023)
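For reference, the normal-inverse-gamma prior on each Gaussian atom (\mu, \sigma^2) mentioned above factorizes as (standard parameterization, assumed here):

    \sigma^2 \sim \mathrm{InvGamma}(a_0, b_0), \qquad \mu \mid \sigma^2 \sim N(m_0, \sigma^2 / k_0).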