Bayes Factors for Hierarchical Linear Models with Continuous Predictors
Runs hierarchical linear Bayesian models. Samples from the posterior
distributions of model parameters in JAGS (Just Another Gibbs Sampler;
Plummer, 2017, <http://mcmc-jags.sourceforge.net>). Computes Bayes factors for group
parameters of interest with the Savage-Dickey density ratio (Wetzels,
Raaijmakers, Jakab, & Wagenmakers, 2009).
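The Savage-Dickey density ratio itself is simple to state: for a point null δ = δ0 nested in the full model, BF01 is the posterior density at δ0 divided by the prior density at δ0. A minimal illustrative sketch (plain Python rather than JAGS; a conjugate normal model stands in for the MCMC output, with a kernel density estimate playing the role it would play on real posterior draws):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated data: group effect delta whose true value is 0 (null is true).
y = rng.normal(loc=0.0, scale=1.0, size=50)

# Stand-in "posterior sample" for delta under a N(0, 1) prior: with a
# conjugate normal model the posterior is known in closed form, so we
# mimic MCMC output by drawing from it directly.
n = len(y)
post_var = 1.0 / (n + 1.0)          # prior precision 1 + data precision n
post_mean = post_var * y.sum()
draws = rng.normal(post_mean, np.sqrt(post_var), size=20_000)

# Savage-Dickey: BF01 = posterior density at delta=0 / prior density at 0.
post_at_0 = stats.gaussian_kde(draws)(0.0)[0]
prior_at_0 = stats.norm.pdf(0.0, loc=0.0, scale=1.0)
bf01 = post_at_0 / prior_at_0
print(bf01)
```

Because the data were generated under the null, the posterior piles up near zero and BF01 comes out above 1, i.e. evidence for the null.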
Score Test Integrated with Empirical Bayes for Association Study
Performs association tests within a linear mixed model framework using a score test integrated with empirical Bayes for genome-wide association studies. First, a score test is conducted for each marker under the linear mixed model, accounting for genetic relatedness and population structure. Then, all potentially associated markers are selected with a less stringent criterion. Finally, the selected markers are placed into a multi-locus model to identify the true quantitative trait nucleotides.
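The per-marker score test in the first stage is attractive because the null model is fit only once and its residuals are reused for every marker. A toy sketch of that idea in a plain linear model (illustrative only; it ignores the relatedness and population-structure adjustments that the mixed model provides):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy genotype matrix (n individuals x p markers, coded 0/1/2) and phenotype.
n, p = 200, 5
G = rng.integers(0, 3, size=(n, p)).astype(float)
beta_true = np.array([0.8, 0.0, 0.0, 0.0, 0.0])   # only marker 0 is causal
y = G @ beta_true + rng.normal(size=n)

# Null model: intercept only. The score test reuses the null residuals,
# so the model is fit once, not once per marker.
r = y - y.mean()
sigma2 = r @ r / (n - 1)

pvals = []
for j in range(p):
    g = G[:, j] - G[:, j].mean()   # centre the marker
    u = g @ r                       # score
    v = sigma2 * (g @ g)            # its variance under the null
    stat = u * u / v                # ~ chi^2(1) under the null
    pvals.append(stats.chi2.sf(stat, df=1))

print(np.round(pvals, 4))
```

The causal marker yields a tiny p-value while the null markers do not; the "less stringent criterion" in the second stage would then pass several candidates on to the multi-locus model.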
Hierarchical Bayes Twofold Subarea Level Model SAE
Provides several functions for area and subarea levels of small area estimation under the Twofold Subarea Level Model using the hierarchical Bayesian (HB) method, with a univariate normal distribution for the variables of interest. Simulated datasets produced by a data-generation function are also provided. The 'rjags' package is employed to obtain parameter estimates using the Gibbs sampling algorithm. Model-based estimators include the HB estimators of the mean, the variation of the mean, and the quantiles. For reference, see Rao and Molina (2015).
Computation of P Values and Bayes Factors for Conditioning Data
Set of functions for the easy analysis of conditioning data.
Spike-and-Slab Variational Bayes for Linear and Logistic Regression
Implements variational Bayesian algorithms to perform scalable variable selection for sparse, high-dimensional linear and logistic regression models. Features include a novel prioritized updating scheme, which uses a preliminary estimator of the variational means during initialization to generate an updating order prioritizing large, more relevant coefficients. Sparsity is induced via spike-and-slab priors with either Laplace or Gaussian slabs. By default, the heavier-tailed Laplace density is used. Formal derivations of the algorithms and asymptotic consistency results may be found in Kolyan Ray and Botond Szabo (JASA 2020) and Kolyan Ray, Botond Szabo, and Gabriel Clara (NeurIPS 2020).
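The spike-and-slab prior referred to above mixes a point mass at zero (the "spike") with a continuous heavy-tailed density (the "slab"); a quick prior draw makes the induced sparsity concrete (illustrative Python, not the package's own code):

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_spike_slab(p, w=0.1, slab_scale=1.0):
    """Draw p coefficients from a spike-and-slab prior: each is exactly
    zero with probability 1 - w, otherwise drawn from a heavy-tailed
    Laplace slab (the Gaussian slab is the alternative)."""
    active = rng.random(p) < w
    beta = np.zeros(p)
    beta[active] = rng.laplace(scale=slab_scale, size=active.sum())
    return beta

beta = draw_spike_slab(1000, w=0.05)
print(np.mean(beta != 0))   # fraction of nonzero coefficients, roughly w
```

Most coefficients come out exactly zero, which is what lets the posterior (and its variational approximation) perform genuine variable selection rather than mere shrinkage.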
Small Area Estimation Hierarchical Bayes For Spatial Model
Provides several functions and datasets for area-level small area estimation under a spatial model using the hierarchical Bayesian (HB) method. Model-based estimators include the HB estimators based on a spatial Fay-Herriot model with a univariate normal distribution for the variable of interest. The 'rjags' package is employed to obtain parameter estimates. For reference, see Rao and Molina (2015).
Variational Bayes for Fast and Accurate Empirical Likelihood Inference
Computes the Gaussian variational approximation of the Bayesian
empirical likelihood posterior. This is an implementation of the method
proposed in Yu, W., & Bondell, H. D. (2023).
Classification and Visualization
Miscellaneous functions for classification and visualization, e.g. regularized discriminant analysis, sknn() kernel-density naive Bayes, an interface to 'svmlight', stepclass() wrapper variable selection for supervised classification, partimat() visualization of classification rules, shardsplot() of cluster results, kmodes() clustering for categorical data, corclust() variable clustering, variable extraction from different variable clustering models, and weight-of-evidence preprocessing.
Objective Bayes Intrinsic Conditional Autoregressive Model for Areal Data
Implements an objective Bayesian approach for modeling spatially correlated areal data, placing an intrinsic conditional autoregressive (ICAR) prior on a vector of spatial random effects.
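The ICAR prior is defined through a precision matrix built from the neighbourhood structure of the areas. A small sketch on a hypothetical 2x2 grid of areas shows its key property: the precision matrix is rank-deficient, so the prior is improper and only constrains differences between neighbouring effects, not their overall level:

```python
import numpy as np

# Adjacency for a 2x2 grid of areas (rook neighbours), labelled:
# 0-1
# | |
# 2-3
W = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# ICAR precision matrix: Q = D - W, where D holds each area's
# neighbour count on the diagonal. Rows of Q sum to zero, so Q is
# singular; for a connected neighbourhood graph its rank is n - 1.
D = np.diag(W.sum(axis=1))
Q = D - W
print(Q)
print(np.linalg.matrix_rank(Q))
```

In practice a sum-to-zero constraint on the random effects is imposed to handle this impropriety, and an intercept absorbs the overall level.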
Set Alpha Based on Sample Size Using Bayes Factors
Sets the alpha level for coefficients in a regression model
as a decreasing function of the sample size through the use of
Jeffreys' approximate Bayes factor. You tell alphaN() your sample
size, and it tells you the value to which you must lower alpha to avoid
Lindley's Paradox. For details, see Wulff and Taylor (2024).
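A textbook form of Jeffreys' approximate Bayes factor is BF01 ≈ sqrt(n)·exp(−z²/2); solving BF01 = 1 for a just-significant two-sided z-test gives an alpha that shrinks with n. The sketch below uses that form as an assumption (the exact expression used by alphaN() may differ, e.g. in its prior scaling):

```python
import math
from scipy.stats import norm

def alpha_for_n(n):
    """Alpha at which a just-significant two-sided z-test gives
    Jeffreys' approximate Bayes factor BF01 = sqrt(n)*exp(-z^2/2) = 1,
    so 'significant' results never end up supporting H0 (Lindley's
    Paradox: a fixed alpha combined with a huge n does exactly that)."""
    z_star = math.sqrt(math.log(n))        # z solving sqrt(n)*exp(-z^2/2) = 1
    return 2.0 * (1.0 - norm.cdf(z_star))  # two-sided tail probability

for n in (50, 500, 5000, 50000):
    print(n, round(alpha_for_n(n), 5))
```

The printed alphas decrease monotonically in n, which is the package's central point: the larger the sample, the stricter the significance threshold must be.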