
Found 188 packages in 0.03 seconds

BFpack — by Joris Mulder, 3 months ago

Flexible Bayes Factor Testing of Scientific Expectations

Implementation of default Bayes factors for testing statistical hypotheses under various statistical models. The package is intended for applied quantitative researchers in the social and behavioral sciences, medical research, and related fields. The Bayes factor tests can be executed for statistical models such as univariate and multivariate normal linear models, correlation analysis, generalized linear models, special cases of linear mixed models, survival models, and relational event models. Parameters that can be tested are location parameters (e.g., group means, regression coefficients), variances (e.g., group variances), and measures of association (e.g., polychoric/polyserial/biserial/tetrachoric/product-moment correlations), among others. The statistical underpinnings are described in O'Hagan (1995), De Santis and Spezzaferri (2001), Mulder and Xin (2022), Mulder and Gelissen (2019), Mulder (2016), Mulder and Fox (2019), Mulder and Fox (2013), Boeing-Messing, van Assen, Hofman, Hoijtink, and Mulder (2017), Hoijtink, Mulder, van Lissa, and Gu (2018), Gu, Mulder, and Hoijtink (2018), Hoijtink, Gu, and Mulder (2018), and Hoijtink, Gu, Mulder, and Rosseel (2018). When using the package, please cite Mulder et al. (2021) and the relevant methodological papers.
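As a sketch of how such a test is typically run: `BF()` is BFpack's main entry point and accepts a fitted model object; the model and the informative hypothesis string below are illustrative assumptions, not taken from the package documentation.

```r
# Sketch: testing an order constraint on regression coefficients with BFpack.
# Assumes the BFpack package is installed; model and hypothesis are illustrative.
library(BFpack)

# An ordinary linear model on a built-in dataset
fit <- lm(mpg ~ wt + hp, data = mtcars)

# Bayes factor test of a joint informative hypothesis against its complement
res <- BF(fit, hypothesis = "wt < 0 & hp < 0")
summary(res)
```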

bbl — by Jun Woo, 3 years ago

Boltzmann Bayes Learner

Supervised learning using Boltzmann Bayes model inference, which extends the naive Bayes model to include interactions. Enables classification of data into multiple response groups based on a large number of discrete predictors that can take factor values of heterogeneous levels. Either pseudo-likelihood or mean-field inference can be used with L2 regularization, cross-validation, and prediction on new data.

BFF — by Rachael Shudde, 9 months ago

Bayes Factor Functions

Bayes factors represent the ratio of probabilities assigned to data by competing scientific hypotheses. However, one drawback of Bayes factors is their dependence on prior specifications that define null and alternative hypotheses. Additionally, there are challenges in their computation. To address these issues, we define Bayes factor functions (BFFs) directly from common test statistics. BFFs express Bayes factors as a function of the prior densities used to define the alternative hypotheses. These prior densities are centered on standardized effects, which serve as indices for the BFF. Therefore, BFFs offer a summary of evidence in favor of alternative hypotheses that correspond to a range of scientifically interesting effect sizes. Such summaries remove the need for arbitrary thresholds to determine "statistical significance." BFFs are available in closed form and can be easily computed from z, t, chi-squared, and F statistics. They depend on hyperparameters "r" and "tau^2", which determine the shape and scale of the prior distributions defining the alternative hypotheses. Plots of BFFs versus effect size provide informative summaries of hypothesis tests that can be easily aggregated across studies.

EBrank — by John Ferguson, 8 years ago

Empirical Bayes Ranking

Empirical Bayes ranking applicable to parallel-estimation settings where the estimated parameters are asymptotically unbiased and normal, with known standard errors. A mixture-normal prior for each parameter is estimated using empirical Bayes methods; subsequently, ranks for each parameter are simulated from the resulting joint posterior over all parameters (the marginal posterior densities for each parameter are assumed independent). Finally, experiments are ordered by expected posterior rank, although computations minimizing other plausible rank-loss functions are also given.

bayesplay — by Lincoln John Colling, 2 years ago

The Bayes Factor Playground

A lightweight modelling syntax for defining likelihoods and priors and for computing Bayes factors for simple one parameter models. It includes functionality for computing and plotting priors, likelihoods, and model predictions. Additional functionality is included for computing and plotting posteriors.
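A minimal sketch of that workflow, assuming bayesplay's `likelihood()`, `prior()`, and `integral()` functions compose as shown; all numeric values below are illustrative.

```r
# Sketch: a point-null vs. normal-alternative Bayes factor with bayesplay.
# Assumes the bayesplay package; all values are illustrative.
library(bayesplay)

data_model <- likelihood(family = "normal", mean = 0.3, sd = 0.1)  # observed effect
h0 <- prior(family = "point", point = 0)                           # null hypothesis
h1 <- prior(family = "normal", mean = 0, sd = 0.5)                 # alternative

m0 <- integral(data_model * h0)  # marginal likelihood under H0
m1 <- integral(data_model * h1)  # marginal likelihood under H1
bf10 <- m1 / m0                  # Bayes factor in favor of H1
```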

KCSNBShiny — by Karne Chaithanya Sai, 6 years ago

Naive Bayes Classifier

Predicts any variable in any categorical dataset for given values of the predictor variables. If a dataset contains 4 variables, then any one variable can be predicted from the values of the other three supplied by the user. Users can upload their own datasets and select which variable they want to predict. A 'handsontable' is provided to enter the predictor values, and the accuracy of the prediction is also shown.

geoBayes — by Evangelos Evangelou, 7 months ago

Analysis of Geostatistical Data using Bayes and Empirical Bayes Methods

Functions to fit geostatistical data. The data can be continuous, binary or count data, and the models implemented are flexible. Conjugate priors are assumed on some parameters while inference on the other parameters can be done through a full Bayesian analysis or by empirical Bayes methods.

BayesVarSel — by Gonzalo Garcia-Donato, 4 months ago

Bayes Factors, Model Choice and Variable Selection in Linear Models

Bayes factors and posterior probabilities in linear models, aimed at providing a formal Bayesian answer to testing and variable selection problems.
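A sketch of a typical call, assuming `Bvs()` is the package's main variable-selection function; the formula and data below are illustrative.

```r
# Sketch: Bayesian variable selection over competing submodels with BayesVarSel.
# Assumes the BayesVarSel package; model and data are illustrative.
library(BayesVarSel)

sel <- Bvs(formula = mpg ~ wt + hp + qsec, data = mtcars)
sel  # posterior probabilities of the competing submodels
```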

BsMD — by Ernesto Barrios, 2 years ago

Bayes Screening and Model Discrimination

Bayes screening and model discrimination follow-up designs.

deconvolveR — by Balasubramanian Narasimhan, 5 years ago

Empirical Bayes Estimation Strategies

Empirical Bayes methods for learning prior distributions from data. An unknown prior distribution (g) has yielded (unobservable) parameters, each of which produces a data point from a parametric exponential family (f). The goal is to estimate the unknown prior ("g-modeling") by deconvolution and Empirical Bayes methods. Details and examples are in the paper by Narasimhan and Efron (2020).
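A sketch of g-modeling with the package's `deconv()` function for the Poisson family; the simulated data and grid are illustrative assumptions.

```r
# Sketch: estimating an unknown prior g by deconvolution with deconvolveR.
# Assumes the deconvolveR package; data and grid are illustrative.
library(deconvolveR)

set.seed(1)
theta <- rgamma(1000, shape = 2, rate = 1)   # unobservable parameters drawn from g
X <- rpois(1000, lambda = theta)             # observed Poisson counts
tau <- seq(0.1, 10, length.out = 50)         # grid on which g is estimated

fit <- deconv(tau = tau, X = X, family = "Poisson")
head(fit$stats)  # estimated g and its standard errors on the tau grid
```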