
Found 148 packages in 0.01 seconds

BayesFactor — by Richard D. Morey, 3 years ago

Computation of Bayes Factors for Common Designs

A suite of functions for computing various Bayes factors for simple designs, including contingency tables, one- and two-sample designs, one-way designs, general ANOVA designs, and linear regression.
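
As a minimal sketch of one of these designs, a one-sample Bayes factor via ttestBF() (the simulated data are illustrative, not from the package documentation):

    library(BayesFactor)
    x <- rnorm(30, mean = 0.3)   # illustrative data
    ttestBF(x = x, mu = 0)       # Bayes factor for H1: mean != 0 vs. H0: mean == 0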

e1071 — by David Meyer, 8 days ago

Misc Functions of the Department of Statistics, Probability Theory Group (Formerly: E1071), TU Wien

Functions for latent class analysis, short time Fourier transform, fuzzy clustering, support vector machines, shortest path computation, bagged clustering, naive Bayes classifier, generalized k-nearest neighbour ...
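
A minimal sketch of two of these functions, using the built-in iris data:

    library(e1071)
    nb <- naiveBayes(Species ~ ., data = iris)   # naive Bayes classifier
    predict(nb, head(iris))
    sv <- svm(Species ~ ., data = iris)          # support vector machine
    predict(sv, head(iris))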

bridgesampling — by Quentin F. Gronau, 5 months ago

Bridge Sampling for Marginal Likelihoods and Bayes Factors

Provides functions for estimating marginal likelihoods, Bayes factors, posterior model probabilities, and normalizing constants in general, via different versions of bridge sampling (Meng & Wong, 1996, <http://www3.stat.sinica.edu.tw/statistica/j6n4/j6n43/j6n43.htm>). See also Gronau, Singmann, & Wagenmakers (2020).
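
A minimal sketch, assuming fit0 and fit1 are already-fitted 'stanfit' objects (hypothetical names, not supplied by the package):

    library(bridgesampling)
    b0 <- bridge_sampler(fit0)   # log marginal likelihood of model 0
    b1 <- bridge_sampler(fit1)   # log marginal likelihood of model 1
    bf(b1, b0)                   # Bayes factor of model 1 over model 0
    post_prob(b0, b1)            # posterior model probabilities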

EbayesThresh — by Peter Carbonetto, 4 years ago

Empirical Bayes Thresholding and Related Methods

Empirical Bayes thresholding using the methods developed by I. M. Johnstone and B. W. Silverman. The basic problem is to estimate a mean vector given a vector of observations of the mean vector plus white noise, taking advantage of possible sparsity in the mean vector. Within a Bayesian formulation, the elements of the mean vector are modelled as having, independently, a distribution that is a mixture of an atom of probability at zero and a suitable heavy-tailed distribution. The mixing parameter can be estimated by a marginal maximum likelihood approach. This leads to an adaptive thresholding approach on the original data. Extensions of the basic method, in particular to wavelet thresholding, are also implemented within the package.
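
A minimal sketch of the basic mean-plus-white-noise setup (the sparse mean vector below is simulated for illustration):

    library(EbayesThresh)
    theta <- c(rep(0, 90), rnorm(10, mean = 5))   # sparse mean vector
    x <- theta + rnorm(100)                       # observations = mean + white noise
    ebayesthresh(x, prior = "laplace", sdev = 1)  # adaptive empirical Bayes thresholding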

REBayes — by Roger Koenker, 2 years ago

Empirical Bayes Estimation and Inference

Kiefer-Wolfowitz maximum likelihood estimation for mixture models and some other density estimation and regression methods based on convex optimization. See Koenker and Gu (2017), "REBayes: An R Package for Empirical Bayes Mixture Methods", Journal of Statistical Software, 82, 1--26.
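
A minimal sketch of the Kiefer-Wolfowitz NPMLE for a Gaussian location mixture via GLmix() (note that REBayes relies on the Rmosek solver being installed; the data here are simulated for illustration):

    library(REBayes)
    x <- rnorm(500, mean = sample(c(0, 3), 500, replace = TRUE))  # two-component mixture
    f <- GLmix(x)   # NPMLE of the mixing distribution
    plot(f)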

naivebayes — by Michal Majka, 2 years ago

High Performance Implementation of the Naive Bayes Algorithm

In this implementation of the naive Bayes classifier, the following class-conditional distributions are available: Bernoulli, categorical, Gaussian, Poisson, and a non-parametric representation of the class-conditional density estimated via kernel density estimation. The implemented classifiers handle missing data and can take advantage of sparse data.
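
A minimal sketch using the formula interface on the built-in iris data (numeric predictors get Gaussian class-conditional distributions by default):

    library(naivebayes)
    m <- naive_bayes(Species ~ ., data = iris)
    predict(m, newdata = head(iris), type = "prob")   # class probabilities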

ashr — by Peter Carbonetto, 2 years ago

Methods for Adaptive Shrinkage, using Empirical Bayes

The R package 'ashr' implements an Empirical Bayes approach for large-scale hypothesis testing and false discovery rate (FDR) estimation based on the methods proposed in M. Stephens (2016), "False discovery rates: a new deal". These methods can be applied whenever two sets of summary statistics---estimated effects and standard errors---are available, just as 'qvalue' can be applied to previously computed p-values. Two main interfaces are provided: ash(), which is more user-friendly, and ash.workhorse(), which has more options and is geared toward advanced users. Both also provide a flexible modeling interface that can accommodate a variety of likelihoods (e.g., normal, Poisson) and mixture priors (e.g., uniform, normal).
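
A minimal sketch of the more user-friendly ash() interface (the effect estimates and standard errors below are simulated for illustration):

    library(ashr)
    betahat   <- rnorm(100)          # estimated effects
    sebetahat <- rep(1, 100)         # their standard errors
    res <- ash(betahat, sebetahat)   # empirical Bayes shrinkage
    head(get_lfsr(res))              # local false sign rates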

KCSNBShiny — by Karne Chaithanya Sai, 2 years ago

Naive Bayes Classifier

Predicts any variable in any categorical dataset for given values of the predictor variables. If a dataset contains four variables, any one of them can be predicted from the values of the other three supplied by the user. Users can upload their own datasets and select which variable to predict. A 'handsontable' is provided for entering the predictor values, and the accuracy of the prediction is also shown.

bbl — by Jun Woo, 8 months ago

Boltzmann Bayes Learner

Supervised learning using Boltzmann Bayes model inference, which extends the naive Bayes model to include interactions. Enables classification of data into multiple response groups based on a large number of discrete predictors that can take factor values of heterogeneous levels. Either pseudo-likelihood or mean field inference can be used with L2 regularization, cross-validation, and prediction on new data. See Woo et al. (2016).
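
A minimal sketch, assuming a data frame dat of discrete (factor) predictors with a response column y (hypothetical data, not supplied by the package), and assuming the package's bbl() formula interface:

    library(bbl)
    fit <- bbl(y ~ .^2, data = dat, method = "pseudo")  # pairwise interactions, pseudo-likelihood
    predict(fit, newdata = dat)                         # classify new observations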

EBrank — by John Ferguson, 5 years ago

Empirical Bayes Ranking

Empirical Bayes ranking applicable to parallel-estimation settings where the estimated parameters are asymptotically unbiased and normal with known standard errors. A mixture-normal prior for each parameter is estimated using empirical Bayes methods; ranks for each parameter are subsequently simulated from the resulting joint posterior over all parameters (the marginal posterior densities for each parameter are assumed independent). Finally, experiments are ordered by expected posterior rank, although computations minimizing other plausible rank-loss functions are also given.