
Found 116 packages in 0.01 seconds

BayesFactor — by Richard D. Morey, 2 years ago

Computation of Bayes Factors for Common Designs

A suite of functions for computing various Bayes factors for simple designs, including contingency tables, one- and two-sample designs, one-way designs, general ANOVA designs, and linear regression.
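
As a hedged illustration of the package in use, here is a minimal two-sample Bayes factor; the choice of the built-in sleep data is an assumption for demonstration only:

```r
library(BayesFactor)

# Bayes factor for a paired two-sample comparison on the built-in sleep data
bf <- ttestBF(x = sleep$extra[sleep$group == 1],
              y = sleep$extra[sleep$group == 2],
              paired = TRUE)
bf  # evidence for a mean difference relative to the point null
```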

e1071 — by David Meyer, 4 months ago

Misc Functions of the Department of Statistics, Probability Theory Group (Formerly: E1071), TU Wien

Functions for latent class analysis, short time Fourier transform, fuzzy clustering, support vector machines, shortest path computation, bagged clustering, naive Bayes classifier, ...
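
A minimal sketch of two of the listed tools; the iris data and model formulas are illustrative assumptions:

```r
library(e1071)

# Support vector machine classifier
fit <- svm(Species ~ ., data = iris)
predict(fit, head(iris))

# Naive Bayes classifier from the same package
nb <- naiveBayes(Species ~ ., data = iris)
predict(nb, head(iris))
```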

EbayesThresh — by Peter Carbonetto, 3 years ago

Empirical Bayes Thresholding and Related Methods

Empirical Bayes thresholding using the methods developed by I. M. Johnstone and B. W. Silverman. The basic problem is to estimate a mean vector given a vector of observations of the mean vector plus white noise, taking advantage of possible sparsity in the mean vector. Within a Bayesian formulation, the elements of the mean vector are modelled as having, independently, a distribution that is a mixture of an atom of probability at zero and a suitable heavy-tailed distribution. The mixing parameter can be estimated by a marginal maximum likelihood approach. This leads to an adaptive thresholding approach on the original data. Extensions of the basic method, in particular to wavelet thresholding, are also implemented within the package.
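
A small sketch of the basic problem described above: a sparse mean vector observed with white noise, thresholded via the package's main function (the simulated setup is an assumption):

```r
library(EbayesThresh)

# Sparse mean vector plus unit-variance Gaussian noise
set.seed(1)
mu <- c(rep(0, 90), rep(5, 10))   # mostly zeros, a few strong signals
x  <- mu + rnorm(100)

# Empirical Bayes thresholding with a Laplace prior on the nonzero means
muhat <- ebayesthresh(x, prior = "laplace", sdev = 1)
sum(abs(muhat) > 0)   # how many coordinates survive thresholding
```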

bridgesampling — by Quentin F. Gronau, a month ago

Bridge Sampling for Marginal Likelihoods and Bayes Factors

Provides functions for estimating marginal likelihoods, Bayes factors, posterior model probabilities, and normalizing constants in general, via different versions of bridge sampling (Meng & Wong, 1996, <http://www3.stat.sinica.edu.tw/statistica/j6n4/j6n43/j6n43.htm>). See also Gronau, Singmann, & Wagenmakers (2020).
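
A hedged sketch using the package's matrix interface on a toy target whose normalizing constant is known; the unnormalized standard normal example is an assumption, not from the listing:

```r
library(bridgesampling)

# Toy target: unnormalized standard normal, whose true log normalizing
# constant is log(sqrt(2 * pi)), roughly 0.919
log_post <- function(pars, data) -0.5 * pars["x"]^2

set.seed(1)
samples <- matrix(rnorm(1e4), ncol = 1, dimnames = list(NULL, "x"))

fit <- bridge_sampler(samples = samples, log_posterior = log_post,
                      data = NULL, lb = c(x = -Inf), ub = c(x = Inf),
                      silent = TRUE)
fit$logml   # should be close to 0.919
```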

naivebayes — by Michal Majka, 23 days ago

High Performance Implementation of the Naive Bayes Algorithm

In this implementation of the Naive Bayes classifier, the following class-conditional distributions are available: Bernoulli, Categorical, Gaussian, Poisson, and a non-parametric representation of the class-conditional density estimated via kernel density estimation. The implemented classifiers handle missing data and can take advantage of sparse data.
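
A minimal usage sketch (data choice is illustrative):

```r
library(naivebayes)

# Gaussian Naive Bayes on iris; numeric predictors get Gaussian
# class-conditional distributions by default
nb <- naive_bayes(Species ~ ., data = iris)
head(predict(nb, newdata = iris, type = "prob"))
```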

REBayes — by Roger Koenker, 2 months ago

Empirical Bayes Estimation and Inference

Kiefer-Wolfowitz maximum likelihood estimation for mixture models and some other density estimation and regression methods based on convex optimization. See Koenker and Gu (2017), "REBayes: An R Package for Empirical Bayes Mixture Methods", Journal of Statistical Software, 82, 1--26.
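
A rough sketch of the Kiefer-Wolfowitz NPMLE for a Gaussian location mixture; the simulated three-point mixture is an assumption, and note the package depends on the Rmosek interface to the MOSEK solver:

```r
library(REBayes)  # requires Rmosek / MOSEK to be installed

# Observations drawn around a 3-point mixture of means
set.seed(1)
theta <- sample(c(-2, 0, 2), 1000, replace = TRUE)
x     <- theta + rnorm(1000)

# Kiefer-Wolfowitz maximum likelihood estimate of the mixing distribution
f <- GLmix(x)
plot(f, main = "Estimated mixing density")
```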

bain — by Caspar J van Lissa, 23 days ago

Bayes Factors for Informative Hypotheses

Computes approximated adjusted fractional Bayes factors for equality, inequality, and about-equality constrained hypotheses. S3 methods are available for specific types of lm() models, namely ANOVA, ANCOVA, and multiple regression, and for t_test(). The statistical underpinnings are described in Gu, Mulder, and Hoijtink (2018), Hoijtink, Gu, and Mulder (2018), and Hoijtink, Gu, Mulder, and Rosseel (2018).
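
A hedged sketch of the lm() interface; the mtcars model and the particular hypothesis string are illustrative assumptions:

```r
library(bain)

# Informative hypotheses about regression coefficients from an lm() fit
fit <- lm(mpg ~ wt + hp, data = mtcars)

# Two competing hypotheses, separated by ";": both slopes negative,
# versus both slopes zero
res <- bain(fit, hypothesis = "wt < 0 & hp < 0; wt = 0 & hp = 0")
res
```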

geostatsp — by Patrick Brown, 2 months ago

Geostatistical Modelling with Likelihood and Bayes

Geostatistical modelling facilities using Raster and SpatialPoints objects are provided. Non-Gaussian models are fit using INLA, and Gaussian geostatistical models use maximum likelihood estimation. For details see Brown (2015).
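
A rough sketch of the Gaussian/MLE side of the package, assuming the swissRain and swissAltitude example objects ship with the package and that lgm() accepts the arguments shown; grid resolution and the fixed shape parameter are assumptions:

```r
library(geostatsp)

# Gaussian geostatistical model fit by maximum likelihood on the
# Swiss rainfall example data
data("swissRain")
fit <- lgm(formula = rain ~ CHE_alt, data = swissRain, grid = 20,
           covariates = swissAltitude, shape = 1, fixShape = TRUE)
fit$summary   # parameter estimates (assumed output slot)
```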

ashr — by Peter Carbonetto, a month ago

Methods for Adaptive Shrinkage, using Empirical Bayes

The R package 'ashr' implements an Empirical Bayes approach for large-scale hypothesis testing and false discovery rate (FDR) estimation based on the methods proposed in M. Stephens (2016), "False discovery rates: a new deal". These methods can be applied whenever two sets of summary statistics---estimated effects and standard errors---are available, just as 'qvalue' can be applied to previously computed p-values. Two main interfaces are provided: ash(), which is more user-friendly, and ash.workhorse(), which has more options and is geared toward advanced users. Both ash() and ash.workhorse() provide a flexible modeling interface that can accommodate a variety of likelihoods (e.g., normal, Poisson) and mixture priors (e.g., uniform, normal).
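
A minimal sketch of the two-summary-statistics workflow described above; the simulated effects are an assumption:

```r
library(ashr)

# Simulated effects: half null, half real, observed with known standard errors
set.seed(1)
beta    <- c(rep(0, 500), rnorm(500))
sebeta  <- rep(1, 1000)
betahat <- rnorm(1000, mean = beta, sd = sebeta)

# Adaptive shrinkage from estimated effects and their standard errors
res <- ash(betahat, sebeta)
head(get_lfsr(res))   # local false sign rates
head(get_pm(res))     # posterior mean (shrunken) effect estimates
```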

KCSNBShiny — by Karne Chaithanya Sai, 9 months ago

Naive Bayes Classifier

Predicts any variable in any categorical dataset for given values of the predictor variables. If a dataset contains four variables, any one of them can be predicted from user-supplied values of the other three. Users can upload their own datasets and select which variable to predict. A 'handsontable' is provided for entering predictor values, and the accuracy of the prediction is also shown.
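
Since the package is a Shiny front end, the only programmatic step is launching the app. The launcher function name below is an assumption, not confirmed by the listing:

```r
library(KCSNBShiny)

# Assumed launcher function; opens the point-and-click interface in a browser
KCSNB()
```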