Models Multivariate Cases Using Random Forests
Models and predicts multiple output features in a single random forest, taking the
linear relations among the output features into account; see details in Rahman et al. (2017).
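The idea of predicting several correlated outputs from one forest can be sketched in Python with scikit-learn, which fits multi-output regression forests natively (this is an illustrative analogue, not the package's R implementation):

```python
# A minimal multi-output random forest sketch using scikit-learn
# (an assumption for illustration; the package itself is an R implementation).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Two correlated output features: y2 depends linearly on y1.
y1 = X[:, 0] + 0.1 * rng.normal(size=200)
y2 = 2.0 * y1 + 0.1 * rng.normal(size=200)
Y = np.column_stack([y1, y2])

# A single forest is fit for both outputs jointly.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, Y)
pred = rf.predict(X[:5])
print(pred.shape)  # (5, 2): one prediction per output feature
```

Fitting one forest jointly, rather than one forest per output, lets splits that are informative for both targets be shared across them.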
Extensible, Parallelizable Implementation of the Random Forest Algorithm
Scalable implementation of classification and regression forests, as described by Breiman (2001).
Autoencoding Random Forests
Autoencoding Random Forests ('RFAE') provide a method to
autoencode mixed-type tabular data using Random Forests ('RF'), which
involves projecting the data to a latent feature space of user-chosen
dimensionality (usually a lower dimension), and then decoding the latent
representations back into the input space. The encoding stage is useful for
feature engineering and data visualisation tasks, akin to how principal
component analysis ('PCA') is used, and the decoding stage is useful
for compression and denoising tasks. At its core, 'RFAE' is a
post-processing pipeline on a trained random forest model. This means
that it can accept any trained RF of 'ranger' object type: 'RF', 'URF' or
'ARF'. Because of this, it inherits Random Forests' robust performance and
capacity to seamlessly handle mixed-type tabular data. For more details, see
Vu et al. (2025).
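The encode/decode pipeline described above can be roughly imitated in Python (this is not the 'RFAE' algorithm, just an analogue of the idea): scikit-learn's RandomTreesEmbedding gives each row a one-hot leaf-coordinate representation, a truncated SVD projects that to a user-chosen latent dimension, and a crude decoder maps a latent point back to the input space via its nearest training neighbour:

```python
# Hedged sketch of the encode/decode idea, NOT the RFAE method itself:
# forest leaf coordinates -> low-dimensional latent space -> back to inputs.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomTreesEmbedding
from sklearn.decomposition import TruncatedSVD

X = load_iris().data

# Encode: project each row to forest-leaf space, then to 2 latent dims.
emb = RandomTreesEmbedding(n_estimators=100, random_state=0).fit(X)
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(emb.transform(X))

# Decode (crude stand-in): return the training row whose latent
# representation is nearest to the query latent point.
def decode(z):
    return X[np.argmin(np.linalg.norm(Z - z, axis=1))]

x_hat = decode(Z[0])
print(Z.shape, x_hat.shape)  # (150, 2) (4,)
```

As in the package description, the 2-D encoding `Z` is the kind of representation one would use for visualisation, PCA-style, while `decode` illustrates the return trip to the input space.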
Distributional Random Forests
An implementation of distributional random forests as introduced in Cevid, Michel, Naf, Meinshausen, and Buhlmann (2022).
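Distributional forests rest on the general forest-weighting view: a fitted forest assigns each training response a weight (its co-leaf frequency with the query point), and the weighted empirical distribution estimates the full conditional distribution, not just its mean. A Python sketch of that weighting step, using a standard scikit-learn forest rather than the DRF splitting rule, which is an assumption here:

```python
# Sketch of forest weights (co-leaf frequency), not the DRF splitting rule.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 3))
y = X[:, 0] + 0.2 * rng.normal(size=300)

rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

x0 = X[:1]
leaves_train = rf.apply(X)     # (n_train, n_trees) leaf indices
leaves_x0 = rf.apply(x0)[0]    # (n_trees,)

# Weight of training point i: average over trees of
# 1{same leaf as x0} / (number of training points in that leaf).
w = np.zeros(len(y))
for t in range(leaves_train.shape[1]):
    same = leaves_train[:, t] == leaves_x0[t]
    w[same] += 1.0 / same.sum()
w /= leaves_train.shape[1]

# The weights define a conditional distribution of y given x0;
# e.g. its median via the weighted empirical CDF:
order = np.argsort(y)
cdf = np.cumsum(w[order])
median = y[order][np.searchsorted(cdf, 0.5)]
print(round(w.sum(), 6), round(median, 3))
```

The weights sum to one by construction (each tree contributes total weight 1), so any conditional functional — quantiles, tail probabilities, and so on — can be read off the weighted sample.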
Ordered Random Forests
An implementation of the Ordered Forest estimator as developed
in Lechner & Okasa (2019).
"Dirichlet Random Forest"
Implementation of the Dirichlet Random Forest algorithm for compositional response data. Supports maximum likelihood estimation ('MLE') and method-of-moments ('MOM') parameter estimation for the Dirichlet distribution. Provides two prediction strategies: averaging-based predictions (average of responses within terminal nodes) and parameter-based predictions (expected value derived from the estimated Dirichlet parameters within terminal nodes). For more details see Masoumifard, van der Westhuizen, and Gardner-Lubbe (2026, ISBN:9781032903910).
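The two prediction strategies can be illustrated for a single terminal node in Python (a hypothetical sketch, not the package's estimator): fit a Dirichlet by MOM to the compositions in the node, then compare the node average with the fitted distribution's expected value. Since MOM matches first moments, the two coincide here; under MLE they generally differ.

```python
# Hedged sketch: MOM Dirichlet fit within one terminal node, and the
# averaging-based vs parameter-based predictions the description mentions.
import numpy as np

rng = np.random.default_rng(0)
node_y = rng.dirichlet([2.0, 3.0, 5.0], size=200)  # responses in one leaf

# Averaging-based prediction: mean composition in the node.
avg_pred = node_y.mean(axis=0)

# MOM fit: match first moments and E[X_1^2].
# For Dirichlet, alpha0 = (m1 - E[X1^2]) / (E[X1^2] - m1^2), alpha_i = alpha0 * m_i.
m = node_y.mean(axis=0)
s = (node_y[:, 0] ** 2).mean()
alpha0 = (m[0] - s) / (s - m[0] ** 2)   # total concentration
alpha = alpha0 * m

# Parameter-based prediction: Dirichlet expected value alpha_i / sum(alpha).
param_pred = alpha / alpha.sum()

print(np.round(alpha, 2), np.allclose(avg_pred, param_pred))
```

The concentration `alpha0` carries extra information the plain average discards: it quantifies how tightly the compositions in the node cluster around their mean.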
Modified Ordered Random Forest
Nonparametric estimator of the ordered choice model using random forests. The estimator modifies a standard random forest splitting criterion to build a collection of forests, each estimating the conditional probability of a single class. The package also implements a nonparametric estimator of the covariates’ marginal effects.
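The build-one-forest-per-class idea has a compact analogue that can be sketched in Python (an assumption for illustration, not the package's modified splitting criterion): fit one binary forest per cumulative threshold to estimate P(Y <= k | X), then recover per-class probabilities by differencing:

```python
# Hedged sketch of the cumulative-probability idea behind ordered forests.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
latent = X[:, 0] + 0.5 * rng.normal(size=400)
y = np.digitize(latent, [-0.5, 0.5])  # ordered classes 0 < 1 < 2

classes = np.unique(y)
cum = np.ones((len(X), len(classes)))  # P(Y <= k); last column stays 1
for j, k in enumerate(classes[:-1]):
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X, (y <= k).astype(int))
    cum[:, j] = rf.predict_proba(X)[:, 1]

# Difference the cumulative probabilities to get per-class probabilities;
# clipping and renormalising guard against non-monotone forest estimates.
probs = np.clip(np.diff(cum, axis=1, prepend=0.0), 0.0, 1.0)
probs /= probs.sum(axis=1, keepdims=True)
print(probs.shape)  # (400, 3)
```

Working on the cumulative scale is what lets the estimator exploit the ordering of the classes, which a standard multiclass forest ignores.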
Random Forests for Longitudinal Data
Random forests are a statistical learning method widely used in many areas of scientific research, essentially for their ability to learn complex relationships between input and output variables and their capacity to handle high-dimensional data. However, current random forest approaches are not flexible enough to handle longitudinal data. This package proposes a general approach of random forests for high-dimensional longitudinal data. It includes a flexible stochastic model which allows the covariance structure to vary over time. Furthermore, it introduces a new method which takes intra-individual covariance into consideration to build random forests. The method is fully detailed in Capitaine et al. (2020).
Visually Exploring Random Forests
Graphical elements for exploring random forests fitted with the 'randomForest' or 'randomForestSRC' packages (survival, regression, and classification forests), with plotting via 'ggplot2'.
Permutation Significance for Random Forests
Estimate False Discovery Rates (FDRs) for importance metrics from random forest runs.
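The permutation logic behind such FDR estimates can be sketched in Python (a hypothetical illustration, not the package's method): permuting the response breaks every association with the features, so importances from forests fit to permuted data form a null distribution, and the FDR at a threshold is the expected number of null exceedances divided by the number of observed exceedances:

```python
# Hedged sketch of permutation-null FDR estimation for forest importances.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 2.0 * X[:, 0] + rng.normal(size=200)  # only feature 0 is informative

obs = RandomForestRegressor(
    n_estimators=100, random_state=0
).fit(X, y).feature_importances_

# Null importances: refit the forest on permuted responses several times.
null = np.concatenate([
    RandomForestRegressor(n_estimators=100, random_state=b)
    .fit(X, rng.permutation(y)).feature_importances_
    for b in range(10)
])

# FDR at threshold t: expected null exceedances per run / observed exceedances.
t = 0.2
fdr = (null >= t).mean() * X.shape[1] / max((obs >= t).sum(), 1)
print((obs >= t).sum(), round(fdr, 3))
```

With one strongly informative feature out of ten, the observed importance of feature 0 dominates, while permuted fits spread importance roughly uniformly, so the estimated FDR at a moderate threshold is small.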