
Found 439 packages in 0.18 seconds

dcmodify — by Mark van der Loo, 2 years ago

Modify Data Using Externally Defined Modification Rules

Data cleaning scripts typically contain many 'if this, change that' statements. Such statements are typically condensed expert knowledge. With this package, such 'data modifying rules' are taken out of the code and instead become parameters to the workflow. This allows one to maintain, document, and reason about data modification rules as separate entities.
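The idea of rules-as-parameters can be sketched in a few lines of base R. This is a concept illustration only, not dcmodify's actual interface: the rules live in plain text, separate from the script that applies them.

```r
# Concept sketch (base R, not dcmodify's API): modification rules are data,
# kept apart from the processing code.
rules <- c(
  "height <- ifelse(is.na(height), median(height, na.rm = TRUE), height)",
  "age    <- ifelse(age < 0, abs(age), age)"
)

apply_rules <- function(df, rules) {
  env <- list2env(df)                       # columns become variables
  for (r in rules) eval(parse(text = r), envir = env)
  as.data.frame(mget(names(df), envir = env))
}

people  <- data.frame(age = c(25, -3), height = c(NA, 170))
cleaned <- apply_rules(people, rules)
cleaned$age     # 25 3
cleaned$height  # 170 170
```

Because the rules are ordinary strings, they can be stored, versioned, and documented independently of the workflow that consumes them — which is the package's central point.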

gsmoothr — by Mark Robinson, 12 years ago

Smoothing Tools

Tools rewritten in C for various smoothing tasks.

fMRItools — by Amanda Mejia, 2 months ago

Routines for Common fMRI Processing Tasks

Supports fMRI (functional magnetic resonance imaging) analysis tasks including reading in 'CIFTI', 'GIFTI' and 'NIFTI' data, temporal filtering, nuisance regression, and aCompCor (anatomical Components Correction; Muschelli et al., 2014).
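Nuisance regression, one of the tasks listed, is ordinary least squares under the hood: regress the time series on the nuisance signals and keep the residuals. A minimal base-R sketch (not fMRItools' API):

```r
# Concept sketch: remove motion-related variance from a voxel time series
# by taking OLS residuals against a nuisance regressor.
set.seed(1)
n       <- 100
motion  <- sin(seq_len(n) / 5)        # hypothetical head-motion regressor
signal  <- rnorm(n)                   # the underlying signal of interest
y       <- signal + 2 * motion        # observed, motion-contaminated series

cleaned <- residuals(lm(y ~ motion))  # residuals are orthogonal to motion
```

After the regression, the cleaned series carries essentially zero correlation with the nuisance regressor, which is exactly the property the preprocessing step is after.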

dodgr — by Mark Padgham, 6 months ago

Distances on Directed Graphs

Distances on dual-weighted directed graphs using priority-queue shortest paths (Padgham, 2019). Weighted directed graphs have weights from A to B which may differ from those from B to A. Dual-weighted directed graphs have two sets of such weights. A canonical example is a street network to be used for routing, in which routes are calculated by weighting distances according to the type of way and mode of transport, yet lengths of routes must be calculated from direct distances.
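The dual-weighting idea can be shown on a toy graph: run Dijkstra's algorithm with the routing weights deciding the priority, while accumulating the true distances along the chosen edges. This is a concept sketch in base R, not dodgr's implementation (which is in C++):

```r
# Toy dual-weighted digraph: routing is decided by weights w, but route
# length is reported from true distances d.
edges <- data.frame(
  from = c("A", "A", "B", "C"),
  to   = c("B", "C", "D", "D"),
  d    = c(1.0, 1.5, 1.0, 1.5),  # geometric length
  w    = c(1.0, 0.2, 1.0, 0.2),  # routing weight: the A-C-D ways are preferred
  stringsAsFactors = FALSE
)

dual_dijkstra <- function(edges, src, dst) {
  nodes <- union(edges$from, edges$to)
  wcost <- setNames(rep(Inf, length(nodes)), nodes)  # priority: weighted cost
  dist  <- setNames(rep(Inf, length(nodes)), nodes)  # reported: true distance
  wcost[src] <- 0
  dist[src]  <- 0
  visited <- character(0)
  repeat {
    open <- setdiff(nodes, visited)
    if (length(open) == 0 || all(is.infinite(wcost[open]))) break
    u <- open[which.min(wcost[open])]                # cheapest by weight
    visited <- c(visited, u)
    out <- edges[edges$from == u, ]
    for (i in seq_len(nrow(out))) {
      v <- out$to[i]
      if (wcost[u] + out$w[i] < wcost[v]) {
        wcost[v] <- wcost[u] + out$w[i]              # relax on weights...
        dist[v]  <- dist[u] + out$d[i]               # ...but carry distance
      }
    }
  }
  unname(dist[dst])
}

dual_dijkstra(edges, "A", "D")  # 3: the weight-preferred route is the longer one
```

Note the result: the A-B-D path is geometrically shorter (2.0), but the low weights on A-C-D make that the chosen route, so the reported length is 3.0 — distances and routing preferences are genuinely two different sets of weights.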

radsafer — by Mark Hogue, 3 months ago

Radiation Safety

Provides functions for radiation safety, also known as "radiation protection" and "radiological control". The science of radiation protection is called "health physics" and its engineering functions are called "radiological engineering". Functions in this package cover many of the computations needed by radiation safety professionals. Examples include: obtaining updated calibration and source check values for radiation monitors to account for radioactive decay in a reference source; simulating instrument readings to better understand measurement uncertainty; and correcting instrument readings for geometry and ambient atmospheric conditions. Many of these functions are described in Johnson and Kirby (2011, ISBN-13: 978-1609134198). Utilities are also included for developing inputs and processing outputs with radiation transport codes, such as MCNP, a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport (Werner et al., 2018).
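The decay correction mentioned first is the standard half-life relation, A(t) = A0 · 2^(−t/T½). A one-line physics sketch (not radsafer's API; the Cs-137 half-life below is the commonly cited ~30.1 years):

```r
# Decay-correct a reference source reading using the half-life formula.
decay_corrected <- function(A0, t, half_life) A0 * 2^(-t / half_life)

# A 100-unit Cs-137 reference source after one half-life (~30.1 y):
decay_corrected(100, 30.1, 30.1)  # 50
```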

constellation — by Mark Sendak, 8 years ago

Identify Event Sequences Using Time Series Joins

Examine any number of time series data frames to identify instances in which various criteria are met within specified time frames. In clinical medicine, these types of events are often called "constellations of signs and symptoms", because a single condition depends on a series of events occurring within a certain amount of time of each other. This package was written to work with any number of time series data frames and is optimized for speed to work well with data frames with millions of rows.
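The core operation — flagging instances where events from multiple streams fall within a shared time window — can be sketched in base R. This is a concept illustration, not constellation's API (the package uses optimized time-series joins for million-row data):

```r
# Two toy event streams; flag times at which both fire within one hour.
hr <- data.frame(time = as.POSIXct(
  c("2024-01-01 10:00:00", "2024-01-01 12:00:00"), tz = "UTC"))
bp <- data.frame(time = as.POSIXct(
  c("2024-01-01 10:30:00", "2024-01-01 15:00:00"), tz = "UTC"))

within_window <- function(a, b, window_secs = 3600) {
  # Pairwise |time difference| between the two streams
  hit <- outer(as.numeric(a$time), as.numeric(b$time),
               function(x, y) abs(x - y) <= window_secs)
  a$time[rowSums(hit) > 0]   # events in a with a partner in b
}

res <- within_window(hr, bp)  # only the 10:00 event has a partner in the hour
```

The `outer()` approach is O(n·m) and fine for a sketch; the package's whole point is doing this efficiently at scale.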

whitechapelR — by Mark Ewing, 7 years ago

Advanced Policing Techniques for the Board Game "Letters from Whitechapel"

Provides a set of functions to make tracking the hidden movements of the 'Jack' player easier. By tracking every possible path Jack might have traveled from the point of the initial murder, including special movement such as through alleyways and via carriages, the police can more accurately narrow the field of their search. Additionally, by tracking all possible hideouts from round to round, rounds 3 and 4 should have a vastly reduced field of search.
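The underlying bookkeeping is a reachable-set computation on the board graph: after each unseen move, Jack's possible positions are the union of the neighbours of every currently possible position. A tiny base-R sketch on a hypothetical 4-node board (not whitechapelR's API or the real board):

```r
# Adjacency list for a toy board; names are node ids.
adj <- list(`1` = c(2, 3), `2` = c(1, 4), `3` = c(1, 4), `4` = c(2, 3))

# One unseen move: the new possible set is the union of all neighbours.
step <- function(possible, adj)
  sort(unique(unlist(adj[as.character(possible)], use.names = FALSE)))

p <- 1              # the murder site: Jack is known to start here
p <- step(p, adj)   # after one move: nodes 2 and 3
p <- step(p, adj)   # after two moves: nodes 1 and 4
```

Intersecting such sets with observed clues round after round is what shrinks the search field the description refers to.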

xpectr — by Ludvig Renbo Olsen, a year ago

Generates Expectations for 'testthat' Unit Testing

Helps systematize and ease the process of building unit tests with the 'testthat' package by providing tools for generating expectations.
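The generation idea — evaluate an expression once, then emit an expectation that pins down its current value — can be sketched in base R. This is a concept illustration only, not xpectr's actual interface:

```r
# Concept sketch: build a testthat expectation string by evaluating an
# expression and deparsing its value back into code.
gen_expectation <- function(expr_text) {
  val <- eval(parse(text = expr_text))
  sprintf("expect_equal(%s, %s)", expr_text, deparse(val))
}

gen_expectation("1:3 + 1")  # "expect_equal(1:3 + 1, c(2, 3, 4))"
```

The generated string can be pasted into a test file, turning the code's current behaviour into a regression test.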

piggyback — by Carl Boettiger, 3 years ago

Managing Larger Data on a GitHub Repository

Because larger (> 50 MB) data files cannot easily be committed to git, a different approach is required to manage data associated with an analysis in a GitHub repository. This package provides a simple workaround by allowing larger (up to 2 GB) data files to piggyback on a repository as assets attached to individual GitHub releases. These files are not handled by git in any way, but instead are uploaded, downloaded, or edited directly by calls through the GitHub API. These data files can be versioned manually by creating different releases. This approach works equally well with public or private repositories. Data can be uploaded and downloaded programmatically from scripts. No authentication is required to download data from public repositories.

bwsTools — by Mark White, 6 years ago

Tools for Case 1 Best-Worst Scaling (MaxDiff) Designs

Tools to design best-worst scaling designs (i.e., balanced incomplete block designs) and to analyze data from these designs, using aggregate and individual methods such as: difference scores (Louviere, Lings, Islam, Gudergan, & Flynn, 2013); analytical estimation (Lipovetsky & Conklin, 2014); empirical Bayes (Lipovetsky & Conklin, 2015); Elo (Hollis, 2018); and network-based measures.
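The simplest of the listed methods, the aggregate difference score, is just each item's "chosen best" count minus its "chosen worst" count. A base-R sketch on made-up responses (a concept illustration, not bwsTools' API):

```r
# Each row is one choice task: which item a respondent picked as best / worst.
resp <- data.frame(
  best  = c("A", "A", "B"),
  worst = c("C", "B", "C"),
  stringsAsFactors = FALSE
)

items  <- c("A", "B", "C")
scores <- sapply(items, function(i) sum(resp$best == i) - sum(resp$worst == i))
scores  # A = 2, B = 0, C = -2
```

Items with positive scores are preferred overall; the package's other estimators refine this to individual-level scale values.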