analysisPipelines (1.0.0)


Compose Interoperable Analysis Pipelines & Put Them in Production.

Enables data scientists to compose pipelines of analysis that consist of data manipulation, exploratory analysis and reporting, as well as modeling steps. Data scientists can use tools of their choice through an R interface and compose interoperable pipelines between R, Spark, and Python. Credits to Mu Sigma for supporting the development of the package.

Note: to enable pipelines involving Spark tasks, the package uses the 'SparkR' package, which must be installed to use Spark as an engine within a pipeline. SparkR is distributed natively with Apache Spark and is not distributed on CRAN. The SparkR version needs to map directly to the Spark version (hence the native distribution), and care needs to be taken to ensure that this is configured properly. If you know the Spark version, install SparkR from GitHub by running: 'devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')'. Alternatively, if Spark has already been installed, install SparkR by running the following terminal commands: '$ export SPARK_HOME=/path/to/spark/directory && cd $SPARK_HOME/R/lib/SparkR/ && R -e "devtools::install('.')"'.
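For readability, the second installation option above can be written as separate shell steps. This is a sketch of the same commands from the description; the Spark directory path is a placeholder, so substitute the path of your actual Spark installation, and 'devtools' must already be installed in R:

```shell
# Install SparkR from an existing Spark installation.
# /path/to/spark/directory is a placeholder; point SPARK_HOME
# at the directory where Spark is actually installed.
export SPARK_HOME=/path/to/spark/directory

# SparkR's R package sources ship inside the Spark distribution.
cd $SPARK_HOME/R/lib/SparkR/

# Install the SparkR package from the local source directory.
R -e "devtools::install('.')"
```

Because the SparkR version must match the Spark version, installing from the local Spark distribution (rather than GitHub) guarantees the two stay in sync.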

Maintainer: Mu Sigma, Inc.
Author(s): Naren Srinivasan [aut], Zubin Dowlaty [aut], Sanjay [ctb], Neeratyoy Mallik [ctb], Anoop S [ctb], Mu Sigma, Inc. [cre]

License: Apache License 2.0

Uses: devtools, dplyr, futile.logger, ggplot2, magrittr, pipeR, proto, purrr, RCurl, rlang, car, foreign, rjson, corrplot, knitr, R.devices, shiny, rmarkdown, DT, visNetwork, plotly
Enhances: reticulate, SparkR

Released about 1 year ago.




