geospark: Bring Local Sf to Spark

R bindings for 'GeoSpark' <http://geospark.datasyslab.org/>, extending the 'sparklyr' <https://spark.rstudio.com/> R package to make distributed geocomputing easier. The sf package provides simple features <https://en.wikipedia.org/wiki/Simple_Features> access for R and is a leading geospatial data-processing tool. The geospark package brings the same simple features access as sf, but running on a Spark distributed system.
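A minimal sketch of the workflow, assuming a local Spark installation is available: `register_gis()` is the entry point geospark documents for attaching GeoSpark's spatial SQL functions to a sparklyr connection, and the `st_*` calls below are GeoSpark SQL functions translated through dbplyr (the column names and sample polygon are illustrative).

```r
library(sparklyr)
library(geospark)
library(dplyr)

# Connect to a local Spark cluster and register the GeoSpark
# spatial SQL functions with this connection.
sc <- spark_connect(master = "local")
register_gis(sc)

# Copy a small data frame with a WKT polygon column to Spark.
polygons <- data.frame(
  area = "a",
  wkt  = "POLYGON((0 0, 0 2, 2 2, 2 0, 0 0))"
)
polygons_tbl <- copy_to(sc, polygons, overwrite = TRUE)

# Cast the WKT text to a geometry and filter with a spatial
# predicate; the st_* calls run on Spark, not locally.
polygons_tbl %>%
  mutate(geom = st_geomfromwkt(wkt)) %>%
  filter(st_contains(geom, st_point(1.0, 1.0))) %>%
  collect()

spark_disconnect(sc)
```

As with sf, the spatial predicates compose with ordinary dplyr verbs; the difference is that the query is pushed down to Spark SQL and executed on the cluster rather than in the local R session.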

Version: 0.2.1
Depends: R (≥ 3.1.2)
Imports: sparklyr (≥ 1.0.0), dplyr (≥ 0.8.3), dbplyr (≥ 1.3.0)
Suggests: testthat, knitr, utils
Published: 2019-10-04
Author: Harry Zhu [aut, cre], Javier Luraschi [ctb]
Maintainer: Harry Zhu <7harryprince at gmail.com>
BugReports: https://github.com/harryprince/geospark/issues
License: Apache License (≥ 2.0)
NeedsCompilation: no
Materials: README
CRAN checks: geospark results

Downloads:

Reference manual: geospark.pdf
Package source: geospark_0.2.1.tar.gz
Windows binaries: r-devel: geospark_0.2.1.zip, r-release: geospark_0.2.1.zip, r-oldrel: geospark_0.2.1.zip
OS X binaries: r-release: geospark_0.2.1.tgz, r-oldrel: geospark_0.2.1.tgz

Linking:

Please use the canonical form https://CRAN.R-project.org/package=geospark to link to this page.