Complex machine learning models are often hard to interpret. However, in many situations it is crucial to understand and explain why a model made a specific prediction. Shapley values constitute the only prediction-explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values, however, assume that the features are independent. This package implements the method described in Aas, Jullum and Løland (2019) <arXiv:1903.10464>, which accounts for any feature dependence and thereby produces more accurate estimates of the true Shapley values.
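The sketch below illustrates a typical workflow: fit a model, prepare an explainer with `shapr()`, and compute dependence-aware Shapley values with `explain()`. It is a minimal example assuming the 0.1.x API and the xgboost/MASS packages listed under Suggests; argument names such as `approach` and `prediction_zero` should be verified against the package vignette.

```r
library(xgboost)
library(shapr)

data("Boston", package = "MASS")
x_var <- c("lstat", "rm", "dis", "indus")
y_var <- "medv"

x_train <- as.matrix(Boston[-(1:6), x_var])
y_train <- Boston[-(1:6), y_var]
x_test  <- as.matrix(Boston[1:6, x_var])

# Fit a simple xgboost model to the training data
model <- xgboost(data = x_train, label = y_train, nround = 20, verbose = FALSE)

# Prepare the explainer object (assumed 0.1.x API)
explainer <- shapr(x_train, model)

# phi_0: the expected prediction when no features are known
p0 <- mean(y_train)

# Estimate Shapley values while accounting for feature dependence,
# here using the empirical conditional-distribution approach
explanation <- explain(
  x_test,
  approach = "empirical",
  explainer = explainer,
  prediction_zero = p0
)

# Shapley values for the six explained observations
print(explanation$dt)
```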
| Version: | 0.1.3 |
| Depends: | R (≥ 3.5.0) |
| Imports: | stats, data.table, Rcpp (≥ 0.12.15), condMVNorm, mvnfast, Matrix |
| LinkingTo: | RcppArmadillo, Rcpp |
| Suggests: | ranger, xgboost, mgcv, testthat, knitr, rmarkdown, roxygen2, MASS, ggplot2, gbm |
| Published: | 2020-09-03 |
| Author: | Nikolai Sellereite |
| Maintainer: | Martin Jullum <Martin.Jullum at nr.no> |
| BugReports: | https://github.com/NorskRegnesentral/shapr/issues |
| License: | MIT + file LICENSE |
| URL: | https://norskregnesentral.github.io/shapr/, https://github.com/NorskRegnesentral/shapr |
| NeedsCompilation: | yes |
| Language: | en-US |
| Materials: | README NEWS |
| CRAN checks: | shapr results |
| Reference manual: | shapr.pdf |
| Vignettes: | 'shapr': Explaining individual machine learning predictions with Shapley values |
| Package source: | shapr_0.1.3.tar.gz |
| Windows binaries: | r-devel: shapr_0.1.3.zip, r-release: shapr_0.1.3.zip, r-oldrel: shapr_0.1.3.zip |
| macOS binaries: | r-release: shapr_0.1.3.tgz, r-oldrel: shapr_0.1.3.tgz |
| Old sources: | shapr archive |
Please use the canonical form https://CRAN.R-project.org/package=shapr to link to this page.