Estimates the transfer entropy from one time series to another, where each time series consists of continuous random variables. Transfer entropy is an extension of mutual information that takes into account the direction of information flow, under the assumption that the underlying processes can be described by a Markov model. Two estimation methods are provided. The first calculates transfer entropy as the difference of two mutual informations, each estimated using the Kraskov method, which builds on a nearest-neighbor framework (see package references). The second estimates transfer entropy via a generalized correlation sum.
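The first method rests on the identity TE(X→Y) = I(Y_t ; X_{t-1} | Y_{t-1}) = I(Y_t ; X_{t-1}, Y_{t-1}) − I(Y_t ; Y_{t-1}), i.e. a difference of mutual informations. The following is a minimal Python sketch of that decomposition, not the package's implementation: it uses a simple histogram (plug-in) entropy estimator on binned data rather than the Kraskov nearest-neighbor estimator the package employs, and all function names here are illustrative.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Plug-in entropy (in nats) of the joint distribution of symbol columns."""
    counts = Counter(zip(*cols))
    n = sum(counts.values())
    p = np.array(list(counts.values())) / n
    return float(-np.sum(p * np.log(p)))

def transfer_entropy(x, y, bins=4):
    """Histogram-based TE(X -> Y) with history length 1 (illustrative only)."""
    # Discretize the continuous series into equal-width bins.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    yt, yp, xp = yd[1:], yd[:-1], xd[:-1]   # Y_t, Y_{t-1}, X_{t-1}
    # TE(X->Y) = I(Y_t ; X_{t-1} | Y_{t-1})
    #          = H(Y_t,Y_{t-1}) + H(X_{t-1},Y_{t-1}) - H(Y_{t-1}) - H(Y_t,Y_{t-1},X_{t-1})
    return (entropy(yt, yp) + entropy(xp, yp)
            - entropy(yp) - entropy(yt, yp, xp))

# Toy system where Y is driven by lagged X, but not vice versa.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.empty_like(x)
y[0] = rng.normal()
y[1:] = 0.8 * x[:-1] + 0.2 * rng.normal(size=x.size - 1)
print(transfer_entropy(x, y) > transfer_entropy(y, x))
```

With the coupling running from X to Y, the estimate in the X→Y direction should clearly exceed the Y→X direction, which is the asymmetry that distinguishes transfer entropy from (symmetric) mutual information.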
|Depends:||R (≥ 3.0.0)|
|Imports:||Rcpp (≥ 0.11.6)|
|Author:||ANN Library: David Mount, Sunil Arya (see src/ann_1.1.2/Copyright.txt). Transfer Entropy Package: Ghazaleh Haratinezhad Torbati, Glenn Lawyer.|
|Maintainer:||Ghazaleh Haratinezhad Torbati <ghazale.hnt at gmail.com>|
|License:||GPL-2 | GPL-3 | file LICENSE [expanded from: GPL (≥ 2) | file LICENSE]|
|Copyright:||see file COPYRIGHTS|
|CRAN checks:||TransferEntropy results|
|Windows binaries:||r-devel: TransferEntropy_1.4.zip, r-release: TransferEntropy_1.4.zip, r-oldrel: TransferEntropy_1.4.zip|
|OS X Mavericks binaries:||r-release: TransferEntropy_1.4.tgz, r-oldrel: TransferEntropy_1.4.tgz|
|Old sources:||TransferEntropy archive|
Please use the canonical form https://CRAN.R-project.org/package=TransferEntropy to link to this page.