dsiNMF
In this vignette, we consider approximating multiple non-negative matrices as a product of binary (or non-negative) low-rank matrices (a.k.a., factor matrices).
Test data are available from toyModel.
library("dcTensor")
X <- dcTensor::toyModel("dsiNMF_Easy")
You will see that there are some blocks in the data matrices as follows.
suppressMessages(library("fields"))
layout(t(1:3))
image.plot(X[[1]], main="X1", legend.mar=8)
image.plot(X[[2]], main="X2", legend.mar=8)
image.plot(X[[3]], main="X3", legend.mar=8)
Here, we consider the approximation of \(K\) binary data matrices \(X_{k}\) (\(N \times M_{k}\)) as the matrix product of \(W\) (\(N \times J\)) and \(H_{k}\) (\(J \times M_{k}\)):
\[ X_{k} \approx W H_{k} \ \mathrm{s.t.}\ W,H_{k} \in \{0,1\} \]
This is the combination of binary matrix factorization (BMF (Z. Zhang et al. 2007)) and simultaneous non-negative matrix factorization (siNMF (Badea 2008; S. Zhang et al. 2012; Yilmaz 2010; Cichocki 2009)), which is implemented by adding binary regularization to siNMF.
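One common way to implement such a constraint is to relax it into a penalty that vanishes only when every entry equals 0 or 1. A sketch of such a regularized objective (shown for intuition; the exact penalty and weighting used inside dcTensor may differ) is:
\[ \min_{W \geq 0,\, H_{k} \geq 0} \sum_{k=1}^{K} \| X_{k} - W H_{k} \|_{F}^{2} + \frac{\lambda_{W}}{2} \| W \circ (1 - W) \|_{F}^{2} + \sum_{k=1}^{K} \frac{\lambda_{H_{k}}}{2} \| H_{k} \circ (1 - H_{k}) \|_{F}^{2} \]
where \(\circ\) denotes the element-wise product and the weights \(\lambda_{W}\) and \(\lambda_{H_{k}}\) correspond to the Bin_W and Bin_H arguments of dsiNMF.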
See also the siNMF function of the nnTensor package.
In Binary Simultaneous Matrix Factorization (BSMF), a rank parameter \(J\) (\(\leq \min(N, M_{k})\)) must be set in advance. Other settings such as the number of iterations (num.iter) or the factorization algorithm (algorithm) are also available. For the details of the arguments of dsiNMF, see ?dsiNMF. After the calculation, various objects are returned by dsiNMF. BSMF is achieved by specifying the binary regularization parameters as large values, as below:
set.seed(123456)
out_dsiNMF <- dsiNMF(X, Bin_W=1E+1, Bin_H=c(1E+1, 1E+1, 1E+1), J=3)
str(out_dsiNMF, 2)
## List of 6
## $ W : num [1:100, 1:3] 0.0479 0.0479 0.0479 0.0479 0.0479 ...
## $ H :List of 3
## ..$ : num [1:300, 1:3] 0.00208 0.00206 0.00209 0.0021 0.00206 ...
## ..$ : num [1:200, 1:3] 3.11e-244 1.18e-243 4.12e-244 5.19e-244 2.46e-243 ...
## ..$ : num [1:150, 1:3] 0.997 0.997 0.997 0.997 0.997 ...
## $ RecError : Named num [1:101] 1.00e-09 1.24e+02 1.16e+02 1.11e+02 1.09e+02 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ TrainRecError: Named num [1:101] 1.00e-09 1.24e+02 1.16e+02 1.11e+02 1.09e+02 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ TestRecError : Named num [1:101] 1e-09 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ RelChange : Named num [1:101] 1.00e-09 5.59e-01 7.01e-02 3.87e-02 2.20e-02 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
The reconstruction error (RecError) and the relative change (RelChange, the amount of change of the reconstruction error from the previous step) can be used to diagnose whether the calculation has converged.
layout(t(1:2))
plot(log10(out_dsiNMF$RecError[-1]), type="b", main="Reconstruction Error")
plot(log10(out_dsiNMF$RelChange[-1]), type="b", main="Relative Change")
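Beyond visual inspection, the final value of RelChange can also be compared against a tolerance; the threshold below is an arbitrary illustrative choice, not a dcTensor default.
# Simple numeric convergence check (1E-4 is an illustrative tolerance, not a package default)
tail(out_dsiNMF$RelChange, 1) < 1E-4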
The products of \(W\) and the \(H_{k}\)s show whether the original data matrices are well recovered by dsiNMF.
recX <- lapply(seq_along(X), function(x){
  out_dsiNMF$W %*% t(out_dsiNMF$H[[x]])
})
layout(rbind(1:3, 4:6))
image.plot(X[[1]], main="X1", legend.mar=8)
image.plot(X[[2]], main="X2", legend.mar=8)
image.plot(X[[3]], main="X3", legend.mar=8)
image.plot(recX[[1]], main="Reconstructed X1", legend.mar=8)
image.plot(recX[[2]], main="Reconstructed X2", legend.mar=8)
image.plot(recX[[3]], main="Reconstructed X3", legend.mar=8)
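As a quantitative complement to the plots, the relative Frobenius error of each reconstruction can be computed; this is an ad hoc diagnostic, not a value returned by dsiNMF.
# Relative Frobenius reconstruction error for each data matrix (ad hoc diagnostic)
sapply(seq_along(X), function(k){
  norm(X[[k]] - recX[[k]], "F") / norm(X[[k]], "F")
})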
The histograms of \(W\) and the \(H_{k}\)s show that all the factor matrices look approximately binary.
layout(rbind(1:2, 3:4))
hist(out_dsiNMF$W, main="W", breaks=100)
hist(out_dsiNMF$H[[1]], main="H1", breaks=100)
hist(out_dsiNMF$H[[2]], main="H2", breaks=100)
hist(out_dsiNMF$H[[3]], main="H3", breaks=100)
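To quantify how close the estimated factors are to binary values, a rough score such as the mean distance of each entry from the nearer of 0 and 1 can be used; binariness below is an ad hoc helper, not a dcTensor function.
# Ad hoc "binariness" score: mean distance of the entries from the nearer of {0, 1};
# 0 means perfectly binary (not provided by dcTensor)
binariness <- function(A) mean(pmin(abs(A), abs(A - 1)))
binariness(out_dsiNMF$W)
sapply(out_dsiNMF$H, binariness)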
Semi-Binary Simultaneous Matrix Factorization (SBSMF) is an extension of BSMF in which the binary constraint is imposed only on selected factor matrices, while the others remain merely non-negative.
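For example, with the binary data X above, the constraint can be placed only on the \(H_{k}\) side while \(W\) stays merely non-negative; a minimal sketch with illustrative, untuned regularization weights:
# Constrain only the H_k to be (approximately) binary; W is merely non-negative
# (regularization weights are illustrative, not tuned)
out_semi <- dsiNMF(X, Bin_H=c(1E+1, 1E+1, 1E+1), J=3)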
To demonstrate SBSMF, we next use non-negative matrices from the nnTensor package.
suppressMessages(library("nnTensor"))
X2 <- nnTensor::toyModel("siNMF_Easy")
layout(t(1:3))
image.plot(X2[[1]], main="X1", legend.mar=8)
image.plot(X2[[2]], main="X2", legend.mar=8)
image.plot(X2[[3]], main="X3", legend.mar=8)
In SBSMF, a rank parameter \(J\) (\(\leq \min(N, M_{k})\)) must be set in advance. Other settings such as the number of iterations (num.iter) or the factorization algorithm (algorithm) are also available. For the details of the arguments of dsiNMF, see ?dsiNMF. After the calculation, various objects are returned by dsiNMF. SBSMF is achieved by specifying the binary regularization parameter (here, only Bin_W) as a large value, as below:
set.seed(123456)
out_dsiNMF2 <- dsiNMF(X2, Bin_W=1E+2, J=3)
str(out_dsiNMF2, 2)
## List of 6
## $ W : num [1:100, 1:3] 0.0988 0.1006 0.1056 0.1023 0.1003 ...
## $ H :List of 3
## ..$ : num [1:300, 1:3] 5.43e-10 3.89e-10 7.38e-10 2.05e-09 5.05e-10 ...
## ..$ : num [1:200, 1:3] 1.46e-15 6.40e-15 6.54e-15 7.36e-15 2.28e-14 ...
## ..$ : num [1:150, 1:3] 95.6 92.7 94 96.2 95.1 ...
## $ RecError : Named num [1:101] 1.00e-09 1.17e+04 1.14e+04 1.10e+04 1.08e+04 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ TrainRecError: Named num [1:101] 1.00e-09 1.17e+04 1.14e+04 1.10e+04 1.08e+04 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ TestRecError : Named num [1:101] 1e-09 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 0e+00 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
## $ RelChange : Named num [1:101] 1.00e-09 1.17e-01 2.89e-02 3.68e-02 1.26e-02 ...
## ..- attr(*, "names")= chr [1:101] "offset" "1" "2" "3" ...
RecError and RelChange can be used to diagnose whether the calculation has converged.
layout(t(1:2))
plot(log10(out_dsiNMF2$RecError[-1]), type="b", main="Reconstruction Error")
plot(log10(out_dsiNMF2$RelChange[-1]), type="b", main="Relative Change")
The products of \(W\) and the \(H_{k}\)s show whether the original data matrices are well recovered by dsiNMF.
recX <- lapply(seq_along(X2), function(x){
  out_dsiNMF2$W %*% t(out_dsiNMF2$H[[x]])
})
layout(rbind(1:3, 4:6))
image.plot(X2[[1]], main="X1", legend.mar=8)
image.plot(X2[[2]], main="X2", legend.mar=8)
image.plot(X2[[3]], main="X3", legend.mar=8)
image.plot(recX[[1]], main="Reconstructed X1", legend.mar=8)
image.plot(recX[[2]], main="Reconstructed X2", legend.mar=8)
image.plot(recX[[3]], main="Reconstructed X3", legend.mar=8)
The histograms of \(W\) and the \(H_{k}\)s show that only \(W\), to which the binary regularization was applied, looks approximately binary; the \(H_{k}\)s are merely non-negative.
layout(rbind(1:2, 3:4))
hist(out_dsiNMF2$W, main="W", breaks=100)
hist(out_dsiNMF2$H[[1]], main="H1", breaks=100)
hist(out_dsiNMF2$H[[2]], main="H2", breaks=100)
hist(out_dsiNMF2$H[[3]], main="H3", breaks=100)
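The ad hoc binariness score defined above makes this concrete: with only Bin_W applied, \(W\) should score much closer to 0 (perfectly binary) than the \(H_{k}\)s.
# Reusing the ad hoc binariness() helper defined earlier; only W received the
# binary penalty, so it should score much closer to 0 than the H_k
binariness(out_dsiNMF2$W)
sapply(out_dsiNMF2$H, binariness)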
## R version 3.6.3 (2020-02-29)
## Platform: x86_64-conda-linux-gnu (64-bit)
## Running under: CentOS Linux 7 (Core)
##
## Matrix products: default
## BLAS/LAPACK: /home/koki/miniconda3/lib/libopenblasp-r0.3.17.so
##
## locale:
## [1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C
## [3] LC_TIME=en_US.UTF-8 LC_COLLATE=en_US.UTF-8
## [5] LC_MONETARY=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8
## [7] LC_PAPER=en_US.UTF-8 LC_NAME=C
## [9] LC_ADDRESS=C LC_TELEPHONE=C
## [11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
##
## attached base packages:
## [1] stats graphics grDevices utils datasets methods base
##
## other attached packages:
## [1] nnTensor_1.1.12 fields_13.3 viridis_0.6.2 viridisLite_0.4.0
## [5] spam_2.8-0 dcTensor_1.0.1
##
## loaded via a namespace (and not attached):
## [1] Rcpp_1.0.8 highr_0.9 RColorBrewer_1.1-2 rTensor_1.4.8
## [5] bslib_0.3.1 compiler_3.6.3 pillar_1.7.0 jquerylib_0.1.4
## [9] tools_3.6.3 dotCall64_1.0-1 digest_0.6.29 jsonlite_1.8.0
## [13] evaluate_0.15 lifecycle_1.0.1 tibble_3.1.2 gtable_0.3.0
## [17] pkgconfig_2.0.3 rlang_0.4.11 DBI_1.1.2 yaml_2.3.5
## [21] xfun_0.29 fastmap_1.1.0 gridExtra_2.3 stringr_1.4.0
## [25] dplyr_1.0.6 knitr_1.37 generics_0.1.2 sass_0.4.0
## [29] vctrs_0.3.8 maps_3.4.0 plot3D_1.4 tidyselect_1.1.1
## [33] grid_3.6.3 glue_1.4.2 R6_2.5.1 fansi_1.0.2
## [37] tcltk_3.6.3 rmarkdown_2.11 purrr_0.3.4 ggplot2_3.3.5
## [41] magrittr_2.0.2 scales_1.1.1 htmltools_0.5.2 ellipsis_0.3.2
## [45] MASS_7.3-55 tagcloud_0.6 misc3d_0.9-1 assertthat_0.2.1
## [49] colorspace_2.0-3 utf8_1.2.2 stringi_1.7.6 munsell_0.5.0
## [53] crayon_1.5.0