Exploiting tensor rank-one decomposition in probabilistic inference
Kybernetika, Volume 43 (2007) no. 5, pp. 747-764. This article was harvested from the Czech Digital Mathematics Library.

We propose a new additive decomposition of probability tables – tensor rank-one decomposition. The basic idea is to decompose a probability table into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Entries in the tables are allowed to be any real number, i.e., they can also be negative. In contrast to a multiplicative decomposition, the possibility of negative numbers opens new possibilities for a compact representation of probability tables. We show that tensor rank-one decomposition can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm that can be used in cases where the closed-form solution is not known.
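To make the abstract's idea concrete, the sketch below (an illustration added here, not code from the paper) decomposes the exclusive-or table T[i,j,k] = 1 iff k = i XOR j into just two rank-one terms, T = 1/2 u⊗u⊗u + 1/2 v⊗v⊗v with u = (1, 1) and v = (1, -1). The decomposition is this short only because v contains a negative entry: with nonnegative factors, each rank-one term inside the XOR table can cover at most one of its four nonzero entries, so four terms would be needed. The check assumes nothing beyond numpy.

import numpy as np

# Factor vectors; note the negative entry in v.
u = np.array([1.0, 1.0])
v = np.array([1.0, -1.0])

def rank_one(a, b, c):
    # Outer product a (x) b (x) c: a rank-one three-dimensional table.
    return np.einsum('i,j,k->ijk', a, b, c)

# Additive decomposition with two rank-one terms; entries of v are signed.
T = 0.5 * rank_one(u, u, u) + 0.5 * rank_one(v, v, v)

# Reference table of exclusive-or: entry is 1 iff k = i XOR j.
ref = np.zeros((2, 2, 2))
for i in range(2):
    for j in range(2):
        ref[i, j, i ^ j] = 1.0

assert np.allclose(T, ref)

The same two-term form, (1/2) u^⊗(n+1) + (1/2) v^⊗(n+1), reproduces the parity table over any number n of inputs, so the factored form stores O(n) numbers in place of 2^(n+1) table entries; savings of this kind are what make the decomposition useful during inference.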
Classification: 15A69, 62E15, 68T37
Keywords: graphical probabilistic models; probabilistic inference; tensor rank
@article{KYB_2007_43_5_a10,
     author = {Savicky, Petr and Vomlel, Ji\v{r}{\'\i}},
     title = {Exploiting tensor rank-one decomposition in probabilistic inference},
     journal = {Kybernetika},
     pages = {747--764},
     year = {2007},
     volume = {43},
     number = {5},
     mrnumber = {2376335},
     zbl = {1148.68539},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a10/}
}
TY  - JOUR
AU  - Savicky, Petr
AU  - Vomlel, Jiří
TI  - Exploiting tensor rank-one decomposition in probabilistic inference
JO  - Kybernetika
PY  - 2007
SP  - 747
EP  - 764
VL  - 43
IS  - 5
UR  - http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a10/
LA  - en
ID  - KYB_2007_43_5_a10
ER  - 
%0 Journal Article
%A Savicky, Petr
%A Vomlel, Jiří
%T Exploiting tensor rank-one decomposition in probabilistic inference
%J Kybernetika
%D 2007
%P 747-764
%V 43
%N 5
%U http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a10/
%G en
%F KYB_2007_43_5_a10
Savicky, Petr; Vomlel, Jiří. Exploiting tensor rank-one decomposition in probabilistic inference. Kybernetika, Volume 43 (2007) no. 5, pp. 747-764. http://geodesic.mathdoc.fr/item/KYB_2007_43_5_a10/
