We study the complexity of marginal inference with Relational Bayesian Networks as parameterized by their probability formulas. We show that without combination functions, inference is #P-equivalent, displaying the same complexity as standard Bayesian networks (this is so even when relations have unbounded arity and when the domain is succinctly specified in binary notation). By allowing increasingly more expressive probability formulas using only maximization as combination, we obtain inferential complexity that ranges from #P-equivalent to FPSPACE-complete to EXP-hard. In fact, by suitably restricting the number of nestings of combination functions, we obtain complexity classes at all levels of the counting hierarchy. Finally, we investigate the use of arbitrary combination functions and show that inference is FEXP-complete even under a seemingly strong restriction.
|Number of pages||12|
|Journal||Journal of Machine Learning Research|
|Status||Published - 2016|
|Published externally||Yes|
|Event||8th International Conference on Probabilistic Graphical Models, PGM 2016 - Lugano, Switzerland|
Duration: 6 Sep 2016 → 9 Sep 2016