Abstract
We study the complexity of marginal inference with Relational Bayesian Networks as parameterized by their probability formulas. We show that without combination functions, inference is #P-equivalent, displaying the same complexity as standard Bayesian networks (this is so even when relations have unbounded arity and when the domain is succinctly specified in binary notation). By allowing increasingly more expressive probability formulas using only maximization as the combination function, we obtain inferential complexity that ranges from #P-equivalent to FPSPACE-complete to EXP-hard. In fact, by suitably restricting the number of nestings of combination functions, we obtain complexity classes at all levels of the counting hierarchy. Finally, we investigate the use of arbitrary combination functions and show that inference is FEXP-complete even under a seemingly strong restriction.
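To make the terminology concrete, here is a minimal sketch (not taken from the paper) of a tiny Relational Bayesian Network in which a derived relation's probability formula uses maximization as its combination function. The domain, relation names (`edge`, `reach`), and parameter values are illustrative assumptions, and the marginal is computed by brute-force enumeration of the grounded network.

```python
# A toy Relational Bayesian Network over domain {a, b}:
#   - root relation edge(x, y): each ground atom is an independent coin flip
#   - derived relation reach(x) with probability formula
#       P(reach(x) = 1 | edge) = max{ 0.9 * edge(x, y) : y in domain }
#     i.e. maximization is used as the combination function.
from itertools import product

DOMAIN = ["a", "b"]
P_EDGE = 0.5  # probability of each ground atom edge(x, y)

def p_reach(x, edges):
    """Probability formula for reach(x): max over y of 0.9 * edge(x, y)."""
    return max(0.9 * edges[(x, y)] for y in DOMAIN)

def marginal_reach(x_query):
    """P(reach(x_query) = 1), summing over all joint assignments to edge."""
    atoms = list(product(DOMAIN, DOMAIN))
    total = 0.0
    for bits in product([0, 1], repeat=len(atoms)):
        edges = dict(zip(atoms, bits))
        ones = sum(bits)
        p_world = P_EDGE ** ones * (1 - P_EDGE) ** (len(bits) - ones)
        total += p_world * p_reach(x_query, edges)
    return total

print(marginal_reach("a"))  # 0.675 = 0.9 * P(some edge(a, y) holds) = 0.9 * 0.75
```

Enumerating the grounded network is exponential in the number of ground atoms; the abstract's results concern how the expressiveness of the probability formulas and the nesting of combination functions drive the complexity of this inference task.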
| Original language | English |
| --- | --- |
| Pages (from-to) | 333-344 |
| Number of pages | 12 |
| Journal | Journal of Machine Learning Research |
| Volume | 52 |
| Issue number | 2016 |
| State | Published - 2016 |
| Externally published | Yes |
| Event | 8th International Conference on Probabilistic Graphical Models, PGM 2016, Lugano, Switzerland, 6 Sep 2016 → 9 Sep 2016 |
Keywords
- Complexity theory
- Probabilistic inference
- Relational Bayesian networks