Concepts and algorithms for computing maximum entropy distributions for knowledge bases with relational probabilistic conditionals
Authors
About the book
Many practical problems involve incomplete and uncertain knowledge about domains in which relations among different objects play an important role. Relational probabilistic conditionals provide an adequate way to express such uncertain, rule-like knowledge of the form "If A holds, then B holds with probability p". Recently, the aggregating semantics for such conditionals has been proposed, which, combined with the principle of maximum entropy (ME), allows probabilistic reasoning in a relational domain. However, no specialized algorithms exist that allow ME reasoning under aggregating semantics to be performed in practice. The main topic of this publication is the development, implementation, evaluation, and improvement of the very first algorithms tailor-made for solving the ME optimization problem under aggregating semantics. We demonstrate how the equivalence of worlds can be exploited to compute the ME distribution more efficiently. We further introduce an algorithm that works on weighted conditional impacts (WCI) instead of worlds, and we present a novel algorithm that computes the WCI of a conditional by combinatorial means. These algorithms allow us to process larger examples that previously could not be computed at all, and they can also be beneficial for other relational ME semantics.
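As a rough illustration of the underlying optimization problem (a sketch based on the standard definition of aggregating semantics, not text from the book): a probability distribution P over possible worlds satisfies a relational conditional (B|A)[p] under aggregating semantics if the probabilities of its ground instances aggregate to p, and the ME distribution is the entropy-maximizing model of the knowledge base R. In LaTeX notation:

P \models (B \mid A)[p] \;\;\text{iff}\;\; \frac{\sum_{i} P(B_i \wedge A_i)}{\sum_{i} P(A_i)} = p,
\qquad
P^{*}_{\mathit{ME}} = \arg\max_{P \models \mathcal{R}} \Bigl( -\sum_{\omega \in \Omega} P(\omega) \log P(\omega) \Bigr),

where the sums range over all ground instances (B_i|A_i) of the conditional and \Omega denotes the set of possible worlds. The algorithms described above aim to solve this optimization problem without processing every world in \Omega individually.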
Details
- ISBN: 9783898383424