Probabilistic Reasoning with Incomplete and Inconsistent Beliefs

M. Thimm
Publication Date:
Wednesday, 7 December 2011
Book Series:
Dissertationen zur Künstlichen Intelligenz (Dissertations in Artificial Intelligence)
55,00 €
incl. 7% tax
Reasoning with inaccurate information is a major topic in artificial intelligence in general and in knowledge representation and reasoning in particular. This thesis deals with information that may be incomplete, uncertain, and contradictory. We employ probabilistic conditional logic, which represents uncertain pieces of information as probabilistic conditionals, i.e., if-then rules. Uncertainty is expressed by probabilities attached to those rules, and incompleteness is handled in this framework by reasoning based on the principle of maximum entropy.

In this thesis we focus on two major issues that arise when representing knowledge with probabilistic conditional logic. On the one hand, we study the problem of contradictory information, which arises, e.g., when multiple experts pool their knowledge into a common knowledge base of probabilistic conditionals. As in classical logic, this is a severe problem because an inconsistent knowledge base forbids the application of model-based inductive inference approaches such as reasoning based on the principle of maximum entropy.

On the other hand, we investigate an extension of the syntactic and semantic notions of probabilistic conditional logic to the relational case. We also extend reasoning based on the principle of maximum entropy to the framework of relational probabilistic conditional logic and investigate its properties.
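To make the maximum-entropy idea concrete, the following minimal sketch (not taken from the thesis; the atoms `bird`/`flies` and the conditional probability 0.9 are illustrative assumptions) computes the maximum-entropy model of a tiny knowledge base containing a single probabilistic conditional (flies | bird)[0.9] over two propositional atoms. It exploits the standard Lagrangian form of the solution, p(w) ∝ exp(λ · c(w)), and solves for the multiplier λ by bisection:

```python
import math

# Possible worlds over atoms {bird, flies}, as (bird, flies) truth values.
WORLDS = [(True, True), (True, False), (False, True), (False, False)]

def coeff(world, prob):
    """Linear-constraint coefficient of a world for (flies | bird)[prob]:
    a distribution p satisfies the conditional iff sum_w coeff(w) * p(w) = 0."""
    bird, flies = world
    if bird and flies:
        return 1.0 - prob   # world verifies the conditional
    if bird and not flies:
        return -prob        # world falsifies the conditional
    return 0.0              # conditional is not applicable in this world

def maxent(prob=0.9):
    """Maximum-entropy model of the knowledge base {(flies | bird)[prob]}.
    The Lagrangian solution has the form p(w) ∝ exp(lam * coeff(w, prob));
    find the multiplier lam by bisection on the constraint residual."""
    def residual(lam):
        weights = [math.exp(lam * coeff(w, prob)) for w in WORLDS]
        z = sum(weights)
        return sum(coeff(w, prob) * x / z for w, x in zip(WORLDS, weights))

    lo, hi = -50.0, 50.0    # residual is increasing in lam on this interval
    for _ in range(200):
        mid = (lo + hi) / 2
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * coeff(w, prob)) for w in WORLDS]
    z = sum(weights)
    return dict(zip(WORLDS, (x / z for x in weights)))

p = maxent(0.9)
p_bird = p[(True, True)] + p[(True, False)]
print(round(p[(True, True)] / p_bird, 2))  # the conditional holds: prints 0.9
```

Note how the result illustrates the completion behaviour described above: the single conditional does not determine a unique distribution, and maximum entropy selects the least biased one, here assigning equal probability to the two worlds in which `bird` is false.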