Task32




Referring task : Task3

Title : Human-Computer Interactions for Managing Consistency.

Code : 3.2

Start Date : 2011-03-01

End Date : 2013-02-28


Description : In order to address this task, two kinds of inconsistency have to be distinguished. The first, referred to as "logical inconsistency", is the same as the inconsistency mentioned in T3.1: it corresponds to a situation of contradiction within a knowledge base (or within the conjunction of several knowledge bases). The second, referred to as "human-machine inconsistency", occurs when an inference drawn from a (logically consistent) knowledge base contradicts the beliefs of a user.

In order to deal with logical inconsistencies, the idea is first to identify precisely which parts of KB1 and KB2 lead to the inconsistency of KB1 ∪ KB2. If an expressive description logic is used (i.e., a description logic "containing" propositional logic, which is the case for OWL DL), the deductive inferences are in general based on a semantic tableau method (see, e.g., [15]). This method produces clashes, i.e., elementary contradictions that can be presented to a user for repair; a sketch of this mechanism is given below.

In order to deal with human-machine inconsistencies, we plan to reuse the principles of the FikA approach (Failure-driven Interactive Knowledge Acquisition) [33]. This approach has been applied to case-based reasoning, but it can be applied to other types of knowledge-based systems as well. Basically, it consists in acquiring and correcting the knowledge base of a system through interactions with a user, where an interaction is triggered whenever the user considers the result of an inference to be a failure. Here, we are interested in human-machine inconsistency failures. Applying the FikA approach in this context consists in explaining the inferences to the user, e.g., in the form of a proof: if a step of the proof involves a piece of knowledge that raises a human-machine inconsistency, then this piece of knowledge has to be removed; a sketch of this repair loop follows the tableau example below. Let us note that the notion of explanation used here will coincide with the notion of explanation in Task 3.
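
To illustrate the clash mechanism, the following is a minimal sketch in Python, for plain propositional logic rather than OWL DL (a much simpler setting, but the principle is the same): expanding the branches of KB1 ∪ KB2 with a semantic tableau surfaces the complementary literals (clashes) that could then be presented to a user for repair. The formula encoding and the toy knowledge bases are illustrative assumptions, not project artifacts.

 # A minimal propositional semantic tableau: closing every branch of
 # KB1 ∪ KB2 proves inconsistency, and the clashes name its elementary causes.
 # Formulas are nested tuples:
 #   ('atom', name), ('not', f), ('and', f, g), ('or', f, g)

 def is_literal(f):
     return f[0] == 'atom' or (f[0] == 'not' and f[1][0] == 'atom')

 def find_clashes(branch):
     """Return the atoms occurring both positively and negatively on a branch."""
     atoms = {f[1] for f in branch if f[0] == 'atom'}
     neg_atoms = {f[1][1] for f in branch if f[0] == 'not' and f[1][0] == 'atom'}
     return atoms & neg_atoms

 def tableau(branch):
     """Expand a branch; return a list of clash sets (one per closed branch),
     or None as soon as an open, fully expanded branch shows satisfiability."""
     clashes = find_clashes(branch)
     if clashes:
         return [clashes]
     for f in branch:
         if is_literal(f):
             continue
         rest = branch - {f}
         if f[0] == 'and':
             return tableau(rest | {f[1], f[2]})
         if f[0] == 'or':
             left = tableau(rest | {f[1]})
             if left is None:
                 return None                      # open branch: satisfiable
             right = tableau(rest | {f[2]})
             return None if right is None else left + right
         if f[0] == 'not':
             g = f[1]
             if g[0] == 'not':                    # double negation
                 return tableau(rest | {g[1]})
             if g[0] == 'and':                    # not(a and b) = (not a) or (not b)
                 return tableau(rest | {('or', ('not', g[1]), ('not', g[2]))})
             if g[0] == 'or':                     # not(a or b) = (not a) and (not b)
                 return tableau(rest | {('and', ('not', g[1]), ('not', g[2]))})
     return None                                  # no clash, nothing left to expand

 # KB1 asserts p and (q or r); KB2 asserts not p: their union is inconsistent,
 # and the clash on 'p' is exactly what would be shown to the user for repair.
 kb1 = {('and', ('atom', 'p'), ('or', ('atom', 'q'), ('atom', 'r')))}
 kb2 = {('not', ('atom', 'p'))}
 print(tableau(frozenset(kb1 | kb2)))             # -> [{'p'}]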
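
In the same spirit, here is a minimal sketch of the FikA repair loop: the proof of an inference is replayed step by step, and every piece of knowledge whose step the user rejects is removed from the knowledge base. The explain helper and the toy rules are hypothetical placeholders standing in for the project's actual explanation facility (the one shared with Task 3) and knowledge representation.

 # A sketch of the FikA-style repair loop. `explain` is a hypothetical
 # stand-in for the system's real explanation facility: it should return the
 # proof of a conclusion as (knowledge_id, step description) pairs, e.g.
 # extracted from a reasoner's trace such as the tableau above.

 def explain(kb, conclusion):
     return [(kid, f"toward {conclusion!r}, using {kid}: {kb[kid]}")
             for kid in sorted(kb)]

 def fika_repair(kb, conclusion, user_accepts):
     """Replay the proof; drop every piece of knowledge whose step the user
     rejects, i.e. that raises a human-machine inconsistency."""
     for knowledge_id, step in explain(kb, conclusion):
         if not user_accepts(step):
             print(f"removing {knowledge_id!r} (step rejected by the user)")
             del kb[knowledge_id]
     return kb

 # Toy knowledge base: the user rejects the over-general rule "birds fly",
 # so that rule is the piece of knowledge removed by the repair loop.
 kb = {
     "r1": "penguins are birds",
     "r2": "birds fly",
 }
 repaired = fika_repair(kb, "penguins fly", lambda step: "birds fly" not in step)
 print(repaired)                                  # -> {'r1': 'penguins are birds'}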

Deliverables :

