A Knowledge-Light Mechanism for Explanation in Case-Based Reasoning
Abstract:
Decision support systems are currently achieving higher classification accuracies by using more
complex reasoning mechanisms. Examples of such mechanisms include support vector machines
and neural networks. However, in spite of these increases in accuracy, many decision support
systems are not accepted by users. In domains where there is a high cost associated with incorrect
classifications, such as medical domains, users are not always willing to accept a decision support
system's classification without proper justification.
In every walk of life, from the home to the workplace, people use explanations all the time
to justify their opinions. Explanations can have many different forms depending on the context
in which they are used. Over the last few decades there has been a vast amount of research by
philosophers into the importance and the requirements of suitable explanations.
In spite of the importance of suitable explanations in justifying an opinion, many decision support
systems fall short of this requirement. Part of the reason for this is that in many types of decision
support systems it is often extremely difficult, if not impossible, to produce explanations. This
is particularly the case for black box systems such as support vector machines. However, even in
systems such as rule-based systems, where the explanation can be in the form of a rule, the
explanations can often be complicated and leave users confused.
Alternatively, Case-Based Reasoning (CBR) systems lend themselves naturally to producing
explanations. As the reasoning in CBR systems is performed on the most similar past case(s) to
a current problem, these similar cases can be used as an explanation for a classification. As these
similar cases are real past problems, they are generally easily understood by users.
It is our belief, however, that in CBR systems there are more suitable cases to use as
an explanation than simply the most similar ones. We believe that these more suitable
cases lie between the problem case and the perceived decision boundary. This results in the cases
forming an a fortiori argument. We describe a framework that we have developed for selecting
such cases.
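The selection idea above can be sketched in code. This is a minimal illustration only, not the thesis's actual framework: the function names and the use of distance-to-the-nearest-other-class case as a proxy for closeness to the decision boundary are assumptions made for the example.

```python
# Illustrative sketch of a fortiori explanation-case selection.
# Assumption: cases are numeric feature tuples; "closeness to the decision
# boundary" is approximated by the distance to the nearest case of a
# different class (a 1-NN margin proxy, not the thesis's exact criterion).

def margin(case, cases, labels, label):
    """Proxy for closeness to the decision boundary: Euclidean distance
    from `case` to the nearest stored case of a *different* class."""
    other = [c for c, l in zip(cases, labels) if l != label]
    return min(sum((a - b) ** 2 for a, b in zip(case, o)) ** 0.5
               for o in other)

def a_fortiori_case(query, cases, labels, predicted):
    """Prefer a same-class case lying closer to the boundary than the
    query: if even that weaker example belongs to the predicted class,
    then, a fortiori, so does the query."""
    q_margin = margin(query, cases, labels, predicted)
    candidates = [c for c, l in zip(cases, labels)
                  if l == predicted
                  and margin(c, cases, labels, predicted) < q_margin]
    if not candidates:
        # Fall back to conventional CBR: any case of the predicted class.
        candidates = [c for c, l in zip(cases, labels) if l == predicted]
    # Among the candidates, return the one nearest the query.
    return min(candidates,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(c, query)))
```

For a query safely inside class A's region, this returns a same-class case nearer the boundary, supporting the argument "even this more marginal case is class A, so your case certainly is".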
We also believe that it is often not enough, however suitable the case, simply to present a case as
an explanation. In our framework we included a mechanism for generating explanatory text that
can express why the case is suitable or, in some situations, aspects of the case that may not be
suitable. This explanatory text can further assist users in deciding whether they agree with the
opinion of the CBR system.
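One simple way such explanatory text could be produced is by comparing the explanation case to the query feature by feature. The sketch below is a hypothetical illustration; the function, feature names, and phrasing are assumptions and do not reproduce the thesis's actual text-generation mechanism.

```python
# Illustrative sketch: per-feature explanatory text comparing the query
# case with the retrieved explanation case. Names and wording are
# hypothetical, not taken from the thesis.

def explain(query, case, feature_names, predicted):
    """Return a sentence describing how the explanation case's feature
    values relate to the query's."""
    parts = []
    for name, q, c in zip(feature_names, query, case):
        if q == c:
            parts.append(f"{name} is the same in both cases")
        else:
            direction = "higher" if c > q else "lower"
            parts.append(
                f"the explanation case has a {direction} {name} ({c} vs {q})")
    return (f"This case was classified as {predicted}: "
            + "; ".join(parts) + ".")
```

For example, with hypothetical features `temperature` and `age`, a call like `explain((37.0, 2), (38.0, 2), ["temperature", "age"], "admit")` yields a sentence noting the higher temperature and the matching age.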
Based on the developed framework, we implemented a decision support system for use in the
domain of Bronchiolitis, a viral infection that affects young children. This system was used and
evaluated in the Kern Medical Center, Bakersfield, California during their Bronchiolitis season.
Author: Doyle, Donal
Advisor: Cunningham, Padraig
Type of material: Doctoral (Doctor of Philosophy (Ph.D.))
Availability: Full text available
Keywords: Computer Science