Hi Tanel,
Thanks for your clear statement of the classical Nixon/Quaker example
below. I have put a version of it on our community demo ID [2], so
that anyone can run it or change it.
Yes, logic programming does mix declarative and procedural
concepts.
Our system uses a different inference method (based on [1]) from, say,
Prolog's, so non-programmer authors have much less to worry about
procedurally. In other words, the system is more declarative than
one would expect from a logic programming language.
I was really hoping for more detail on your statement:
However, these kinds of solutions are never universal nor
domain-independent. The solutions become also more and more
complex as they target wider and wider applicability. The
same, seemingly straightforward negation-as-failure brings a huge
amount of complexity into play.
Perhaps what you are partly getting at is that controlled
vocabulary systems are not domain-independent? However,
our system supports an open vocabulary, in a lightweight manner
that does not require dictionary or grammar maintenance from the
author/user [3,4]. The examples provided include reasoning over a
gene ontology, phone billing, supply chain, reasoning over RDF, database
security, and so on. So it's hard to detect any domain-dependence in
the vocabulary part of the approach.
Thanks in advance for your further thoughts about
this, -- Adrian
[1] Backchain Iteration: Towards a Practical Inference Method that is
Simple Enough to be Proved Terminating, Sound and Complete. Journal of
Automated Reasoning, 11:1-22
[2] The demo ID of the Internet Business Logic system at
www.reengineeringllc.com. Anyone can run and change the examples
there, using a browser.
[3] Frequently asked question number 8 at www.reengineeringllc.com
[4] "Semantic Web Presentation", at the above url
At 07:42 PM 9/24/04 +0300, you wrote:
Adrian Walker wrote:
Can you show an example? Or better yet, write it and run it in the online system [1]?
I am not really sure what kind of an example you expect? I'll just
bring the classical ones: tell me if you meant something different.
One nice classical example is the quaker/republican example. Let us have
quaker(nixon).
republican(nixon).
quaker(x) & notknown not pacifist(x) => pacifist(x).
republican(x) & notknown pacifist(x) => not pacifist(x).
where "notknown" is the negation-as-failure operator and "not" is
the real, logical "not".
What follows: "pacifist(nixon)" or "not pacifist(nixon)"?
To solve this in one concrete way, you need to add an order
(or priority) to the rules. Now, adding priorities is a nice idea,
but it moves even further away from FOL, and the semantics
of the system becomes really unclear. Every solution brings
more complexity with it.
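One ad hoc way to encode such a priority in the Prolog sketch above is
to let the republican rule win outright, replacing the two rules with:

  % Priority given to the republican rule: it no longer checks
  % the quaker rule at all, so the conflict is resolved by fiat.
  non_pacifist(X) :- republican(X).
  pacifist(X)     :- quaker(X), \+ non_pacifist(X).

  % ?- non_pacifist(nixon).   succeeds
  % ?- pacifist(nixon).       fails
  % Reversing the priority flips both answers, so the conclusion is
  % determined by the chosen ordering, not by the original rules.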
Another classical observation is that whenever you have a rule
notknown P(x) => Q(x)
you need to be able to prove that P(a) cannot be
derived, in order to derive Q(a). As we know,
full FOL is undecidable and there exists no
algorithm which could always find out that P(a)
is not derivable: sometimes you may get lucky
and find it out (if you have a good algorithm and
a simple problem), but not always.
Hence you may not be able to derive Q(a) even if
P(a) really is not derivable.
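A minimal Prolog illustration of this point (a contrived program,
purely to show the behaviour):

  % p(a) has no proof, but the search for one never ends.
  p(X) :- p(X).

  q(X) :- \+ p(X).

  % ?- q(a).
  % Under plain SLDNF resolution (no tabling), answering q(a) means
  % exhausting the search for p(a); here that search loops forever,
  % so q(a) is never derived even though p(a) is in fact not derivable.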
Now, even in a lucky case (simple problem and a good
algorithm) it typically takes huge resources to show
that P(a) cannot be derived: much more than just deriving
something. Any application of negation-as-failure tends
to be very expensive.
You are of course right in the sense that clever
encoding will give you cases where such problems
can be solved fast: that is the Prolog approach.
But this is really procedural programming,
not using logic declaratively.
Tanel