Java OWL API and HermiT reasoning
I am trying to use the HermiT reasoner to check consistency. By default, the HermiT reasoner does not provide any justification/explanation for an inconsistency.

Edited version: I am currently trying to use OWLReasoner, but it still throws an error.
import java.util.Set;
import org.semanticweb.HermiT.Reasoner;
import org.semanticweb.owl.explanation.api.Explanation;
import org.semanticweb.owl.explanation.api.ExplanationGeneratorFactory;
import org.semanticweb.owl.explanation.api.ExplanationManager;
import org.semanticweb.owl.explanation.impl.blackbox.checker.InconsistentOntologyExplanationGeneratorFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLClass;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.reasoner.Node;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owl.explanation.api.ExplanationGenerator;
import org.semanticweb.owlapi.model.OWLDataFactory;
import org.semanticweb.owlapi.model.OWLNamedIndividual;
import org.semanticweb.owlapi.model.OWLOntologyCreationException;
import org.semanticweb.owlapi.reasoner.OWLReasonerFactory;
public class ConsistencyChecker {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager m = OWLManager.createOWLOntologyManager();
        OWLOntology o = m.loadOntologyFromOntologyDocument(IRI.create("http://www.cs.ox.ac.uk/isg/ontologies/UID/00793.owl"));
        // Reasoner hermit = new Reasoner(o);
        OWLReasoner owlreasoner = new Reasoner.ReasonerFactory().createReasoner(o);
        System.out.println(owlreasoner.isConsistent());
        //System.out.println(hermit.isConsistent());
        //---------------------------- Copied from example ---------
        OWLDataFactory df = m.getOWLDataFactory();
        OWLClass testClass = df.getOWLClass(IRI.create("urn:test#testclass"));
        m.addAxiom(o, df.getOWLSubClassOfAxiom(testClass, df.getOWLNothing()));
        OWLNamedIndividual individual = df.getOWLNamedIndividual(IRI.create("urn:test#testindividual"));
        m.addAxiom(o, df.getOWLClassAssertionAxiom(testClass, individual));
        //----------------------------------------------------------
        Node<OWLClass> unsatisfiableClasses = owlreasoner.getUnsatisfiableClasses();
        //Node<OWLClass> unsatisfiableClasses = hermit.getUnsatisfiableClasses();
        for (OWLClass owlClass : unsatisfiableClasses) {
            System.out.println(owlClass.getIRI());
        }
        //-----------------------------
        ExplanationGeneratorFactory<OWLAxiom> genFac = ExplanationManager.createExplanationGeneratorFactory((OWLReasonerFactory) owlreasoner);
        ExplanationGenerator<OWLAxiom> gen = genFac.createExplanationGenerator(o);
        //-------------------------
        InconsistentOntologyExplanationGeneratorFactory igf = new InconsistentOntologyExplanationGeneratorFactory((OWLReasonerFactory) owlreasoner, 10000);
        //InconsistentOntologyExplanationGeneratorFactory igf = new InconsistentOntologyExplanationGeneratorFactory((OWLReasonerFactory) hermit, 10000);
        ExplanationGenerator<OWLAxiom> generator = igf.createExplanationGenerator(o);
        OWLAxiom entailment = df.getOWLClassAssertionAxiom(df.getOWLNothing(), individual);
        //-------------
        Set<Explanation<OWLAxiom>> expl = gen.getExplanations(entailment, 5);
        //------------
        System.out.println("Explanation " + generator.getExplanations(entailment, 5));
    }
}
Any help with integrating the OWL Explanation API [1] with the HermiT reasoner/OWLReasoner would be much appreciated.
[1]

The error is because you are casting an OWLReasoner to an OWLReasonerFactory. HermiT's OWLReasonerFactory is the one you already used a few lines above:

new Reasoner.ReasonerFactory()
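Concretely, the question's ConsistencyChecker can be repaired by handing the explanation generator that factory rather than the reasoner instance. A minimal sketch (the class name FixedConsistencyChecker is made up for illustration; it assumes the same owlexplanation library and ontology IRI as in the question, with the OWL API, HermiT, and owlexplanation jars on the classpath):

```java
import java.util.Set;

import org.semanticweb.HermiT.Reasoner;
import org.semanticweb.owl.explanation.api.Explanation;
import org.semanticweb.owl.explanation.api.ExplanationGenerator;
import org.semanticweb.owl.explanation.impl.blackbox.checker.InconsistentOntologyExplanationGeneratorFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.OWLReasonerFactory;

public class FixedConsistencyChecker {
    public static void main(String[] args) throws Exception {
        OWLOntologyManager m = OWLManager.createOWLOntologyManager();
        OWLOntology o = m.loadOntologyFromOntologyDocument(
                IRI.create("http://www.cs.ox.ac.uk/isg/ontologies/UID/00793.owl"));
        OWLDataFactory df = m.getOWLDataFactory();

        // Make the ontology inconsistent, exactly as in the question.
        OWLClass testClass = df.getOWLClass(IRI.create("urn:test#testclass"));
        m.addAxiom(o, df.getOWLSubClassOfAxiom(testClass, df.getOWLNothing()));
        OWLNamedIndividual individual = df.getOWLNamedIndividual(IRI.create("urn:test#testindividual"));
        m.addAxiom(o, df.getOWLClassAssertionAxiom(testClass, individual));

        // The key change: pass a factory, not a (cast) reasoner instance.
        OWLReasonerFactory reasonerFactory = new Reasoner.ReasonerFactory();
        InconsistentOntologyExplanationGeneratorFactory igf =
                new InconsistentOntologyExplanationGeneratorFactory(reasonerFactory, 10000L);
        ExplanationGenerator<OWLAxiom> generator = igf.createExplanationGenerator(o);

        OWLAxiom entailment = df.getOWLClassAssertionAxiom(df.getOWLNothing(), individual);
        Set<Explanation<OWLAxiom>> explanations = generator.getExplanations(entailment, 5);
        System.out.println("Explanation " + explanations);
    }
}
```

The ClassCastException goes away because the explanation generator genuinely needs a factory: the black-box approach repeatedly creates fresh reasoners over the axiom subsets it tests, which a single reasoner instance cannot provide.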
The output I get from ConsistencyChecker before the crash is:

true
http://www.w3.org/2002/07/owl#Nothing
http://www.co-ode.org/ontologies/pizza/pizza.owl#CheeseyVegetableTopping
http://www.co-ode.org/ontologies/pizza/pizza.owl#IceCream
Exception in thread "main" java.lang.ClassCastException: org.semanticweb.HermiT.Reasoner cannot be cast to org.semanticweb.owlapi.reasoner.OWLReasonerFactory
    at ConsistencyChecker.main(ConsistencyChecker.java:82)
//package org.semanticweb.HermiT.examples;
import java.util.Set;
import org.semanticweb.HermiT.Configuration;
import org.semanticweb.HermiT.Reasoner;
import org.semanticweb.HermiT.Reasoner.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLClass;
import org.semanticweb.owlapi.model.OWLDataFactory;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import com.clarkparsia.owlapi.explanation.BlackBoxExplanation;
import com.clarkparsia.owlapi.explanation.HSTExplanationGenerator;
public class Explanations {
    public static void main(String[] args) throws Exception {
        // First, we create an OWLOntologyManager object. The manager will load and
        // save ontologies.
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        // We will create several things, so we save an instance of the data factory
        OWLDataFactory dataFactory = manager.getOWLDataFactory();
        // Now, we create the file from which the ontology will be loaded.
        // Here the ontology is stored in a file locally in the ontologies subfolder
        // of the examples folder.
        //File inputOntologyFile = new File("examples/ontologies/pizza.owl");
        // We use the OWL API to load the ontology.
        //OWLOntology ontology=manager.loadOntologyFromOntologyDocument(inputOntologyFile);
        // We use the OWL API to load the Pizza ontology.
        OWLOntology ontology = manager.loadOntologyFromOntologyDocument(IRI.create("http://www.cs.ox.ac.uk/isg/ontologies/UID/00793.owl"));
        // Let's make things worse and turn Pizza into an inconsistent ontology by asserting that the
        // unsatisfiable icecream class has some instance.
        // First, create an instance of the OWLClass object for the unsatisfiable icecream class.
        IRI icecreamIRI = IRI.create("http://www.co-ode.org/ontologies/pizza/pizza.owl#IceCream");
        OWLClass icecream = dataFactory.getOWLClass(icecreamIRI);
        // Now we can start and create the reasoner. Since explanation is not natively supported by
        // HermiT and is realised in the OWL API, we need to instantiate HermiT
        // as an OWLReasoner. This is done via a ReasonerFactory object.
        ReasonerFactory factory = new ReasonerFactory();
        // We don't want HermiT to throw an exception for inconsistent ontologies because then we
        // can't explain the inconsistency. This can be controlled via a configuration setting.
        Configuration configuration = new Configuration();
        configuration.throwInconsistentOntologyException = false;
        // The factory can now be used to obtain an instance of HermiT as an OWLReasoner.
        OWLReasoner reasoner = factory.createReasoner(ontology, configuration);
        // Let us confirm that icecream is indeed unsatisfiable:
        System.out.println("Is icecream satisfiable? " + reasoner.isSatisfiable(icecream));
        System.out.println("Computing explanations...");
        // Now we instantiate the explanation classes
        BlackBoxExplanation exp = new BlackBoxExplanation(ontology, factory, reasoner);
        HSTExplanationGenerator multExplanator = new HSTExplanationGenerator(exp);
        // Now we can get explanations for the unsatisfiability.
        Set<Set<OWLAxiom>> explanations = multExplanator.getExplanations(icecream);
        // Let us print them. Each explanation is one possible set of axioms that cause the
        // unsatisfiability.
        for (Set<OWLAxiom> explanation : explanations) {
            System.out.println("------------------");
            System.out.println("Axioms causing the unsatisfiability: ");
            for (OWLAxiom causingAxiom : explanation) {
                System.out.println(causingAxiom);
            }
            System.out.println("------------------");
        }
        // Let us make the ontology inconsistent to also get explanations for an
        // inconsistency, which is slightly more involved since we dynamically
        // have to change the factory constructor; otherwise, we can't suppress
        // the inconsistent ontology exceptions that the OWL API requires a
        // reasoner to throw.
        // Let's start by adding a dummy individual to the unsatisfiable Icecream class.
        // This will cause an inconsistency.
        OWLAxiom ax = dataFactory.getOWLClassAssertionAxiom(icecream, dataFactory.getOWLNamedIndividual(IRI.create("http://www.co-ode.org/ontologies/pizza/pizza.owl#dummyIndividual")));
        manager.addAxiom(ontology, ax);
        // Let us confirm that the ontology is inconsistent
        reasoner = factory.createReasoner(ontology, configuration);
        System.out.println("Is the changed ontology consistent? " + reasoner.isConsistent());
        // Ok, here we go. Let's see why the ontology is inconsistent.
        System.out.println("Computing explanations for the inconsistency...");
        factory = new Reasoner.ReasonerFactory() {
            protected OWLReasoner createHermiTOWLReasoner(org.semanticweb.HermiT.Configuration configuration, OWLOntology ontology) {
                // don't throw an exception since otherwise we cannot compute explanations
                configuration.throwInconsistentOntologyException = false;
                return new Reasoner(configuration, ontology);
            }
        };
        exp = new BlackBoxExplanation(ontology, factory, reasoner);
        multExplanator = new HSTExplanationGenerator(exp);
        // Now we can get explanations for the inconsistency
        explanations = multExplanator.getExplanations(dataFactory.getOWLThing());
        // Let us print them. Each explanation is one possible set of axioms that cause the
        // inconsistency.
        for (Set<OWLAxiom> explanation : explanations) {
            System.out.println("------------------");
            System.out.println("Axioms causing the inconsistency: ");
            for (OWLAxiom causingAxiom : explanation) {
                System.out.println(causingAxiom);
            }
            System.out.println("------------------");
        }
    }
}