Java Storm word count example error: InvalidTopologyException


I'm fairly new to programming and to Storm. I'm working from an example in the book "Getting Started with Apache Storm". I'm using Storm 1.1.0 and JDK 1.8. When I try to run the code in Eclipse Neon, or via the command "storm jar …", I get the following error:

[main] ERROR o.a.s.s.o.a.z.s.NIOServerCnxnFactory - Thread Thread[main,5,main] died org.apache.storm.generated.InvalidTopologyException:null
Does anyone know what causes this error and how I can fix it?

The code I wrote is as follows.

The spout:

public class WordReader implements IRichSpout {

TopologyContext context;
SpoutOutputCollector collector;
FileReader filereader;

private boolean completed = false;

public void ack(Object msgId){
    System.out.println("OK: "+msgId);
}

public void fail(Object msgId){
    System.out.println("FAIL: "+msgId);
}

public void nextTuple(){
    if (completed){
        try {
            Thread.sleep(1000);
        } catch (Exception e) {
            //do nothing
        }
        //it should return to function
        return;
    }
    String str;
    // wrap the FileReader so the file can be read line by line
    BufferedReader reader = new BufferedReader(filereader);
    try {
        while ((str = reader.readLine()) != null){
            // emit one tuple per line, using the line itself as the message id
            this.collector.emit(new Values(str), str);
        }
    }catch (Exception e) {
            throw new RuntimeException("Error reading tuple",e);
        } finally {
            completed = true;
        }
    }
public void open(Map conf, TopologyContext context,
        SpoutOutputCollector collector){
    try {
        this.context = context;

        this.filereader = new FileReader(conf.get("words").toString());
    }catch(FileNotFoundException e) {
        throw new RuntimeException("Error!");
    }
    this.collector = collector;
}
public void declareOutputFileds(OutputFieldsDeclarer declarer)
{
    declarer.declare(new Fields("line"));
}

public void close() {
    // TODO Auto-generated method stub

}

public void activate() {
    // TODO Auto-generated method stub

}

public void deactivate() {
    // TODO Auto-generated method stub

}

public void declareOutputFields(OutputFieldsDeclarer declarer) {
    // TODO Auto-generated method stub

}

public Map<String, Object> getComponentConfiguration() {
    // TODO Auto-generated method stub
    return null;
}

}

When you build the topology there is a typo in word-normalizer; it should be connected to word-counter:
builder.setBolt("word-normalizer", new WordNormalizer()).shuffleGrouping("word-reader");
builder.setBolt("word-counter", new WordCounter()).shuffleGrouping("word-normalizer");

I did what you said, but it doesn't work at all; I get the InvalidTopologyException again. The word-counter bolt and the main method are below:
String name;
Integer id;
Map<String, Integer> counters;
private OutputCollector collector;


public void execute(Tuple input) {
    String str = input.getString(0);
    if(!counters.containsKey(str)){
        counters.put(str, 1);
    }else{
        Integer c = counters.get(str) + 1;
        counters.put(str, c);
    }
    collector.ack(input);
    }
public void prepare(Map conf, TopologyContext context, OutputCollector collector)
{
    this.counters = new HashMap<String, Integer>();
    this.collector = collector;
    this.name = context.getThisComponentId();
    this.id = context.getThisTaskId();
}

public void cleanup() {
    System.out.println("-- Word Counter ["+name+"-"+id+"] --");
    for(Map.Entry<String, Integer> entry : counters.entrySet()){
        System.out.println(entry.getKey()+": "+entry.getValue());
        }
    }
public void declareOutputFields(OutputFieldsDeclarer declarer) {
    // TODO Auto-generated method stub

}
public Map<String, Object> getComponentConfiguration() {
    // TODO Auto-generated method stub
    return null;
}}
public static void main(String[] args) {
    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("word-reader", new WordReader());
    builder.setBolt("word-normalizer", new WordNormalizer()).shuffleGrouping("word-reader");
    builder.setBolt("word-counter", new WordCounter()).shuffleGrouping("word-normalizer");
    Config conf = new Config();
    conf.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
    conf.put("word", 0);
    conf.setDebug(true);
    LocalCluster cluster = new LocalCluster();
    cluster.submitTopology("word", conf, builder.createTopology());
    try{
        Thread.sleep(2000);
    } catch (Exception e) {
        // TODO: handle exception
    }
    cluster.shutdown();
}}
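
The WordNormalizer bolt referenced above is not shown in the post. Based only on how it is wired between word-reader and word-counter, a minimal sketch of such a bolt could look like this; the splitting logic and the "word" field name are assumptions, not the book's exact code:

import java.util.Map;
import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.IRichBolt;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class WordNormalizer implements IRichBolt {
    private OutputCollector collector;

    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    public void execute(Tuple input) {
        // split each incoming line into words and emit them one by one
        String line = input.getString(0);
        for (String word : line.split(" ")) {
            word = word.trim().toLowerCase();
            if (!word.isEmpty()) {
                collector.emit(new Values(word));
            }
        }
        collector.ack(input);
    }

    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // the declared field name must match what downstream components expect
        declarer.declare(new Fields("word"));
    }

    public void cleanup() { }

    public Map<String, Object> getComponentConfiguration() {
        return null;
    }
}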
builder.setBolt("word-normalizer", new WordNormalizer()).shuffleGrouping("word-reader");
builder.setBolt("word-counter", new WordCounter()).shuffleGrouping("word-normalizer");