java.lang.NoClassDefFoundError when creating a Cassandra trigger


Hi, I'm writing a small Cassandra trigger that sends information to Kafka after an insert into a certain table. Here is my trigger code:

// Imports assumed for cassandra-all 3.11 / kafka-clients 0.11 / json-simple:
import org.apache.cassandra.config.CFMetaData;
import org.apache.cassandra.db.Clustering;
import org.apache.cassandra.db.Mutation;
import org.apache.cassandra.db.partitions.Partition;
import org.apache.cassandra.db.rows.Cell;
import org.apache.cassandra.db.rows.Unfiltered;
import org.apache.cassandra.db.rows.UnfilteredRowIterator;
import org.apache.cassandra.triggers.ITrigger;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.json.simple.JSONObject;
import java.nio.charset.Charset;
import java.text.SimpleDateFormat;
import java.util.*;

public class InsertDataTrigger implements ITrigger {

    public Collection<Mutation> augment(Partition update) {

        //checking if trigger works and some debug info;
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        System.out.println("Hello " + dateFormat.format(new Date()));
        System.out.println("This Insert Data Trigger");
        System.out.println("default charset " + Charset.defaultCharset());      //IMPORTANT check if it's important

        //here we're gonna build the message to kafka based on inserted data
        try {
            UnfilteredRowIterator it = update.unfilteredIterator();
            CFMetaData cfMetaData = update.metadata();

            System.out.println("PartitionKey " + new String(update.partitionKey().getKey().array()));
            System.out.println("update.metadata().clusteringColumns().toString() " + update.metadata().clusteringColumns().toString());

            while (it.hasNext()) {
                JSONObject message = new JSONObject();

                Unfiltered un = it.next();
                Clustering clt = (Clustering) un.clustering();

                message.put("partitionkey", new String(update.partitionKey().getKey().array()));

                System.out.println("clt.toString(cfMetaData) " + clt.toString(cfMetaData));
                System.out.println("clt.getRawValues() " + new String(clt.getRawValues()[0].array()));
                System.out.println("partition.columns().toString() " + update.columns().toString());

                message.put("datetime", new String(clt.getRawValues()[0].array()));

                Iterator<Cell> cells = update.getRow(clt).cells().iterator();

                while (cells.hasNext()) {
                    Cell cell = cells.next();
                    System.out.println("cell.column().name.toString() " + cell.column().name.toString());
                    System.out.println("cell.toString()" + cell.toString());
                    Double x = cell.value().getDouble();
                    System.out.println("cell.value().getDouble() " + x);
                    //if(cell.column().name.toString() == "value")
                    System.out.println(x);
                    message.put(cell.column().name.toString(), x);
                    //else
                    //   message.put(cell.column().name.toString(),cell.value().toString());
                }
                System.out.println("un.toString()" + un.toString(cfMetaData));

                if (!message.isEmpty()) {
                    System.out.println(message.toString());

                    //Sending data to kafka
                    Properties props = new Properties();
                    props.put("bootstrap.servers", "localhost:9092");
                    props.put("acks", "all");
                    props.put("retries", 0);
                    props.put("batch.size", 16384);
                    props.put("linger.ms", 1);
                    props.put("buffer.memory", 33554432);
                    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
                    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

                    Producer<String, String> producer = new KafkaProducer<>(props);
                    producer.send(new ProducerRecord<>("test", message.toString()));//move topic name to some properties
                    producer.close();
                }


            }
        } catch (Exception e) {
            e.printStackTrace();
        }

        return Collections.emptyList();
    }
}
Here is my pom file:

<?xml version="1.0" encoding="UTF-8"?>
  <project xmlns="http://maven.apache.org/POM/4.0.0"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

   <build>
       <plugins>
           <plugin>
               <groupId>org.apache.maven.plugins</groupId>
               <artifactId>maven-compiler-plugin</artifactId>
               <version>3.1</version>
               <configuration>
                   <source>1.8</source>
                   <target>1.8</target>
               </configuration>
           </plugin>
       </plugins>
   </build>

   <modelVersion>4.0.0</modelVersion>

   <groupId>io.github.carldata</groupId>
   <artifactId>InsertDataTrigger</artifactId>
   <version>1.0</version>

   <dependencies>
       <!-- https://mvnrepository.com/artifact/org.apache.cassandra/cassandra-all -->
       <dependency>
           <groupId>org.apache.cassandra</groupId>
           <artifactId>cassandra-all</artifactId>
           <version>3.11.0</version>
       </dependency>

       <dependency>
           <groupId>org.apache.kafka</groupId>
           <artifactId>kafka-clients</artifactId>
           <version>0.11.0.0</version>
       </dependency>
   </dependencies>

</project>

The project builds fine and produces a jar file, but when I try to create the trigger in Cassandra, it fails with the exception above.

Most likely the kafka-clients jar is not in Cassandra's lib directory. That is, unless your project bundles that dependency itself (for example by building a fat/uber jar).
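One way to bundle the dependency, assuming the Maven setup from the pom above, is the maven-shade-plugin; the plugin version here is illustrative:

```xml
<!-- Added to <build><plugins> of the pom above: builds an uber jar
     that bundles kafka-clients, so Cassandra can load the trigger
     without extra jars in its lib directory. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

Marking cassandra-all with `<scope>provided</scope>` in the dependency declaration keeps Cassandra's own classes out of the uber jar, since the Cassandra runtime already supplies them.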


You may run into conflicts between the kafka-clients jar and Cassandra's own dependencies. In particular, org.xerial.snappy:snappy-java ships in different versions in each. It may still work, but it's something to watch for. If it causes problems, you can build your own Kafka client jar and shade its dependencies so they don't conflict.
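Shading the conflicting packages can be done with the maven-shade-plugin's `<relocations>` configuration; the `shadedPattern` below is just an example name:

```xml
<!-- Inside the shade plugin's <configuration>: rewrite the snappy
     classes (and all references to them in the uber jar) into a
     private package, so they can't clash with the snappy-java
     version that ships with Cassandra. -->
<configuration>
    <relocations>
        <relocation>
            <pattern>org.xerial.snappy</pattern>
            <shadedPattern>shaded.org.xerial.snappy</shadedPattern>
        </relocation>
    </relocations>
</configuration>
```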

Thank you very much. I built an uber jar and it solved the problem.