Java: Can't get the Spark Cassandra connector to work when launched from a Spring Boot web application


I'm trying to launch a Spark job that uses the Spark Cassandra connector, with a Spring Boot application acting as the driver. Spark jobs that don't touch Cassandra, such as SparkPi, complete correctly from the Spring Boot application, but any job that uses the Spark Cassandra connector fails to connect to the Cassandra server.

I can run these problematic jobs directly from the command line, so I believe the Cassandra server is configured correctly. There seems to be some kind of conflict with Spring Boot, but I haven't been able to pin down the root cause.
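One possible source of such a conflict (a guess on my part, not something the logs confirm) is that the Spring Boot fat jar ships different versions of shared libraries, such as Guava or Netty, than the ones the DataStax driver was built against. A small stdlib-only sketch to check which jar each key class is actually loaded from at runtime:

```java
// Diagnostic sketch: report the jar each class is loaded from, to spot
// classpath conflicts between Spring Boot's managed dependencies and the
// ones bundled with the Spark Cassandra connector.
public class ClasspathCheck {

    // Returns "className -> location", "-> bootstrap classloader" for
    // JDK platform classes, or "-> not on classpath" if absent.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return className + " -> "
                    + (src != null ? src.getLocation() : "bootstrap classloader");
        } catch (ClassNotFoundException e) {
            return className + " -> not on classpath";
        }
    }

    public static void main(String[] args) {
        // Classes the DataStax driver is sensitive to (versions must match).
        System.out.println(locate("com.google.common.util.concurrent.ListenableFuture")); // Guava
        System.out.println(locate("io.netty.channel.epoll.Epoll"));                       // Netty epoll
        System.out.println(locate("com.datastax.driver.core.Cluster"));                   // Java driver
    }
}
```

Running this inside the Spring Boot app and comparing against the command-line run may show which jar wins when two versions of the same library are present.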

Here are the logs from a run of SparkPi:

2019-12-20 14:44:18.472  INFO 1 --- [io-25001-exec-4] org.apache.spark.SparkContext            : Starting job: reduce at SparkPi.java:26
2019-12-20 14:44:18.498  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Got job 0 (reduce at SparkPi.java:26) with 10 output partitions
2019-12-20 14:44:18.498  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Final stage: ResultStage 0 (reduce at SparkPi.java:26)
2019-12-20 14:44:18.498  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Parents of final stage: List()
2019-12-20 14:44:18.500  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Missing parents: List()
2019-12-20 14:44:18.505  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.java:21), which has no missing parents
2019-12-20 14:44:18.601  INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore     : Block broadcast_0 stored as values in memory (estimated size 3.1 KB, free 93.3 MB)
2019-12-20 14:44:18.640  INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore     : Block broadcast_0_piece0 stored as bytes in memory (estimated size 1927.0 B, free 93.3 MB)
2019-12-20 14:44:18.642  INFO 1 --- [er-event-loop-3] o.apache.spark.storage.BlockManagerInfo  : Added broadcast_0_piece0 in memory on 0da01ab67fc0:45075 (size: 1927.0 B, free: 93.3 MB)
2019-12-20 14:44:18.645  INFO 1 --- [uler-event-loop] org.apache.spark.SparkContext            : Created broadcast 0 from broadcast at DAGScheduler.scala:1161
2019-12-20 14:44:18.658  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.java:21) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2019-12-20 14:44:18.659  INFO 1 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl    : Adding task set 0.0 with 10 tasks
2019-12-20 14:44:18.805  WARN 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Stage 0 contains a task of very large size (984 KB). The maximum recommended task size is 100 KB.
2019-12-20 14:44:18.807  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 0.0 in stage 0.0 (TID 0, 172.20.57.64, executor 1, partition 0, PROCESS_LOCAL, 1007867 bytes)
2019-12-20 14:44:18.868  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 1.0 in stage 0.0 (TID 1, 172.20.57.62, executor 0, partition 1, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:18.923  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 2.0 in stage 0.0 (TID 2, 172.20.57.64, executor 1, partition 2, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:18.955  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 3.0 in stage 0.0 (TID 3, 172.20.57.62, executor 0, partition 3, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:18.986  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 4.0 in stage 0.0 (TID 4, 172.20.57.64, executor 1, partition 4, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:19.016  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 5.0 in stage 0.0 (TID 5, 172.20.57.62, executor 0, partition 5, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:19.046  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 6.0 in stage 0.0 (TID 6, 172.20.57.64, executor 1, partition 6, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:19.086  INFO 1 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 7.0 in stage 0.0 (TID 7, 172.20.57.62, executor 0, partition 7, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:21.923  INFO 1 --- [er-event-loop-0] o.apache.spark.storage.BlockManagerInfo  : Added broadcast_0_piece0 in memory on 172.20.57.62:46673 (size: 1927.0 B, free: 366.3 MB)
2019-12-20 14:44:22.220  INFO 1 --- [er-event-loop-0] o.apache.spark.storage.BlockManagerInfo  : Added broadcast_0_piece0 in memory on 172.20.57.64:45503 (size: 1927.0 B, free: 366.3 MB)
2019-12-20 14:44:22.279  INFO 1 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager  : Starting task 8.0 in stage 0.0 (TID 8, 172.20.57.62, executor 0, partition 8, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:22.282  INFO 1 --- [result-getter-0] o.apache.spark.scheduler.TaskSetManager  : Finished task 5.0 in stage 0.0 (TID 5) in 3294 ms on 172.20.57.62 (executor 0) (1/10)
2019-12-20 14:44:22.317  INFO 1 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager  : Starting task 9.0 in stage 0.0 (TID 9, 172.20.57.62, executor 0, partition 9, PROCESS_LOCAL, 1007872 bytes)
2019-12-20 14:44:22.317  INFO 1 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 3.0 in stage 0.0 (TID 3) in 3393 ms on 172.20.57.62 (executor 0) (2/10)
2019-12-20 14:44:22.327  INFO 1 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager  : Finished task 1.0 in stage 0.0 (TID 1) in 3518 ms on 172.20.57.62 (executor 0) (3/10)
2019-12-20 14:44:22.328  INFO 1 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 7.0 in stage 0.0 (TID 7) in 3281 ms on 172.20.57.62 (executor 0) (4/10)
2019-12-20 14:44:22.392  INFO 1 --- [result-getter-0] o.apache.spark.scheduler.TaskSetManager  : Finished task 8.0 in stage 0.0 (TID 8) in 181 ms on 172.20.57.62 (executor 0) (5/10)
2019-12-20 14:44:22.417  INFO 1 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 9.0 in stage 0.0 (TID 9) in 130 ms on 172.20.57.62 (executor 0) (6/10)
2019-12-20 14:44:22.548  INFO 1 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager  : Finished task 0.0 in stage 0.0 (TID 0) in 3868 ms on 172.20.57.64 (executor 1) (7/10)
2019-12-20 14:44:22.548  INFO 1 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 2.0 in stage 0.0 (TID 2) in 3679 ms on 172.20.57.64 (executor 1) (8/10)
2019-12-20 14:44:22.551  INFO 1 --- [result-getter-0] o.apache.spark.scheduler.TaskSetManager  : Finished task 4.0 in stage 0.0 (TID 4) in 3595 ms on 172.20.57.64 (executor 1) (9/10)
2019-12-20 14:44:22.552  INFO 1 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 6.0 in stage 0.0 (TID 6) in 3535 ms on 172.20.57.64 (executor 1) (10/10)
2019-12-20 14:44:22.553  INFO 1 --- [result-getter-1] o.a.spark.scheduler.TaskSchedulerImpl    : Removed TaskSet 0.0, whose tasks have all completed, from pool 
2019-12-20 14:44:22.556  INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler  : ResultStage 0 (reduce at SparkPi.java:26) finished in 4.029 s
2019-12-20 14:44:22.561  INFO 1 --- [io-25001-exec-4] org.apache.spark.scheduler.DAGScheduler  : Job 0 finished: reduce at SparkPi.java:26, took 4.088867 s
Pi is roughly 3.142364
And here are the logs from running a job that uses the Spark Cassandra connector:

2019-12-20 14:53:43.303  INFO 1 --- [io-25001-exec-7] com.datastax.driver.core.ClockFactory    : Using native clock to generate timestamps.
2019-12-20 14:53:43.328  INFO 1 --- [io-25001-exec-7] com.datastax.driver.core.NettyUtil       : Found Netty's native epoll transport in the classpath, using it
12-20 14:53:43.376 ERROR 1 --- [io-25001-exec-7] o.a.c.c.C.[.[.[/].[dispatcherServlet]    : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is java.lang.NullPointerException] with root cause
java.lang.NullPointerException: null
    at com.datastax.driver.core.Cluster$Manager.close(Cluster.java:1627) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.driver.core.Cluster$Manager.access$200(Cluster.java:1318) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.driver.core.Cluster.closeAsync(Cluster.java:566) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.driver.core.Cluster.close(Cluster.java:578) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:167) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79) ~[spark-cassandra-connector_2.11-2.4.1.jar!/:2.4.1]
    ...
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_212]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_212]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_212]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_212]
    at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:190) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:138) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:892) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:797) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1039) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:645) ~[javax.servlet-api-4.0.1.jar!/:4.0.1]
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882) ~[spring-webmvc-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:750) ~[javax.servlet-api-4.0.1.jar!/:4.0.1]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:90) ~[spring-boot-actuator-2.1.5.RELEASE.jar!/:2.1.5.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:117) ~[spring-boot-actuator-2.1.5.RELEASE.jar!/:2.1.5.RELEASE]
    at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:106) ~[spring-boot-actuator-2.1.5.RELEASE.jar!/:2.1.5.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.1.7.RELEASE.jar!/:5.1.7.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200) ~[tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:836) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1747) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_212]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_212]
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.19.jar!/:9.0.19]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_212]
I believe the NullPointerException comes from how the error raised inside com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession is handled, rather than from the call itself.
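If the NPE is indeed thrown while the connector cleans up a half-initialized Cluster, one way to surface the underlying failure is to connect with the raw DataStax Java driver outside of Spark. This is only a sketch, assuming the same driver version on the classpath and a placeholder contact point:

```java
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public class DirectDriverCheck {
    public static void main(String[] args) {
        // Placeholder contact point; replace with your Cassandra host.
        String cassandraHost = args.length > 0 ? args[0] : "127.0.0.1";
        // try-with-resources: Cluster and Session both implement Closeable,
        // so any connection failure propagates as its real exception here
        // instead of being masked during cleanup.
        try (Cluster cluster = Cluster.builder()
                                      .addContactPoint(cassandraHost)
                                      .build();
             Session session = cluster.connect()) {
            System.out.println("Connected to: "
                    + cluster.getMetadata().getClusterName());
        }
    }
}
```

Running this from inside the Spring Boot app should reproduce whatever low-level error the connector is swallowing.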

Here is the code snippet that triggers the problem:

CassandraConnector connector = CassandraConnector.apply(sparkSession.sparkContext().conf());
Session session = connector.openSession();
I'm using spark-cassandra-connector_2.11:2.4.1.
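For what it's worth, obtaining a Session directly is rarely necessary; the connector manages sessions internally. A hedged sketch of a plain table read through the connector's Java API, against an already-configured SparkSession (the keyspace and table names are placeholders, not from the question):

```java
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

import com.datastax.spark.connector.japi.CassandraRow;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class CassandraReadSketch {
    // "my_keyspace" and "my_table" are placeholders.
    static long countRows(SparkSession sparkSession) {
        JavaSparkContext jsc =
                JavaSparkContext.fromSparkContext(sparkSession.sparkContext());
        // The connector opens and pools Cassandra sessions on its own;
        // no explicit openSession() call is needed for reads like this.
        JavaRDD<CassandraRow> rows = javaFunctions(jsc)
                .cassandraTable("my_keyspace", "my_table");
        return rows.count();
    }
}
```

If even this fails with the same NullPointerException, that would confirm the problem is in session creation itself rather than in how the session is obtained.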

Here is my session initialization:

   SparkSession sparkSession = SparkSession
            .builder()
            .appName(appName)
            .config("spark.master", sparkMasterURL)
            .config("spark.cores.max", maxCores)
            .config("spark.jars", jarFilename)
            .config("spark.cassandra.connection.keep_alive_ms", "100000")
            .config("spark.cassandra.connection.host", cassandraHost)
            .config("spark.cassandra.read.timeout_ms", "360000000")
            .getOrCreate();

Any ideas?

Why do you need to obtain the Session object directly? Is there anything Spark-specific about it?

Alex, you're right, I don't actually need direct access to the session. However, even when I skip that step, I see the same error later, as soon as the first Cassandra operation happens.

You need to provide more context: how the SparkSession is built, which version of the connector, which version of Spark, and so on. Without more information it's hard to say.

I'm using spark-cassandra-connector_2.11:2.4.1. Here is my initialization: SparkSession sparkSession = SparkSession.builder().appName(appName).config("spark.master", sparkMasterURL).config("spark.cores.max", maxCores).config("spark.jars", jarFilename).config("spark.cassandra.connection.keep_alive_ms", "100000").config("spark.cassandra.connection.host", cassandraHost).config("spark.cassandra.read.timeout_ms", "360000000").getOrCreate();