Apache Spark: "not found: type SparkFlumeProtocol" and "EventBatch" errors when building Spark 1.6.2 on CentOS 7
I am trying to build Spark 1.6.2 on CentOS 7, but the build fails with the following errors:
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:45: not found: type SparkFlumeProtocol
[error] val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:70: not found: type EventBatch
[error] override def getEventBatch(n: Int): EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala:80: not found: type EventBatch
[error] def getEventBatch: EventBatch = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala:25: not found: type EventBatch
[error] def isErrorBatch(batch: EventBatch): Boolean = {
[error] ^
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:85: not found: type EventBatch
[error] new EventBatch("Spark sink has been stopped!", "", java.util.Collections.emptyList())
[error] ^
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
[error] /home/pateln16/spark-1.6.2/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala:86: not found: type SparkFlumeProtocol
[error] val responder = new SpecificResponder(classOf[SparkFlumeProtocol], handler.get)
I ran into the same problem on Spark 2.0.0. I believe the cause is that the file "external/flume-sink/src/main/avro/sparkflume.avdl" is not compiled properly. The problem can be solved as follows:
- Download Apache Avro. I downloaded the jar files to the folder "C:\Downloads\avro".
- Go to the folder "external/flume-sink/src/main/avro".
- Compile sparkflume.avdl to Java files:
java -jar C:\Downloads\avro\avro-tools-1.8.1.jar idl sparkflume.avdl > sparkflume.avpr
java -jar C:\Downloads\avro\avro-tools-1.8.1.jar compile -string protocol sparkflume.avpr ..\scala
- Recompile your project.
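Since the question is about building on CentOS 7, the same steps can be sketched as a small shell script with Linux paths. This is a sketch, not a verified recipe: the `SPARK_HOME` and `AVRO_TOOLS` locations are assumptions, and avro-tools-1.8.1.jar must already have been downloaded.

```shell
#!/bin/sh
# Sketch of the workaround on Linux. Assumes the Spark 1.6.2 source tree is
# at $SPARK_HOME and avro-tools-1.8.1.jar has been downloaded to ~/avro
# (both paths are assumptions; adjust them to your setup).
SPARK_HOME=${SPARK_HOME:-$HOME/spark-1.6.2}
AVRO_TOOLS=$HOME/avro/avro-tools-1.8.1.jar

cd "$SPARK_HOME/external/flume-sink/src/main/avro" || exit 1

# Step 1: turn the Avro IDL into a protocol (.avpr) file.
java -jar "$AVRO_TOOLS" idl sparkflume.avdl > sparkflume.avpr

# Step 2: generate the protocol classes (SparkFlumeProtocol, EventBatch, ...)
# next to the Scala sources, then rebuild Spark as usual.
java -jar "$AVRO_TOOLS" compile -string protocol sparkflume.avpr ../scala
```

After the generated classes are in place, rerunning the normal Spark build should resolve the "not found: type" errors in the flume-sink module.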