Python Spark with Kafka direct streaming
I am trying to implement a Spark DirectStream for Kafka; here is the code:
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

con = SparkConf().setMaster("local[2]").setAppName("Streamer")
sc = SparkContext(conf=con)
ssc = StreamingContext(sc, 10)
ssc.checkpoint("checkpoint")
#consumer(ssc)
kstream = KafkaUtils.createDirectStream(ssc, kafkaParams = {"bootstrap_servers": 'localhost:9092'},topics = ['twitter'])
tweets = kstream.map(lambda line: line.decode('ascii'))
text = tweets.map(lambda line: line.split(','))
print text.collect()
When I run it, I get the following error. What is wrong?
Traceback (most recent call last):
File "/Users/vinayaka/Desktop/BigData/HW3/SentimentAnalyser/consumer.py", line 20, in <module>
main()
File "/Users/vinayaka/Desktop/BigData/HW3/SentimentAnalyser/consumer.py", line 14, in main
kstream = KafkaUtils.createDirectStream(ssc, kafkaParams = {"bootstrap_servers": 'localhost:9092'},topics = ['twitter'])
File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/lib/pyspark.zip/pyspark/streaming/kafka.py", line 130, in createDirectStream
File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o23.createDirectStreamWithoutMessageHandler.
: org.apache.spark.SparkException: Must specify metadata.broker.list or bootstrap.servers
at org.apache.spark.streaming.kafka.KafkaCluster$SimpleConsumerConfig$$anonfun$9.apply(KafkaCluster.scala:417)
at org.apache.spark.streaming.kafka.KafkaCluster$SimpleConsumerConfig$$anonfun$9.apply(KafkaCluster.scala:417)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.streaming.kafka.KafkaCluster$SimpleConsumerConfig$.apply(KafkaCluster.scala:417)
at org.apache.spark.streaming.kafka.KafkaCluster.config(KafkaCluster.scala:53)
at org.apache.spark.streaming.kafka.KafkaCluster.getPartitionMetadata(KafkaCluster.scala:130)
at org.apache.spark.streaming.kafka.KafkaCluster.getPartitions(KafkaCluster.scala:119)
at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:211)
at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createDirectStream(KafkaUtils.scala:720)
at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createDirectStreamWithoutMessageHandler(KafkaUtils.scala:688)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
Are you sure it is bootstrap_servers? I think it should be bootstrap.servers:
kstream = KafkaUtils.createDirectStream(ssc, ['twitter'], {"bootstrap.servers": 'localhost:9092'})
You can take a look for more details. Make sure to go to the Python tab. — I tried kafkaParams = {"bootstrap.servers": 'localhost:9092'} and kafkaParams = {"metadata.broker.list": 'localhost:9092'}. I still get the same error. — Have you tried kstream = KafkaUtils.createDirectStream(ssc, ['twitter'], {"bootstrap.servers": 'localhost:9092'})? I know they are the same, just in case.
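For reference, the exception comes from a key-name check on the Kafka parameters: only the dotted Kafka config keys are recognized, so the underscore spelling "bootstrap_servers" is treated as if no broker list were given at all. A minimal Python sketch of that behavior (validate_kafka_params is a hypothetical name, not Spark's actual implementation):

```python
def validate_kafka_params(kafka_params):
    """Raise if neither recognized broker key is present, mimicking
    the 'Must specify metadata.broker.list or bootstrap.servers' error."""
    if "metadata.broker.list" not in kafka_params and "bootstrap.servers" not in kafka_params:
        raise ValueError("Must specify metadata.broker.list or bootstrap.servers")
    return kafka_params

# The dotted key passes the check:
validate_kafka_params({"bootstrap.servers": "localhost:9092"})

# The underscore key does not, which reproduces the error in the question:
try:
    validate_kafka_params({"bootstrap_servers": "localhost:9092"})
except ValueError as e:
    print(e)
```

This is why renaming the key, rather than changing the value, is the fix suggested in the answer.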