
Apache Spark: MapR Streams with PySpark


Does PySpark work with (is it compatible with) MapR Streams?

Is there any example code?

I tried, but I always get an exception.

strLoc   = '/Path1:Stream1'
protocol = 'file://' if (strLoc.startswith('/') or strLoc.startswith('\\')) else ''
from pyspark.streaming.kafka import KafkaUtils
from pyspark import StorageLevel
APA = KafkaUtils.createDirectStream(ssc, [strLoc], kafkaParams={
     "oracle.odi.prefer.dataserver.packages" : ""
    ,"key.deserializer" : "org.apache.kafka.common.serialization.StringDeserializer"
    ,"value.deserializer" : "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    ,"zookeeper.connect" : "maprdemo:5181"
    ,"metadata.broker.list" : "this.will.be.ignored:9092"
    ,"group.id" : "New_Mapping_2_Physical"}, fromOffsets=None, messageHandler=None)


Traceback (most recent call last):
  File "/tmp/New_Mapping_2_Physical.py", line 77, in <module>
    ,"group.id" : "New_Mapping_2_Physical"}, fromOffsets=None, messageHandler=None)
  File "/opt/mapr/spark/spark-1.6.1/python/lib/pyspark.zip/pyspark/streaming/kafka.py", line 152, in createDirectStream
py4j.protocol.Py4JJavaError: An error occurred while calling o58.createDirectStreamWithoutMessageHandler.
: org.apache.spark.SparkException: java.nio.channels.ClosedChannelException
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
    at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
    at scala.util.Either.fold(Either.scala:97)
    at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
    at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
    at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createDirectStream(KafkaUtils.scala:720)
    at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createDirectStreamWithoutMessageHandler(KafkaUtils.scala:688)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)
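As an aside, the protocol-detection ternary at the top of the snippet is plain Python and can be checked in isolation, independent of Spark. A minimal sketch (the helper name is hypothetical, extracted only to make the logic testable):

```python
def needs_file_protocol(loc):
    # Mirrors the ternary from the snippet: a leading forward or
    # back slash is treated as a path-style location that should be
    # prefixed with 'file://'; anything else gets no prefix.
    return 'file://' if (loc.startswith('/') or loc.startswith('\\')) else ''

print(needs_file_protocol('/Path1:Stream1'))   # file://
print(needs_file_protocol('maprdemo:5181'))    # (empty string)
```

Note that the `protocol` variable is never actually used in the call to `createDirectStream`, so it has no bearing on the exception itself.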

On Scala it seems to work fine, but not with PySpark.

I downloaded the latest version, and that solved the problem.

I checked PySpark's kafka.py and found it had been updated. The tag I was using was 1605; it is now 1611.
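If those tags really are year-month stamps (1605 for May 2016, 1611 for November 2016 is an assumption about the naming scheme, not something stated in the release), a plain string comparison is enough to tell whether an installed build predates the fixed one, since equal-length YYMM strings sort chronologically:

```python
def predates_fix(installed_tag, fixed_tag='1611'):
    # Equal-length YYMM-style tags sort lexicographically in the same
    # order as chronologically, so '<' doubles as an "older than" check.
    return installed_tag < fixed_tag

print(predates_fix('1605'))  # True  -> upgrade needed
print(predates_fix('1611'))  # False -> already on the fixed tag
```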