
AngularJS: error using Zeppelin's z.put function with Spark and Kafka

Tags: angularjs, apache-spark, apache-kafka, pyspark, spark-streaming

I am trying to use Spark and Kafka from a Zeppelin notebook; the code is below.

%pyspark

import pyspark
from pyspark.streaming.kafka import KafkaUtils
from pyspark.streaming import StreamingContext

ssc = StreamingContext(sc,60)
broker = "127.0.0.1:9092"
directKafkaStream = KafkaUtils.createDirectStream(ssc, ["testlogs"],{"metadata.broker.list": broker})
lines = directKafkaStream.map(lambda x: x[1])


f1 = lines.filter(lambda x: x[2])
f2 = f1.map(lambda x: (x.split(",")[4]))
f3 = f2.map(lambda x: (x.split(":")[1]))
f4 = f3.map(lambda x: x)
f5 = f4.filter(lambda x: x == "test string")
f6 = f5.count()
f6.pprint() 
#Above prints correct count on the console


z.put('m0_count', str(f6))

ssc.start()
ssc.awaitTerminationOrTimeout(300)


%spark

z.angularBind("m0_count", z.get("m0_count"))


%angular

<html>
<h2>Table</h2>
    <hr />
    <div class="row">
        <div class="col-md-6"><center><h3>my count</h3>{{m0_count}}</center></div>
    </div>
    <br />
</html>
Instead of the count, my AngularJS table shows the following:

pyspark.streaming.dstream.TransformedDStream object at 0x7fdc797c3b90


Can someone tell me how to resolve this?

Try

f7 = str(f6)

then

print f7

and see what gets printed.

I now get that same error printed on the console. Not sure how to pass the value of f6 to z.put in Zeppelin. Please advise.
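
For what it's worth, here is a minimal sketch of one way around this, assuming the code above (the publish_count name is illustrative, not from the original post). f6 is a DStream, so str(f6) only produces its object repr; a concrete count exists only once a batch has been processed, so it has to be read inside foreachRDD, which runs on the driver every batch interval, and pushed to Zeppelin from there.

%pyspark

def publish_count(time, rdd):
    # rdd.count() returns a plain int for this batch;
    # store it in the ZeppelinContext so other paragraphs can read it
    z.put('m0_count', str(rdd.count()))

f5.foreachRDD(publish_count)

ssc.start()
ssc.awaitTerminationOrTimeout(300)

The %spark paragraph with z.angularBind("m0_count", z.get("m0_count")) would then need to be re-run after at least one batch has completed, so that a real value is bound for the {{m0_count}} placeholder in the %angular paragraph.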