
Apache Spark: 'Column' object is not callable


I am trying to install Spark and run the commands given in the tutorial, but I get the following error:

P-MBP:spark-2.0.2-bin-hadoop2.4 prem$ ./bin/pyspark
Python 2.7.13 (default, Apr 4 2017, 08:44:49)
[GCC 4.2.1 Compatible Apple LLVM 7.0.2 (clang-700.1.81)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/09/12 17:26:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/

Using Python version 2.7.13 (default, Apr 4 2017 08:44:49)
SparkSession available as 'spark'.
>>> textFile = spark.read.text("README.md")
>>> textFile.count()
99
>>> textFile.first()
Row(value=u'# Apache Spark')
>>> linesWithSpark = textFile.filter(textFile.value.contains("Spark"))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'Column' object is not callable
>>> dir(textFile.value)
['__add__', '__and__', ..., '__weakref__', '_jc', 'alias', 'asc', 'bitwiseAND', 'bitwiseOR', 'bitwiseXOR', 'cast', 'desc', 'endswith', 'getField', 'getItem', 'isNotNull', 'isNull', 'isin', 'like', 'name', 'otherwise', 'over', 'rlike', 'startswith', 'substr', 'when']

The Column.contains method was added in Spark 2.2. You are using Spark 2.0.2, where it does not exist, so __getattr__ (the dot syntax) resolves contains as access to a nested field instead. That returns another Column, and calling a Column raises the TypeError shown above.
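To see why the error message says "not callable" rather than "no such attribute", here is a minimal pure-Python sketch of that __getattr__ fallback behavior (a hypothetical stand-in class, not PySpark's actual implementation):

```python
class Column:
    """Toy model of a Spark Column whose unknown attributes
    resolve to nested-field access instead of raising AttributeError."""

    def __init__(self, name):
        self.name = name

    def __getattr__(self, item):
        # Called only for attributes not found normally; returns
        # another Column representing the nested field, never a method.
        return Column("{}.{}".format(self.name, item))


col = Column("value")
nested = col.contains          # a Column named "value.contains", not a function
try:
    nested("Spark")            # Column defines no __call__, so this fails
except TypeError as e:
    err_msg = str(e)
print(err_msg)                 # 'Column' object is not callable
```

Because the attribute lookup silently succeeds, the failure only surfaces at the call site, which is exactly the traceback in the transcript.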

On Spark 2.0.x you can use a similar method such as like instead:

textFile.filter(textFile.value.like("%Spark%"))
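For reference, the SQL LIKE pattern %Spark% means "contains the substring Spark": % matches any run of characters and _ matches exactly one. A small pure-Python sketch of those semantics (the sql_like helper is illustrative only, not part of PySpark):

```python
import re


def sql_like(value, pattern):
    """Return True if `value` matches a SQL LIKE `pattern`
    (% = any run of characters, _ = exactly one character)."""
    parts = []
    for ch in pattern:
        if ch == "%":
            parts.append(".*")
        elif ch == "_":
            parts.append(".")
        else:
            parts.append(re.escape(ch))  # treat everything else literally
    return re.fullmatch("".join(parts), value) is not None


print(sql_like("# Apache Spark", "%Spark%"))  # True
print(sql_like("# Apache Flink", "%Spark%"))  # False
print(sql_like("Spark", "Spar_"))             # True
```

So filtering with value.like("%Spark%") selects the same rows the intended contains("Spark") call would have.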