Apache Spark: unable to instantiate Spark SessionState


I ran into this problem when submitting a Python program via spark-submit:

    df = (sc.binaryFiles(archives).map(lambda r: {'file_name': r[0], 'content' : bytearray(r[1])}).toDF(schema))
  File "/usr/hdp/current/spark2-client/python/lib/pyspark.zip/pyspark/sql/session.py", line 57, in toDF
  File "/usr/hdp/current/spark2-client/python/lib/pyspark.zip/pyspark/sql/session.py", line 526, in createDataFrame
  File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.3-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/usr/hdp/current/spark2-client/python/lib/pyspark.zip/pyspark/sql/utils.py", line 79, in deco
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.internal.SessionState':"
17/05/02 10:25:01 INFO SparkContext: Invoking stop() from shutdown hook
I am using Spark version 2.0.0.2.5.3.0-37, py4j version 0.10.3, and Python 2.7. Can you tell me how to fix this?
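For reference, the `map` step in the failing line just turns each `(file_name, content)` pair from `sc.binaryFiles()` into a dict before `toDF` is called; the error itself is raised later, when `toDF` tries to obtain a SessionState. A minimal plain-Python sketch of that transformation (no Spark needed; the file paths here are made-up placeholders):

```python
# Simulated output of sc.binaryFiles(archives): (path, bytes) pairs.
# The paths below are illustrative, not from the real cluster.
records = [
    ("hdfs:///archives/a.zip", b"\x50\x4b"),
    ("hdfs:///archives/b.zip", b"\x50\x4b"),
]

# Same lambda as in the question, applied without Spark:
rows = [{'file_name': r[0], 'content': bytearray(r[1])} for r in records]

print(rows[0]['file_name'])   # first file path
print(type(rows[0]['content']).__name__)
```

This shows the dicts are well-formed, so the failure is not in the mapping itself but in instantiating the session that `toDF` needs.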