
Python: Error using XGBoostSageMakerEstimator from sagemaker_pyspark on an AWS EMR notebook

Tags: python, apache-spark, pyspark, amazon-emr, amazon-sagemaker

I am trying to use the SageMaker Python SDK with PySpark on an EMR (Jupyter) notebook. When I try to use XGBoostSageMakerEstimator as shown below,

from sagemaker_pyspark import IAMRole  # IAMRole wrapper for the SageMaker execution role
from sagemaker_pyspark.algorithms import XGBoostSageMakerEstimator

xgboost_estimator = XGBoostSageMakerEstimator(
    sagemakerRole=IAMRole(someRoleArn),  # someRoleArn: ARN of an IAM role SageMaker can assume
    trainingInstanceType='ml.m4.xlarge',
    trainingInstanceCount=1,
    endpointInstanceType='ml.m4.xlarge',
    endpointInitialInstanceCount=1)
I get the following errors, which I cannot find a way around:

Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.ScalaMap object at 0x7fd3d9e96240>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'ScalaMap' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.ScalaMap object at 0x7fd3d9e96240>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'ScalaMap' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.Option object at 0x7fd3d9e9d3c8>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'Option' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.Option object at 0x7fd3d9e9d128>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'Option' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.Option object at 0x7fd3d9e9d0f0>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'Option' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.Option object at 0x7fd3d9e9d080>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'Option' object has no attribute '_java_obj'
Exception ignored in: <bound method JavaWrapper.__del__ of <sagemaker_pyspark.wrapper.Option object at 0x7fd3d9e96ef0>>
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/ml/wrapper.py", line 40, in __del__
AttributeError: 'Option' object has no attribute '_java_obj'
Any help resolving this issue would be greatly appreciated.

Using:

  • An EMR cluster (emr-5.26.0) with Spark 2.4.3
  • An EMR notebook attached to the cluster
  • sagemaker_pyspark as pre-installed with emr-5.26.0 (a typical way of wiring it into the Spark session is sketched below)
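
For reference, this is roughly how sagemaker_pyspark is usually attached to a Spark session, following the pattern from the library's README; the original post does not show this step, so treat it as an assumed setup rather than the poster's exact code.

from pyspark.sql import SparkSession
import sagemaker_pyspark

# Put the sagemaker_pyspark jars on the driver classpath so the Scala-side
# wrappers (ScalaMap, Option, the estimators) can be instantiated from Python.
jars = ":".join(sagemaker_pyspark.classpath_jars())
spark = (SparkSession.builder
         .config("spark.driver.extraClassPath", jars)
         .getOrCreate())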

    • I ran into the same error. I believe sagemaker_pyspark is not compatible with Spark versions > 2.3.2 (source: ). I was able to confirm this with someone who has contributed to the project.

      I ran the code with Spark 2.3.2 and no longer see the exceptions.
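
      If it helps, here is a quick way to confirm which Spark version the notebook session is actually running before constructing the estimator (an illustrative check added here, not part of the original answer):

      # Print the session's Spark version; per the answer above, sagemaker_pyspark
      # is reported to work only with Spark versions up to 2.3.2.
      print(spark.version)

      major, minor = (int(x) for x in spark.version.split(".")[:2])
      if (major, minor) > (2, 3):
          print("sagemaker_pyspark may not be compatible with Spark", spark.version)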