Apache Spark: Error when trying to save and load a logistic regression model in PySpark

I have split my input data into train_df, test_df, and val_df. I trained my model on the train_df data and now want to save it and load it back.

My code:

from pyspark.ml.classification import LogisticRegression, LogisticRegressionModel
from pyspark.ml.evaluation import BinaryClassificationEvaluator

lr = LogisticRegression(maxIter=100)
lrModel = lr.fit(train_df)

predictions = lrModel.transform(val_df)

evaluator = BinaryClassificationEvaluator(rawPredictionCol="rawPrediction")
print("Prediction : \n")
print(evaluator.evaluate(predictions))

accuracy = predictions.filter(predictions.label == predictions.prediction).count() / float(val_df.count())
print("Accuracy : \n")
print(accuracy)

lrModel.write().save("/home/vijay18/spark-2.1.0-bin-hadoop2.7/python/lrModel")
model = LogisticRegressionModel()
model.load("/home/vijay18/spark-2.1.0-bin-hadoop2.7/python/lrModel")
This is the error I get on the terminal. The first three lines of the error come from saving the model; the rest come from loading it.

Error:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
18/07/17 20:04:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl

`load` is not meant to be called on an instance. It should be:

from pyspark.ml.classification import LogisticRegressionModel

LogisticRegressionModel.load(path)

Now the error is:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
18/07/16 20:41:56 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
Exception AttributeError: "'NoneType' object has no attribute 'detach'" ignored