Curl Apache Spark REST API


I am invoking spark-submit with a log4j properties option, like this:

/opt/spark-1.6.2-bin-hadoop2.6/bin/spark-submit \
  --driver-java-options \
  "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties" \
  --class Test testing.jar
How can I submit this job through curl (Apache Spark's hidden REST API)?

I tried this:

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled": "true",
    "spark.eventLog.dir": "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'
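A payload this size is easy to break with a stray quote or comma. As a side note (not from the original post), one way to sanity-check the request body locally before POSTing it is to hold it in a variable and pipe it through a JSON parser; the payload below is a trimmed version of the one above, kept to the fields relevant to the question:

```shell
# Sketch: syntax-check the request body locally before sending it.
# (Trimmed copy of the payload above; host-ip and paths are placeholders.)
PAYLOAD='{
  "action" : "CreateSubmissionRequest",
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "mainClass" : "Test",
  "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.master" : "spark://host-ip:7077"
  }
}'
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"
# Once it parses, POST it:
# curl -X POST http://host-ip:6066/v1/submissions/create \
#      --header "Content-Type:application/json;charset=UTF-8" --data "$PAYLOAD"
```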
The job is submitted successfully and returns a response, but with an unknown field:

{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810210057-0091",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810210057-0091",
  "success" : true,
  "unknownFields" : [ "spark.driver.extraJavaOptions" ]
}
Note the "unknownFields" : [ "spark.driver.extraJavaOptions" ] entry in the response.

I also tried driverExtraJavaOptions instead:

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "driverExtraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled": "true",
    "spark.eventLog.dir": "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'
but got a similar response:

{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810211432-0094",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810211432-0094",
  "success" : true,
  "unknownFields" : [ "driverExtraJavaOptions" ]
}
Why is this happening?

I looked into this and found a reference.

It works now: put spark.driver.extraJavaOptions inside sparkProperties, with -Dlog4j.configuration=file:// followed by /path-to-the-local-file (i.e. file:/// for a local path):

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:///home/log4j-driver.properties",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled": "true",
    "spark.eventLog.dir": "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "client",
    "spark.master" : "spark://host-ip:7077"
  }
}'
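For completeness, the same standalone REST server also exposes status and kill endpoints alongside /v1/submissions/create, keyed by the submissionId returned in the create response. A minimal sketch, assuming the same host-ip and port as above and using the example submissionId from the earlier response:

```shell
# Sketch: the standalone REST server's other endpoints, keyed by the
# submissionId that /v1/submissions/create returned.
SUBMISSION_ID=driver-20160810210057-0091   # example id from the response above
STATUS_URL="http://host-ip:6066/v1/submissions/status/$SUBMISSION_ID"
KILL_URL="http://host-ip:6066/v1/submissions/kill/$SUBMISSION_ID"
echo "$STATUS_URL"
# curl "$STATUS_URL"         # GET: reports the driver state (RUNNING, FINISHED, ...)
# curl -X POST "$KILL_URL"   # POST: terminates the driver
```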

Is this on a standalone cluster? I'd like to use the Spark REST API, but it doesn't seem to work with YARN. — Yes, I'm running in standalone mode, thanks.