Docker Spark submit fails on Kubernetes (EKS) with "Invalid null input: name"

Tags: docker, apache-spark, kubernetes, amazon-eks

I am trying to run the Spark example Docker image on EKS. My Spark version is 3.0.
I created a spark service account and a role binding. When I submit the job, I get the following error:
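For reference, a service account and role binding of the kind mentioned above are commonly created with manifests like the following (the names `spark` and `spark-role`, the namespace `default`, and the `edit` cluster role are assumptions for illustration, not details taken from the question):

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: default
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: spark-role
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: edit
subjects:
  - kind: ServiceAccount
    name: spark
    namespace: default
```

This mirrors the one-liner suggested in the Spark-on-Kubernetes documentation, `kubectl create clusterrolebinding spark-role --clusterrole=edit --serviceaccount=default:spark`.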

2020-07-05T12:19:40.862635502Z Exception in thread "main" java.io.IOException: failure to login
2020-07-05T12:19:40.862756537Z     at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:841)
2020-07-05T12:19:40.862772672Z     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:777)
2020-07-05T12:19:40.862777401Z     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:650)
2020-07-05T12:19:40.862788327Z     at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2412)
2020-07-05T12:19:40.862792294Z     at scala.Option.getOrElse(Option.scala:189)
2020-07-05T12:19:40.8628321Z       at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2412)
2020-07-05T12:19:40.862836906Z     at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.configurePod(BasicDriverFeatureStep.scala:119)
2020-07-05T12:19:40.862907673Z     at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.$anonfun$buildFromFeatures$3(KubernetesDriverBuilder.scala:59)
2020-07-05T12:19:40.862917119Z     at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
2020-07-05T12:19:40.86294845Z      at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
2020-07-05T12:19:40.862964245Z     at scala.collection.immutable.List.foldLeft(List.scala:89)
2020-07-05T12:19:40.862979665Z     at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.buildFromFeatures(KubernetesDriverBuilder.scala:58)
2020-07-05T12:19:40.863055425Z     at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:98)
2020-07-05T12:19:40.863060434Z     at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:221)
2020-07-05T12:19:40.86309062Z      at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4$adapted(KubernetesClientApplication.scala:215)
2020-07-05T12:19:40.863103831Z     at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2539)
2020-07-05T12:19:40.863163804Z     at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:215)
2020-07-05T12:19:40.863168546Z     at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:188)
2020-07-05T12:19:40.863194449Z     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
2020-07-05T12:19:40.863218817Z     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
2020-07-05T12:19:40.863246594Z     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
2020-07-05T12:19:40.863252341Z     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
2020-07-05T12:19:40.863277236Z     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
2020-07-05T12:19:40.863314173Z     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
2020-07-05T12:19:40.863319847Z     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-07-05T12:19:40.863653699Z Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null input: name
2020-07-05T12:19:40.863660447Z     at com.sun.security.auth.UnixPrincipal.<init>(UnixPrincipal.java:71)
2020-07-05T12:19:40.863663683Z     at com.sun.security.auth.module.UnixLoginModule.login(UnixLoginModule.java:133)
2020-07-05T12:19:40.863667173Z     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2020-07-05T12:19:40.863670199Z     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2020-07-05T12:19:40.863673467Z     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2020-07-05T12:19:40.86367674Z      at java.lang.reflect.Method.invoke(Method.java:498)
2020-07-05T12:19:40.863680205Z     at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
2020-07-05T12:19:40.863683401Z     at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
2020-07-05T12:19:40.86368671Z      at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
2020-07-05T12:19:40.863689794Z     at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
2020-07-05T12:19:40.863693081Z     at java.security.AccessController.doPrivileged(Native Method)
2020-07-05T12:19:40.863696183Z     at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
2020-07-05T12:19:40.863698579Z     at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
2020-07-05T12:19:40.863700844Z     at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:815)
2020-07-05T12:19:40.86370333Z      at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:777)
2020-07-05T12:19:40.86370659Z      at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:650)
2020-07-05T12:19:40.8637098099Z    at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2412)
2020-07-05T12:19:40.863712847Z     at scala.Option.getOrElse(Option.scala:189)
2020-07-05T12:19:40.863716102Z     at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2412)
2020-07-05T12:19:40.863719273Z     at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.configurePod(BasicDriverFeatureStep.scala:119)
2020-07-05T12:19:40.86372651Z      at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.$anonfun$buildFromFeatures$3(KubernetesDriverBuilder.scala:59)
2020-07-05T12:19:40.863728947Z     at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
...
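What the Caused by frames amount to: UnixLoginModule asks the OS for the user name of the current UID, and the Spark image runs as a bare numeric UID (185 by default) that has no /etc/passwd entry, so the lookup returns nothing and UnixPrincipal throws "invalid null input: name". A minimal sketch of that lookup, using Python's pwd module to stand in for the JVM's native getpwuid call (UID 184921 is an arbitrary value assumed not to exist on the host):

```python
import pwd

def username_for_uid(uid):
    """Return the /etc/passwd name for a UID, or None if the UID is unlisted."""
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:
        # This is the situation inside the Spark container: the process UID
        # has no passwd entry, so there is no name to build a principal from.
        return None

# UID 0 always has an entry; an arbitrary high UID normally does not,
# which mirrors the failing container.
print(username_for_uid(0))       # 'root' on a standard Linux system
print(username_for_uid(184921))  # None -> the condition that breaks login
```

The JVM has no fallback for this case, which is why the failure surfaces only once the image is run as a non-root user without a matching passwd entry.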
The image was built with the Dockerfile that ships with the Spark distribution:

docker build -t spark:latest -f kubernetes/dockerfiles/spark/Dockerfile .

That Dockerfile switches to a non-root numeric user:

USER ${spark_uid}
The driver pod spec (abridged):

      containers:
        - name: spark-pi
          image: <registry>/spark-pi-3.0
          args: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit \
             --master k8s://https://10.100.0.1:443 \
             --deploy-mode cluster ..."
          ]

The fix is to define the SPARK_USER environment variable inside the container, for example:

export SPARK_USER=spark3

This works because Spark's Utils.getCurrentUserName (the scala.Option.getOrElse frame in the trace above) reads SPARK_USER first and only falls back to Hadoop's UserGroupInformation, so the failing OS user lookup is never attempted.
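A sketch of where the variable can live without rebuilding the image, reusing the (abridged) pod spec from the question; the value `spark3` is arbitrary, any non-empty name works:

```yaml
      containers:
        - name: spark-pi
          image: <registry>/spark-pi-3.0
          env:
            - name: SPARK_USER
              value: spark3
```

Alternatively, an `ENV SPARK_USER spark3` line can be added to the Dockerfile before `USER ${spark_uid}` so every container from the image gets the variable.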