Programmatically starting a simple Spark session on Mac OS X


I'm trying to run something simple that used to be fairly easy to get working, but I'm running into problems with the code below (Spark versions: 2.4.4 / 2.4.3).

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Sampler extends App {
  SparkSession
    .builder()
    .config(
      new SparkConf()
        .setAppName(appName)
        .setMaster(master)
        .set("spark.ui.enabled", "false"))
    .getOrCreate()
}
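For reference, here is a self-contained version of the same builder pattern that compiles on its own. It is a minimal sketch: the object name `SamplerLocal` and the `appName`/`master` values are placeholders I've assumed, since the question's snippet leaves those variables undefined.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Minimal sketch; "sampler" and "local[*]" are assumed values,
// not taken from the original question.
object SamplerLocal extends App {
  val spark = SparkSession
    .builder()
    .config(
      new SparkConf()
        .setAppName("sampler")     // assumed app name
        .setMaster("local[*]")     // run locally on all cores
        .set("spark.ui.enabled", "false"))
    .getOrCreate()

  println(spark.version)  // sanity check that the session started
  spark.stop()
}
```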
I get this exception:

java.lang.NumberFormatException: For input string: "unknown"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:589)
    at java.lang.Long.valueOf(Long.java:803)
    at org.spark_project.jetty.util.Jetty.formatTimestamp(Jetty.java:89)
    at org.spark_project.jetty.util.Jetty.<clinit>(Jetty.java:61)
    at org.spark_project.jetty.server.Server.getVersion(Server.java:159)
    at org.spark_project.jetty.server.handler.ContextHandler.<clinit>(ContextHandler.java:128)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:143)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:130)
    at org.apache.spark.metrics.sink.MetricsServlet.getHandlers(MetricsServlet.scala:53)
    at org.apache.spark.metrics.MetricsSystem.$anonfun$getServletHandlers$2(MetricsSystem.scala:92)
    at scala.Option.map(Option.scala:230)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:92)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at SparkGen$.<init>(SparkGen.scala:27)
    at SparkGen$.<clinit>(SparkGen.scala)
    at com.wd.perf.collector.metamodel.gen.Sampler$.delayedEndpoint$com$wd$perf$collector$metamodel$gen$Sampler$1(Sampler.scala:10)
    at Sampler$delayedInit$body.apply(Sampler.scala:5)
    at scala.Function0.apply$mcV$sp(Function0.scala:39)
    at scala.Function0.apply$mcV$sp$(Function0.scala:39)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
    at scala.App.$anonfun$main$1$adapted(App.scala:80)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at scala.App.main(App.scala:80)
    at scala.App.main$(App.scala:78)
    at Sampler$.main(Sampler.scala:5)
    at Sampler.main(Sampler.scala)

The snippet above disables spark.ui, which seemed to work around this before, but no longer does. Any suggestions welcome.

This was driving me crazy too. I finally got it working by adding a logback.xml file to the resources directory:

<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="info">
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
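For the logback.xml above to take effect, the logback-classic binding has to be on the classpath alongside Spark. A hedged build.sbt sketch of what that might look like (the version numbers are illustrative assumptions; check Maven Central for current ones):

```scala
// build.sbt fragment; versions are illustrative, not prescriptive
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"       % "2.4.4",
  "ch.qos.logback"    % "logback-classic" % "1.2.3"
)
```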


However, I still don't know the root cause: Jetty is looking for a build property, timestamp, that doesn't exist, so it fails to convert the literal unknown to a Long. What does that have to do with the logging framework? No idea.
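The failing conversion can be reproduced in isolation. Per the stack trace, Jetty's formatTimestamp reads a build-timestamp string and hands it to Long.valueOf; when the real timestamp is missing, the placeholder string "unknown" reaches the parser and throws. This sketch only mirrors that final step, not how Jetty obtains the string:

```scala
// "unknown" stands in for the build-timestamp string Jetty failed to find.
val raw = "unknown"
try {
  java.lang.Long.valueOf(raw) // same call that fails inside Jetty.formatTimestamp
} catch {
  case e: NumberFormatException =>
    println(s"NumberFormatException: ${e.getMessage}")
}
```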

This solved my problem, thanks. @ZhangChen if this answer helped you, please upvote it.