Java: Unable to connect from a Spring Boot application to a remote standalone Apache Spark installed on a remote server (Java, Apache Spark)

I want to connect from a Spring Boot application running on my local machine to Apache Spark, which is set up on a remote server. The examples I found online all configure Spark on the local system and pass the installation path in spark.home.

I want a similar setup, but instead of spark.home pointing to the local system, the application should connect to the remote machine. Any help is appreciated.

Here is the code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class SpringSparkApplication {

    @Value("${spark.app.name}")
    private String appName;
    @Value("${spark.master}")
    private String masterUri;

    @Bean
    public SparkConf conf() {
        return new SparkConf().setAppName(appName).setMaster(masterUri);
    }

    @Bean
    public JavaSparkContext sc() {
        return new JavaSparkContext(conf());
    }

    public static void main(String[] args) {
        SpringApplication.run(SpringSparkApplication.class, args);
    }

}
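One common cause of a standalone master silently ignoring a driver is a mismatch between the Spark version or Scala build on the application's classpath and the one running on the cluster. As a hedged sketch only (assuming Maven, and the spark-core_2.12-2.4.0 artifact visible in the stack trace below), the dependency should pin exactly the version the remote master runs:

```xml
<!-- Sketch: version and Scala suffix (_2.12) must match the remote cluster exactly. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.0</version>
</dependency>
```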
The properties are:

spark.app.name=Ayush spark
spark.master=spark://192.x.x.50:7077
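One frequent pitfall in this topology is that the master accepts the driver's TCP connection but its replies never reach the driver, because the driver advertises a hostname the remote server cannot resolve. As a hedged sketch, the following Spark settings can help; the host value is a placeholder, and since the code above only reads spark.app.name and spark.master, these keys would additionally have to be applied to the SparkConf with .set(...):

```properties
# Hypothetical address: the local machine's IP as routable FROM the Spark master.
spark.driver.host=192.x.x.51
# Bind the driver's RPC endpoints on all interfaces (helps behind NAT or with multiple NICs).
spark.driver.bindAddress=0.0.0.0
```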

The error received is:

2020-10-14 16:07:26.098 INFO 25028 --- [main] org.apache.spark.ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://amaheshwari-pc:4040
2020-10-14 16:07:26.187 INFO 25028 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.X.X.50:7077...
2020-10-14 16:07:26.449 INFO 25028 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory: Successfully created connection to /192.X.X.50:7077 after 251 ms (0 ms spent in bootstraps)
2020-10-14 16:07:46.205 INFO 25028 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.X.X.50:7077...
2020-10-14 16:08:06.207 INFO 25028 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.X.X.50:7077...
2020-10-14 16:08:26.221 ERROR 25028 --- [on retry thread] o.a.s.s.c.StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
2020-10-14 16:08:26.222 WARN 25028 --- [main] o.a.s.s.c.StandaloneSchedulerBackend: Application ID is not initialized yet.
2020-10-14 16:08:26.230 INFO 25028 --- [p-spark-context] o.s.jetty.server.AbstractConnector: Stopped Spark@646427f7{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-10-14 16:08:26.231 INFO 25028 --- [p-spark-context] org.apache.spark.ui.SparkUI: Stopped Spark web UI at http://amaheshwari-pc:4040
2020-10-14 16:08:26.237 INFO 25028 --- [p-spark-context] o.a.s.s.c.StandaloneSchedulerBackend: Shutting down all executors
2020-10-14 16:08:26.238 INFO 25028 --- [main] org.apache.spark.util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56322.
2020-10-14 16:08:26.239 INFO 25028 --- [main] o.a.s.n.netty.NettyBlockTransferService: Server created on amaheshwari-pc:56322
2020-10-14 16:08:26.240 INFO 25028 --- [main] org.apache.spark.storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2020-10-14 16:08:26.240 INFO 25028 --- [er-event-loop-8] seGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
2020-10-14 16:08:26.243 WARN 25028 --- [r-event-loop-11] s.d.c.StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
2020-10-14 16:08:26.249 INFO 25028 --- [er-event-loop-5] o.a.s.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-10-14 16:08:26.256 INFO 25028 --- [p-spark-context] o.a.spark.storage.memory.MemoryStore: MemoryStore cleared
2020-10-14 16:08:26.257 INFO 25028 --- [p-spark-context] org.apache.spark.storage.BlockManager: BlockManager stopped
2020-10-14 16:08:26.262 INFO 25028 --- [main] o.a.spark.storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amaheshwari-pc.Xsolutions.com, 56322, None)
2020-10-14 16:08:26.262 INFO 25028 --- [p-spark-context] o.a.spark.storage.BlockManagerMaster: BlockManagerMaster stopped
2020-10-14 16:08:26.263 WARN 25028 --- [p-spark-context] org.apache.spark.metrics.MetricsSystem: Stopping a MetricsSystem that is not running
2020-10-14 16:08:26.263 ERROR 25028 --- [main] org.apache.spark.SparkContext: Error initializing SparkContext.

java.lang.NullPointerException: null
    at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:64) ~[spark-core_2.12-2.4.0.jar:2.4.0]
    at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:252) ~[spark-core_2.12-2.4.0.jar:2.4.0]
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) [spark-core_2.12-2.4.0.jar:2.4.0]
    at com.X.springspark.SpringSparkApplication.sc(SpringSparkApplication.java:25) [classes/:na]
    at com.X.springspark.SpringSparkApplication$$EnhancerBySpringCGLIB$$5abe7298.CGLIB$sc$1(<generated>) [classes/:na]
    at com.X.springspark.SpringSparkApplication$$EnhancerBySpringCGLIB$$5abe7298$$FastClassBySpringCGLIB$$81e62437.invoke(<generated>) [classes/:na]
    at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244) [spring-core-5.2.9.RELEASE.jar:5.2.9.RELEASE]
    at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:331) [spring-context-5.2.9.RELEASE.jar:5.2.9.RELEASE]
    at com.X.springspark.SpringSparkApplication$$EnhancerBySpringCGLIB$$5abe7298.sc(<generated>) [classes/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_261]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_261]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_261]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_261]
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) [spring-beans-5.2.9.RELEASE.jar:5.2.9.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:650) [spring-beans-5.2.9.RELEASE.jar:5.2.9.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:483) [spring-beans-5.2.9.RELEASE.jar:5.2.9.RELEASE]
    at org.springframework.beans.factory.support.AbstractAu
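The log shows that a raw TCP connection to /192.X.X.50:7077 was in fact created, so basic reachability is probably fine and the likelier suspects are a version mismatch or the master being unable to answer back to the driver. Still, when diagnosing this from scratch, a plain TCP check of the master port from the driver machine rules out firewalls quickly. A small sketch (the class name is hypothetical, and the address below is the masked one from the question):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class MasterReachability {

    // Attempts a plain TCP connection to host:port, returning true on success.
    public static boolean canReach(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            // Unknown host, refused connection, or timeout all count as unreachable.
            return false;
        }
    }

    public static void main(String[] args) {
        // Placeholder master address from the question; substitute your own.
        System.out.println(canReach("192.x.x.50", 7077, 2000));
    }
}
```

If this prints false, fix networking first; if it prints true (as the log here suggests it would), look at version alignment and at what hostname the driver advertises to the master.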