Apache Spark: how to install Spark as a daemon
I start Spark on two machines, one as the master and one as the slave (worker):
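The actual start commands are not shown in the question; presumably they are the standard scripts from Spark's sbin directory, using the install path and master URL that appear in the unit file below:

```shell
# On the master machine:
/usr/lib/spark/sbin/start-master.sh

# On the worker machine, pointing the worker at the master's spark:// URL:
/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
```

Both scripts launch the corresponding JVM process in the background and then exit, which is exactly what matters for the systemd problem described next.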
Then I created a systemd .service unit for each of them, but when I start them as services they do not stay running. Here is my systemctl status output:
● sparkslave.service - Spark Slave
   Loaded: loaded (/etc/systemd/system/sparkslave.service; enabled; vendor preset: enabled)
   Active: inactive (dead) since Mon 2019-12-09 07:30:22 EST; 55s ago
  Process: 31680 ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077 (code=exited, status=0/SUCCESS)
 Main PID: 31680 (code=exited, status=0/SUCCESS)

Dec 09 07:30:19 SparkSlave1 systemd[1]: Started Spark Slave.
Dec 09 07:30:19 SparkSlave1 start-slave.sh[31680]: starting org.apache.spark.deploy.worker.Worker, logging to /usr/lib/spark/logs/spark-spark-user-org.apache.spark.deploy.worker.Worker-1-SparkSlave1.out
Here is my sparkslave.service:
[Unit]
Description=Spark Slave
After=network.target
[Service]
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s
[Install]
WantedBy=multi-user.target
What is the problem?

The service Type has to be changed from the default simple to forking. start-slave.sh daemonizes the worker process and then exits, so under Type=simple systemd treats the script's exit as the service terminating and cleans it up:
[Service]
Type=forking
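Putting it together, the complete corrected unit would look like the following sketch (paths, user, and master URL are taken from the question; only the Type= line is new):

```ini
[Unit]
Description=Spark Slave
After=network.target

[Service]
# start-slave.sh forks the worker JVM and exits, so tell systemd
# to track the forked child rather than the launcher script.
Type=forking
User=spark-user
WorkingDirectory=/usr/lib/spark/sbin
ExecStart=/usr/lib/spark/sbin/start-slave.sh spark://172.16.3.90:7077
Restart=on-failure
RestartSec=10s

[Install]
WantedBy=multi-user.target
```

After editing the unit, run `systemctl daemon-reload` and then `systemctl restart sparkslave` for the change to take effect. As an alternative (an assumption, available in newer Spark releases), you can keep Type=simple and add `Environment=SPARK_NO_DAEMONIZE=true` so the launcher script runs the worker in the foreground instead of forking.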