Apache Spark org.apache.spark.SparkException: Invalid Spark URL: spark://HeartbeatReceiver@xxxx_LPT-324:51380 (PySpark)


I am trying to create a SparkConf with PySpark, but I get an error. I have also run "set SPARK_LOCAL_HOSTNAME=localhost". Can anyone help me?

Code:

from pyspark import SparkConf, SparkContext

# Run locally on all cores, bind the driver to localhost,
# and enable Arrow-based conversion for pandas interop
conf = (
    SparkConf()
    .setAppName("Test-1 ETL")
    .setMaster("local[*]")
    .set("spark.driver.host", "localhost")
    .set("spark.sql.execution.arrow.pyspark.enabled", "true")
)
sc = SparkContext(conf=conf)
Error:

org.apache.spark.SparkException: Invalid Spark URL: spark://HeartbeatReceiver@xxxx_LPT-324:51380
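A likely root cause (an assumption based on the error text, not confirmed by the post): Spark builds the HeartbeatReceiver RPC URL from the machine's hostname, and the name xxxx_LPT-324 contains an underscore, which Java's URI parser does not accept in a host. A minimal sketch of an RFC 1123 hostname-label check illustrates why that name is rejected while "localhost" is fine (is_valid_hostname_label is a hypothetical helper for illustration, not a Spark API):

```python
import re

# RFC 1123 hostname labels: letters, digits, and hyphens only;
# must start and end with a letter or digit; at most 63 characters.
LABEL_RE = re.compile(r"^[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?$")

def is_valid_hostname_label(label: str) -> bool:
    """Return True if `label` is a legal RFC 1123 hostname label."""
    return bool(LABEL_RE.match(label)) and len(label) <= 63

print(is_valid_hostname_label("xxxx_LPT-324"))  # False: "_" is not allowed
print(is_valid_hostname_label("localhost"))     # True
```

If that is the cause, setting os.environ["SPARK_LOCAL_HOSTNAME"] = "localhost" at the top of the Python script, before the SparkContext is created, is a commonly suggested workaround; a "set ..." issued in a separate shell session may never reach the process that launches the JVM.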