Apache Spark: problem submitting an application to Spark with Java
When I run the Spark example, I get this error:

20/09/04 17:06:35 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

I am running this application locally; my Spark server is deployed at 47.111.185.105.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.FilterFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class WordCountApp {
    private static final Logger logger = LoggerFactory.getLogger(WordCountApp.class);

    public static void main(String[] args) {
        // Should be some file on your system
        String logFile = "src/main/resources/people.json";
        SparkConf conf = new SparkConf().set("spark.shuffle.service.enabled", "false")
                .set("spark.dynamicAllocation.enabled", "false")
                .set("spark.cores.max", "1")
                .set("spark.executor.instances", "2")
                .set("spark.executor.memory", "500m")
                .set("spark.executor.cores", "1")
                .setMaster("spark://47.111.185.105:7077");
                //.set("deploy-mode", "client");
        SparkSession spark = SparkSession.builder()
                .appName("Word Count Application")
                .config(conf)
                .getOrCreate();
        Dataset<String> logData = spark.read().textFile(logFile).cache();
        System.out.println("Spark version = " + spark.version());
        logger.info("Spark version = " + spark.version());
        long numAs = logData.filter((FilterFunction<String>) s -> s.contains("a")).count();
        long numBs = 3; //logData.filter((FilterFunction<String>) s -> s.contains("b")).count();
        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        logger.info("Lines with a: " + numAs + ", lines with b: " + numBs);
        spark.stop();
    }
}
- Package and upload the jar file; running the same job from spark-shell works fine
- Check that the cluster still has enough free resources
- Try restarting Spark
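The "package and upload the jar" advice can be expressed directly in the driver configuration. The sketch below shows one plausible way to do it, assuming a standalone cluster at the master URL from the question; the jar path and the `spark.driver.host` address are hypothetical placeholders, not values from the original post. It is a configuration sketch, not a definitive fix.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class WordCountAppSubmit {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                // Keep each request within what a single worker can actually offer;
                // asking for more memory or cores than any worker has is a common
                // cause of "Initial job has not accepted any resources".
                .set("spark.executor.memory", "500m")
                .set("spark.executor.cores", "1")
                .set("spark.cores.max", "1")
                // Ship the application jar to the executors. A driver running
                // outside the cluster must make its classes available to workers.
                // Path below is an assumption; point it at your built artifact.
                .setJars(new String[] {"target/word-count-app.jar"})
                // Executors connect back to the driver. If the driver runs on a
                // local machine behind NAT, set an address the workers can reach.
                // "YOUR_PUBLIC_IP" is a placeholder, not a value from the question.
                .set("spark.driver.host", "YOUR_PUBLIC_IP")
                .setMaster("spark://47.111.185.105:7077");

        SparkSession spark = SparkSession.builder()
                .appName("Word Count Application")
                .config(conf)
                .getOrCreate();
        // ... run the word-count job as in the question ...
        spark.stop();
    }
}
```

Alternatively, packaging the application and submitting it with `spark-submit` from a machine that can reach the workers avoids the driver-connectivity issue entirely, which is consistent with spark-shell working on the cluster itself.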