
Spark SQL: summing multiple columns in a single query in Java

Tags: java, apache-spark, apache-spark-sql

I have more than 50 columns and I want to compute the sum of all of them using Spark SQL. I don't want to write out each column name by hand. How can I do this programmatically?

// Build df(c1) + df(c2) + ... over every column name programmatically.
val addNums = df.columns.map(c => df(c)).reduce(_ + _)

val sumDF = df.select(addNums.as("SumOfFifty"))
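The snippet above is Scala; since the question asks about Java, here is a minimal Java sketch of the same idea. It assumes an all-numeric DataFrame; the SparkSession setup, class name, and read path are placeholders, not part of the original question.

import static org.apache.spark.sql.functions.col;

import java.util.Arrays;

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SumAllColumns {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("SumAllColumns")
                .master("local[*]")
                .getOrCreate();

        // Placeholder input: assumes every column is numeric; adjust the source as needed.
        Dataset<Row> df = spark.read().parquet("/path/to/input");

        // Build col(c1).plus(col(c2)).plus(...) over every column name, without listing them by hand.
        Column total = Arrays.stream(df.columns())
                .map(name -> col(name))
                .reduce(Column::plus)
                .orElseThrow(() -> new IllegalStateException("DataFrame has no columns"));

        df.select(total.as("SumOfFifty")).show();

        spark.stop();
    }
}

The key point in both versions is the same: map each column name to a Column expression, then fold them together with addition so the whole sum is evaluated in a single select.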

Could you share sample data and the expected output?