Spark dataframe columns not divided correctly after join
I have 3 Spark dataframes that I want to join on the BAQ column, and then create 2 new columns from the divisions dtfAvgEnd("avg(AAG)") / dtfAvgWeek("avg(AAG)") and dtfAvgLong("avg(AAG)") / dtfAvgWeek("avg(AAG)").

This is what I get. Rati_long works, but Rati_End does not, and I don't understand what is going wrong:
scala> dtfRatiConsSing.
| show(20,false);
+----------------------+------------------+------------------+------------------+--------+------------------+
|BAQ |avg(AAG) |avg(AAG) |avg(AAG) |Rati_End|Rati_long |
+----------------------+------------------+------------------+------------------+--------+------------------+
|3310101041401034198668|147.66606060606063|58.360833333333346|121.46857142857142|1.0     |0.8225896386077629|
+----------------------+------------------+------------------+------------------+--------+------------------+
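A likely cause, sketched below with made-up sample data (only the dataframe names BAQ, AAG, dtfAvgWeek, dtfAvgEnd, and dtfAvgLong come from the question): after the two joins, the result carries three columns all named avg(AAG), so a reference by that name is ambiguous, and Rati_End can end up dividing a column by itself, which always yields 1.0:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.avg

val spark = SparkSession.builder().master("local[*]").appName("dup-cols").getOrCreate()
import spark.implicits._

// Made-up rows standing in for the three aggregations in the question.
val dtfAvgWeek = Seq(("3310101041401034198668", 147.67)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))
val dtfAvgEnd  = Seq(("3310101041401034198668", 58.36)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))
val dtfAvgLong = Seq(("3310101041401034198668", 121.47)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))

// After the two joins, the schema contains "avg(AAG)" THREE times,
// exactly as in the output above.
val joined = dtfAvgWeek.join(dtfAvgEnd, "BAQ").join(dtfAvgLong, "BAQ")
joined.printSchema()

// Selecting "avg(AAG)" by name is now ambiguous; depending on how the
// division is written, both sides of the ratio can resolve to the same
// column, which is how Rati_End becomes 1.0.
```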
I renamed the AVG columns and it worked.

I can't reproduce this with the data you provided; perhaps there is a typo in your code? As a better practice, consider renaming your columns so that there are no duplicate column names.
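The renaming fix can be sketched as follows; this is not the asker's original code, and the sample data is made up, but the column names AVGWeek, AVGEnd, and AVGLong match the fixed output shown below. Renaming each avg(AAG) column before joining makes every name in the joined schema unique, so the divisions are unambiguous:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{avg, col}

val spark = SparkSession.builder().master("local[*]").appName("rename-fix").getOrCreate()
import spark.implicits._

// Made-up rows standing in for the three aggregations in the question.
val dtfAvgWeek = Seq(("3310101041401034198668", 147.67)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))
val dtfAvgEnd  = Seq(("3310101041401034198668", 58.36)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))
val dtfAvgLong = Seq(("3310101041401034198668", 121.47)).toDF("BAQ", "AAG")
  .groupBy("BAQ").agg(avg("AAG"))

// Give each aggregate a unique name before joining, then the ratio
// columns reference exactly the intended operands.
val dtfRatiCons = dtfAvgWeek.withColumnRenamed("avg(AAG)", "AVGWeek")
  .join(dtfAvgEnd.withColumnRenamed("avg(AAG)", "AVGEnd"), "BAQ")
  .join(dtfAvgLong.withColumnRenamed("avg(AAG)", "AVGLong"), "BAQ")
  .withColumn("Rati_End",  col("AVGEnd")  / col("AVGWeek"))
  .withColumn("Rati_long", col("AVGLong") / col("AVGWeek"))
```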
scala> dtfRatiCons.filter("BAQ='3310101041401034198668'").show(10,false);
+----------------------+------------------+-----------------+------------------+------------------+------------------+
|BAQ |AVGWeek |AVGEnd |AVGLong |Rati_End |Rati_long |
+----------------------+------------------+-----------------+------------------+------------------+------------------+
|3310101041401034198668|147.66606060606063|58.36083333333334|121.46857142857142|0.3952217123813354|0.8225896386077629|
+----------------------+------------------+-----------------+------------------+------------------+------------------+