
Python: appending a new column produced by a UDF to an existing PySpark dataframe


I have the following example dataframe:

+-------+--------+--------+--------+
| data1 | data 2 | data 3 | data 4 |
+-------+--------+--------+--------+
|1      |abc     |abd     |3       |
+-------+--------+--------+--------+
|3      |abd     |abd     |3       |
+-------+--------+--------+--------+
|2      |abe     |abg     |2       |
+-------+--------+--------+--------+
As an example, I am applying a UDF that converts "data 4" to True if it is 3 and False if it is 2.

I used the following code to generate a separate dataframe containing the old and new values in their columns:

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

UDF = udf(converterFnc, StringType())
tempDF = mydata.select('data 4', UDF('data 4').alias('newdata 4'))
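The question never shows converterFnc itself. A minimal sketch of what it might look like, assuming (since the UDF is registered with StringType()) it returns the strings "True"/"False" rather than booleans:

```python
def converterFnc(value):
    # Hypothetical converter matching the behavior described in the question:
    # 3 -> "True", 2 -> "False". Registered with StringType(), so it returns
    # strings; any other value is left as null in the Spark column.
    if value == 3:
        return "True"
    if value == 2:
        return "False"
    return None

# Plain-Python check of the mapping, independent of Spark:
print([converterFnc(v) for v in (3, 3, 2)])  # ['True', 'True', 'False']
```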
and got the following dataframe:

+--------+-----------+
| data 4 | newdata 4 |
+--------+-----------+
| 3      | True      |
+--------+-----------+
| 2      | False     |
+--------+-----------+
I am trying to figure out how to merge this back into the original dataframe, but when I use join I run into a strange problem where all of the joined values are just the first value repeated across the whole dataframe.
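One likely cause (an assumption, since the join code itself is not shown): joining tempDF back on "data 4" joins on a non-unique key, so every left row matches every right row that shares the same value. A plain-Python sketch of that ambiguity, mirroring the example data:

```python
# Plain-Python sketch (no Spark) of an equi-join on the non-unique key "data 4".
mydata = [
    {"data1": 1, "data 2": "abc", "data 3": "abd", "data 4": 3},
    {"data1": 3, "data 2": "abd", "data 3": "abd", "data 4": 3},
    {"data1": 2, "data 2": "abe", "data 3": "abg", "data 4": 2},
]
# tempDF keeps one row per source row, so the key 3 appears twice on this side too.
tempDF = [{"data 4": r["data 4"], "newdata 4": r["data 4"] == 3} for r in mydata]

# An equi-join matches each left row with *every* right row sharing the key,
# so the two rows with data 4 == 3 each match twice:
joined = [{**l, **r} for l in mydata for r in tempDF if l["data 4"] == r["data 4"]]
print(len(joined))  # 5 rows, not the 3 you might expect
```

In Spark, one way to keep rows aligned through a split-and-rejoin is to attach a unique row id (e.g. with pyspark.sql.functions.monotonically_increasing_id) before splitting and join on that id instead; the answer below avoids the join entirely.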

My expected output:

+-------+--------+--------+--------+-----------+
| data1 | data 2 | data 3 | data 4 | newdata 4 |
+-------+--------+--------+--------+-----------+
|1      |abc     |abd     |3       | True      |
+-------+--------+--------+--------+-----------+
|3      |abd     |abd     |3       | True      |
+-------+--------+--------+--------+-----------+
|2      |abe     |abg     |2       | False     |
+-------+--------+--------+--------+-----------+

Thanks, everyone!

You can use withColumn with when/otherwise to create the new column without any join at all:

import pyspark.sql.functions as F

df.withColumn(
    "newdata 4",
    F.when(df["data 4"] == 3, True).when(df["data 4"] == 2, False)
).show()
+-----+------+------+------+---------+
|data1|data 2|data 3|data 4|newdata 4|
+-----+------+------+------+---------+
|    1|   abc|   abd|     3|     true|
|    3|   abd|   abd|     3|     true|
|    2|   abe|   abg|     2|    false|
+-----+------+------+------+---------+

Thanks! My conversion is a bit more complex than a simple True/False classifier, but with the withColumn approach I was able to get it working! — Cool, glad it helped!