
Python: How to modify a subset of rows based on a condition in PySpark

I am trying to convert all MPa values to Pa. The code I use in pandas is shown below. How can I translate this to PySpark?

file_df.loc[file_df['Unit'] == 'MPa', 'Value'] = file_df['Value'] * 1000000  # convert Value from MPa to Pa
file_df.loc[file_df['Unit'] == 'MPa', 'Unit'] = 'Pa'  # replace MPa with Pa

You can replicate these in-place assignments with when / otherwise, like so:

from pyspark.sql.functions import when, col, lit

m = sparkdf.Unit == 'MPa'
(sparkdf.withColumn("Value", when(m, col('Value')*1000).otherwise(col('Value')))
        .withColumn("Unit",  when(m, lit('Pa')).otherwise(col('Unit'))))

Small working example:

import pandas as pd

df = pd.DataFrame({'Unit':['MPa', 'MPb', 'MPc'],
                   'Value':[5, 4, 3]})

sparkdf = spark.createDataFrame(df)
m = sparkdf.Unit == 'MPa'

(sparkdf.withColumn("Value",  when(m, col('Value')*1000).otherwise(col('Value')))
        .withColumn("Unit",  when(m, lit('Pa')).otherwise(col('Unit')))).show()

+----+-------+
|Unit|  Value|
+----+-------+
|  Pa|5000000|
| MPb|      4|
| MPc|      3|
+----+-------+
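
If the data contains more than one unit to normalize, the same pattern extends by chaining when clauses. A minimal sketch, assuming (purely for illustration) that the frame may also contain kPa rows:

from pyspark.sql.functions import when, col, lit

# Illustrative only: 'kPa' rows and their factor are assumptions, not from the original data.
value = (when(col('Unit') == 'MPa', col('Value') * 1000000)   # MPa -> Pa
         .when(col('Unit') == 'kPa', col('Value') * 1000)     # kPa -> Pa
         .otherwise(col('Value')))                            # leave other rows untouched

unit = when(col('Unit').isin('MPa', 'kPa'), lit('Pa')).otherwise(col('Unit'))

sparkdf = sparkdf.withColumn('Value', value).withColumn('Unit', unit)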
