Python PySpark - Conditional Statements


I am new to PySpark and was wondering if you could guide me on how to convert the following SAS code to PySpark.

SAS code:

If ColA > 0 Then Do;
    If ColB Not In ('B') and ColC <= 0 Then Do;
         New_Col = Sum(ColA, ColR, ColP);
    End;
    Else Do;
         New_Col = Sum(ColA + ColR);
    End;
End;
Else Do;
   If ColB Not in ('B') and ColC <= 0 then do;
     New_Col = Sum(ColR, ColP);
   end;
   Else Do;
     New_Col = ColR;
   End;
End;
Your code is as good as it needs to be; the conditions just have to be wrapped in parentheses, because Python's `&` binds more tightly than comparison operators such as `>` and `<=`:

from pyspark.sql import functions as F

df = (df
    .withColumn('New_Col', F
        .when((F.col('ColA') > 0) & (F.col('ColB').isin(['B']) == False) & (F.col('ColC') <= 0),
              F.col('ColA') + F.col('ColR') + F.col('ColP'))
        ...