
Scala UDF function to operate on an array column and return a custom value


I have two datasets like this:

val jsonStr = """{
    "TransactionId": 1,
    "TransactionName": "Name",
    "Order": 12,
    "ReplaceStrings": [
        "UNDEFINED", "INVALID"
    ],
    "Country": "China"
}"""

val configurations = spark.read.json(Seq(jsonStr).toDS) 
This dataset holds all my configurations and filters.
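
For reference, printing the schema is a quick way to confirm that ReplaceStrings comes in as an array of strings; for the JSON above I would expect roughly:

configurations.printSchema()
// root
//  |-- Country: string (nullable = true)
//  |-- Order: long (nullable = true)
//  |-- ReplaceStrings: array (nullable = true)
//  |    |-- element: string (containsNull = true)
//  |-- TransactionId: long (nullable = true)
//  |-- TransactionName: string (nullable = true)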

My data:

val data = Seq(
  (1, "Mindy", "Devaney", "mdevaney0@cnbc.com", "Female", "United States", "UTF-8"),
  (2, "Charmain", "Clear", "candriolli1@miitbeian.gov.cn", "Female", "**China**", "UTF-8"),
  (3, "Dilan", "**UNDEFINED**", "dphilipeaux2@jalbum.net", "Male", "**China**", "Windows-1252")
).toDF("id", "Fname", "LName", "mailid", "Gender", "Country", "Codepage")

Now my task is to join the configuration data, which carries the filters, with the data above and, where the filter applies to the country China, retrieve the corresponding rows; every LName whose value is UNDEFINED should be replaced with an empty string.

I tried to define this as a UDF, but I am stuck on how to pass the JSON value (the wrapped array) into it, or whether to use the Seq data type instead, roughly as sketched below.
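
As far as I understand, an ArrayType(StringType) column arrives in a Scala UDF as a Seq[String], so a minimal sketch of what I have in mind (the names here are placeholders, not a working solution for my case) would be:

import org.apache.spark.sql.functions.udf

// Placeholder UDF: blank out the name when it appears in the list of strings to replace
val replaceIfListed = udf { (name: String, toReplace: Seq[String]) =>
  if (toReplace != null && toReplace.contains(name)) "" else name
}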

If anyone has seen a similar case or has any ideas, please share them with me.

Check the code below.

scala> data.show(false)
+---+--------+-------------+----------------------------+------+-------------+------------+
|id |Fname   |LName        |mailid                      |Gender|Country      |Codepage    |
+---+--------+-------------+----------------------------+------+-------------+------------+
|1  |Mindy   |Devaney      |mdevaney0@cnbc.com          |Female|United States|UTF-8       |
|2  |Charmain|Clear        |candriolli1@miitbeian.gov.cn|Female|**China**    |UTF-8       |
|3  |Dilan   |**UNDEFINED**|dphilipeaux2@jalbum.net     |Male  |**China**    |Windows-1252|
+---+--------+-------------+----------------------------+------+-------------+------------+
scala> val check = udf((lname:String,replaceStrings:Seq[String]) => if(replaceStrings.map(d => s"**${d}**").contains(lname)) "" else lname )
scala> data.join(configurations,data("Country").contains(configurations("Country")),"inner").withColumn("LName",check($"LName",$"ReplaceStrings")).drop(configurations("Country")).show(false)
+---+--------+-----+----------------------------+------+---------+------------+-----+--------------------+-------------+---------------+
|id |Fname   |LName|mailid                      |Gender|Country  |Codepage    |Order|ReplaceStrings      |TransactionId|TransactionName|
+---+--------+-----+----------------------------+------+---------+------------+-----+--------------------+-------------+---------------+
|2  |Charmain|Clear|candriolli1@miitbeian.gov.cn|Female|**China**|UTF-8       |12   |[UNDEFINED, INVALID]|1            |Name           |
|3  |Dilan   |     |dphilipeaux2@jalbum.net     |Male  |**China**|Windows-1252|12   |[UNDEFINED, INVALID]|1            |Name           |
+---+--------+-----+----------------------------+------+---------+------------+-----+--------------------+-------------+---------------+
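
Note that the check UDF above assumes that neither LName nor ReplaceStrings is ever null; replaceStrings.map would throw a NullPointerException otherwise. If your data can contain nulls, a null-tolerant variant of the same check (a sketch, not tested against your data) could be:

// Same replacement logic, but keep the value unchanged when either input is null
val checkSafe = udf { (lname: String, replaceStrings: Seq[String]) =>
  if (lname != null && replaceStrings != null &&
      replaceStrings.map(d => s"**${d}**").contains(lname)) "" else lname
}

Swapping checkSafe for check in the join above should give the same result for the sample rows while leaving null LName values untouched.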