How to join on binary fields in Scala?

Tags: scala, apache-spark, apache-spark-sql, apache-spark-1.6

In Scala/Spark, I am trying to do the following:

val portCalls_Ports = 
  portCalls.join(ports, portCalls("port_id") === ports("id"), "inner")
However, I get the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: 
     binary type expression port_id cannot be used in join conditions;
The fields are indeed of binary type:

root
 |-- id: binary (nullable = false)
 |-- port_id: binary (nullable = false)
     .
     .
     .

+--------------------+--------------------+
|                  id|             port_id|
+--------------------+--------------------+
|[FB 89 A0 FF AA 0...|[B2 B2 84 B9 52 2...|
The same error occurs for ports("id") as well.

I am using the following libraries:

scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
  // Spark dependencies
  "org.apache.spark" %% "spark-hive" % "1.6.2",
  "org.apache.spark" %% "spark-mllib" % "1.6.2",
  // Third-party libraries
  "postgresql" % "postgresql" % "9.1-901-1.jdbc4",
  "net.sf.jopt-simple" % "jopt-simple" % "5.0.3"
)
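If upgrading is an option, Spark 2.1.0 and later accept binary columns directly in join conditions, so the dependency section could simply be bumped; a sketch of the change (exact versions illustrative, and note Spark 2.x still targets Scala 2.11):

```scala
libraryDependencies ++= Seq(
  // Spark 2.1.0+ allows joining on binary columns directly
  "org.apache.spark" %% "spark-hive"  % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0",
  // Third-party libraries (unchanged)
  "postgresql" % "postgresql" % "9.1-901-1.jdbc4",
  "net.sf.jopt-simple" % "jopt-simple" % "5.0.3"
)
```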
Note that I read the database tables via JDBC.


What is the best way to solve this problem?

Prior to Spark 2.1.0, as far as I can tell, the best workaround is to use the base64 function to convert the binary columns to strings, and compare those:

import org.apache.spark.sql.functions._

val portCalls_Ports =
  portCalls.join(ports, base64(portCalls("port_id")) === base64(ports("id")), "inner")
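A side note on why a string encoding helps: at the JVM level, the values behind a BinaryType column are byte arrays, and byte arrays compare by reference rather than by content, whereas Base64 strings compare by value. A minimal pure-Scala sketch (no Spark involved) illustrating the difference:

```scala
import java.util.Base64

object BinaryCompareDemo extends App {
  // Two distinct byte arrays with identical contents
  val a: Array[Byte] = Array(0xFB.toByte, 0x89.toByte, 0xA0.toByte)
  val b: Array[Byte] = Array(0xFB.toByte, 0x89.toByte, 0xA0.toByte)

  // Arrays on the JVM compare by reference, not by content:
  println(a == b) // false

  // Base64-encoding each array yields ordinary strings that compare by value:
  val aStr = Base64.getEncoder.encodeToString(a)
  val bStr = Base64.getEncoder.encodeToString(b)
  println(aStr == bStr) // true
}
```

Encoding to Base64 (rather than, say, casting binary to string) is also safe for arbitrary bytes, since it never produces invalid character sequences.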

Binary type expressions can be used in join conditions as of Spark 2.1.0, but not in earlier versions.

I removed the jdbc tag, since this problem appears to be purely internal to Spark and unrelated to the use of JDBC.

Sorry, I edited the post to include the import; I recommend getting into the habit of adding this import to every DataFrame-related snippet ;)