
How to convert nested Scala collections to nested Java collections


I'm running into a compile problem between Scala and Java.

My Java code expects a

java.util.Map<Double, java.lang.Iterable<Foo>>
and I get this compile error:

error: type mismatch;
found   : scala.collection.immutable.Map[scala.Double,Vector[Foo]]
required: java.util.Map[java.lang.Double,java.lang.Iterable[Foo]]

It seems that scala.collection.JavaConversions doesn't apply to nested collections, even though Vector can be converted implicitly to Iterable. Is there anything I can do to make the types work out, short of iterating over the Scala collection and doing the conversion by hand?
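For reference, here is a minimal sketch of the kind of call that triggers the mismatch, with a hypothetical consume method standing in for the Java side and Foo reduced to a simple case class:

case class Foo(n: Int)

// Hypothetical stand-in for the Java method that expects
// java.util.Map<Double, java.lang.Iterable<Foo>>
def consume(data: java.util.Map[java.lang.Double, java.lang.Iterable[Foo]]): Unit = ()

val scalaMap: Map[Double, Vector[Foo]] = Map(1.0 -> Vector(Foo(1)))

// Does not compile, even with scala.collection.JavaConversions._ in scope,
// because no implicit view reaches the nested value type:
// consume(scalaMap)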

scala.collection.JavaConversions ought to be deprecated. You're better off using scala.collection.JavaConverters, which makes it explicit when and where the conversion happens. In your case:

import scala.collection.JavaConverters._

type Foo = Int // Just to make it compile
val scalaMap = Map(1.0 -> Vector(1, 2)) // As an example

val javaMap = scalaMap.map { 
  case (d, v) => d -> v.toIterable.asJava
}.asJava
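If the Java signature really does require java.lang.Double keys, as the error message suggests, the keys can be boxed in the same pass. A sketch building on the scalaMap above (the explicit type annotation is only there to show what comes out):

val javaMap2: java.util.Map[java.lang.Double, java.lang.Iterable[Foo]] =
  scalaMap.map {
    case (d, v) => Double.box(d) -> (v.toIterable.asJava: java.lang.Iterable[Foo])
  }.asJava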

I wrote this generic function, which worked well enough for my needs:

def toJava(x: Any): Any = {
  import scala.collection.JavaConverters._
  x match {
    case y: scala.collection.MapLike[_, _, _] =>
      // Recursively convert keys and values, then wrap the map itself
      y.map { case (d, v) => toJava(d) -> toJava(v) }.asJava
    case y: scala.collection.SetLike[_, _] =>
      y.map { item: Any => toJava(item) }.asJava
    case y: Iterable[_] =>
      y.map { item: Any => toJava(item) }.asJava
    case y: Iterator[_] =>
      toJava(y.toIterable)
    case _ =>
      x
  }
}
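A usage sketch for this helper; since it returns Any, the caller has to cast back to the expected Java shape (the nested map below is purely illustrative):

val nested = Map("a" -> Set(1, 2), "b" -> Set(3))
val javaNested = toJava(nested).asInstanceOf[java.util.Map[String, java.util.Set[Int]]]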

This worked even better for my needs:

def toJava(m: Any): Any = {
  import java.util
  import scala.collection.JavaConverters._
  m match {
    // Recurse into map values, then wrap the map
    case sm: Map[_, _]   => sm.map(kv => (kv._1, toJava(kv._2))).asJava
    // Copy any Iterable into a concrete java.util.ArrayList
    case sl: Iterable[_] => new util.ArrayList(sl.map(toJava).asJava.asInstanceOf[util.Collection[_]])
    case _               => m
  }
}
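A similar usage sketch; the difference from the previous version is that every Iterable is copied into a concrete java.util.ArrayList, so the resulting lists are mutable rather than read-only wrappers (the cast is still unchecked because the function returns Any):

val nested = Map("a" -> Vector(1, 2), "b" -> Vector(3))
val javaNested = toJava(nested).asInstanceOf[java.util.Map[String, java.util.List[Int]]]
javaNested.get("a").add(99) // allowed: the value really is a java.util.ArrayList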

If anyone is looking for a solution in Spark Scala, try this approach.

import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema

Here, y is the nested wrapped array (WrappedArray):

y match {
  case x: WrappedArray[_] =>
    x.map {
      case z: GenericRowWithSchema => z.mkString(",")
      case z: Any                  => z
    }.asJavaCollection
  case _ => row.get(i).asInstanceOf[Object]
}
The code above does two things: 1) if the wrapped array holds a primitive data type, the case _ condition handles it; 2) if the wrapped array holds a complex data type (such as structs), the case GenericRowWithSchema branch is executed.
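For context, a sketch of where a snippet like this might sit, assuming y is row.get(i) for some org.apache.spark.sql.Row; the function name and wrapping are illustrative, not part of the original answer:

import scala.collection.JavaConverters._
import scala.collection.mutable.WrappedArray
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema

// Convert the i-th column of a Row into something a Java API can consume.
def columnToJava(row: Row, i: Int): Object = row.get(i) match {
  case x: WrappedArray[_] =>
    x.map {
      case z: GenericRowWithSchema => z.mkString(",") // struct elements become CSV strings
      case z: Any                  => z               // primitive elements pass through
    }.asJavaCollection
  case other => other.asInstanceOf[Object]
}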

All the other solutions here are Any to Any, which is pretty bad for a strongly typed language like Scala.
Below is a solution that preserves as much of the types as possible:

import java.{lang, util}
import scala.collection.JavaConverters._

trait AsJava[T, R] {
  def apply(o: T): R
}

object AsJava extends LowPriorityAsJava {
  implicit class RecursiveConverter[T](o: T) {
    def asJavaRecursive[R](implicit asJava: AsJava[T, R]): R = asJava(o)
  }

  implicit lazy val longAsJava: AsJava[Long, lang.Long] = new AsJava[Long, lang.Long] {
    def apply(o: Long): lang.Long = Long.box(o)
  }

  implicit lazy val intAsJava: AsJava[Int, lang.Integer] = new AsJava[Int, lang.Integer] {
    def apply(o: Int): lang.Integer = Int.box(o)
  }

  implicit lazy val doubleAsJava: AsJava[Double, lang.Double] = new AsJava[Double, lang.Double] {
    def apply(o: Double): lang.Double = Double.box(o)
  }

  implicit def mapAsJava[K, V, KR, VR](
      implicit
      keyAsJava: AsJava[K, KR],
      valueAsJava: AsJava[V, VR]
  ): AsJava[Map[K, V], util.Map[KR, VR]] =
    new AsJava[Map[K, V], util.Map[KR, VR]] {
      def apply(map: Map[K, V]): util.Map[KR, VR] =
        map.map { case (k, v) => (keyAsJava(k), valueAsJava(v)) }.asJava
    }

  implicit def seqAsJava[V, VR](implicit valueAsJava: AsJava[V, VR]): AsJava[Seq[V], util.List[VR]] =
    new AsJava[Seq[V], util.List[VR]] {
      def apply(seq: Seq[V]): util.List[VR] = seq.map(valueAsJava(_)).asJava
    }

  implicit def setAsJava[V, VR](implicit valueAsJava: AsJava[V, VR]): AsJava[Set[V], util.Set[VR]] =
    new AsJava[Set[V], util.Set[VR]] {
      def apply(set: Set[V]): util.Set[VR] = set.map(valueAsJava(_)).asJava
    }

  implicit lazy val anyAsJava: AsJava[Any, AnyRef] = new AsJava[Any, AnyRef] {
    def apply(o: Any): AnyRef = o match {
      case x: Map[Any, Any] => mapAsJava(anyAsJava, anyAsJava)(x)
      case x: Seq[Any]      => seqAsJava(anyAsJava)(x)
      case x: Set[Any]      => setAsJava(anyAsJava)(x)
      case x: Long          => longAsJava(x)
      case x: Int           => intAsJava(x)
      case x: Double        => doubleAsJava(x)
      case x                => x.asInstanceOf[AnyRef]
    }
  }
}

trait LowPriorityAsJava {
  implicit def otherAsJava[T]: AsJava[T, T] = new AsJava[T, T] {
    def apply(o: T): T = o
  }
}
Usage:

Seq(Seq.empty[Int]).asJavaRecursive
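A slightly larger sketch of what this buys you (the nested value is just an illustration; import AsJava._ brings the asJavaRecursive syntax into scope). The converter instances compose, so the result keeps specific Java types instead of collapsing to Any:

import java.{lang, util}
import AsJava._

val nested: Map[Int, Seq[Long]] = Map(1 -> Seq(2L, 3L))

// Resolved via mapAsJava + intAsJava + seqAsJava + longAsJava
val javaNested: util.Map[lang.Integer, util.List[lang.Long]] = nested.asJavaRecursive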

Thanks for the pointer to JavaConversions, I didn't know about those. I also had to do some nasty Java generics wildcarding for my method's return type to work out, e.g. declaring it as public Map createMap().

Nice! I also wonder whether you wrote a similar recursive toScala function...

The recursive toScala function I ended up writing is: def toScala(x: Any): Any = { import collection.JavaConversions._; x match { case y: java.util.Map[_, _] => mapAsScalaMap(y).map { case (d, v) => toScala(d) -> toScala(v) } case y: java.lang.Iterable[_] => iterableAsScalaIterable(y).toList.map { item: Any => toScala(item) } case y: java.util.Iterator[_] => toScala(y) case _ => x } } I posted a related question about it separately.