Operations on Scala normalVectorRDD


I want to create an RDD[Vector] with my own mean and sigma, and I have done this:

val mean = Random.nextInt(100)
val sigma = 2
val data: RDD[Vector] = RandomRDDs.normalVectorRDD(sc, numRows = 180, numCols = 20).map(v => mean + sigma * v)
But I get the following error:

overloaded method value * with alternatives:
  (x: Double)Double <and>
  (x: Float)Float <and>
  (x: Long)Long <and>
  (x: Int)Int <and>
  (x: Char)Int <and>
  (x: Short)Int <and>
  (x: Byte)Int
 cannot be applied to (org.apache.spark.mllib.linalg.Vector)
      val data: RDD[Vector] = RandomRDDs.normalVectorRDD(sc, numRows = 180, numCols = 20).map(v => mean + sigma * v)

I don't understand this error, because in the Spark documentation they also do `RandomRDDs.normal(sc, n, p, seed).map(lambda v: mean + sigma * v)`.


The Spark doc snippet you quote refers to the `.normal()` method, which returns an `RDD[Double]` rather than an `RDD[Vector]`.

That version actually runs fine, because there `mean + sigma * v` is plain `Double` arithmetic. Your error comes from `sigma * v`: `Int` (and `Double`) define `*` only for numeric operands, and there is no overload that accepts a `Vector`.
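For reference, a minimal sketch of that `Double`-based variant (assuming the Scala analogue `RandomRDDs.normalRDD` and an existing `SparkContext` named `sc`):

```scala
import scala.util.Random
import org.apache.spark.mllib.random.RandomRDDs
import org.apache.spark.rdd.RDD

val mean  = Random.nextInt(100)
val sigma = 2

// normalRDD yields RDD[Double], so the shift-and-scale is ordinary
// arithmetic applied to each Double element — no Vector overload needed.
val doubles: RDD[Double] =
  RandomRDDs.normalRDD(sc, size = 180L).map(v => mean + sigma * v)
```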

If you need to apply the transformation to a `Vector`, do it element-wise:

val data0: RDD[Vector] =
  RandomRDDs.normalVectorRDD(spark.sparkContext, numRows = 180, numCols = 20)
    .map(v => Vectors.dense(v.toArray.map(x => mean + sigma * x)))
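To sanity-check the shifted data, one could inspect per-column statistics with mllib's `Statistics.colStats`. A hedged sketch, assuming a `SparkContext` named `sc` and illustrative values for `mean` and `sigma`:

```scala
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.random.RandomRDDs
import org.apache.spark.mllib.stat.Statistics
import org.apache.spark.rdd.RDD

val mean  = 42.0
val sigma = 2.0

// Shift and scale each component, rebuilding a Vector per row.
val shifted: RDD[Vector] =
  RandomRDDs.normalVectorRDD(sc, numRows = 10000, numCols = 5)
    .map(v => Vectors.dense(v.toArray.map(x => mean + sigma * x)))

// colStats returns a MultivariateStatisticalSummary; each column's mean
// should be close to 42 and its variance close to sigma^2 = 4.
val summary = Statistics.colStats(shifted)
println(summary.mean)
println(summary.variance)
```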