Apache Spark: compile error in Spark (DataTypeConversions.scala) on IntelliJ when using Maven

As of July 30, 2014 I cannot compile Spark head in IntelliJ. Has anyone faced this issue / found a resolution?

Error:scalac: 
     while compiling: /d/funcs/sql/core/src/main/scala/org/apache/spark/sql/types/util/DataTypeConversions.scala
        during phase: jvm
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args: -classpath :/shared/jdk1.7.0_25/jre/classes:/home/steve/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
  last tree to typer: Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
              symbol: null
   symbol definition: null
                 tpe: Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
       symbol owners: 
      context owners: anonymous class anonfun$asScalaDataType$1 -> package util
== Enclosing template or block ==
Template( // val <local $anonfun>: <notype>, tree.tpe=org.apache.spark.sql.types.util.anonfun$asScalaDataType$1
  "scala.runtime.AbstractFunction1", "scala.Serializable" // parents
  ValDef(
    private
    "_"
    <tpt>
    <empty>
  )
  // 3 statements
  DefDef( // final def apply(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField
    <method> final <triedcooking>
    "apply"
    []
    // 1 parameter list
    ValDef( // javaStructField: org.apache.spark.sql.api.java.StructField
      <param> <synthetic> <triedcooking>
      "javaStructField"
      <tpt> // tree.tpe=org.apache.spark.sql.api.java.StructField
      <empty>
    )
    <tpt> // tree.tpe=org.apache.spark.sql.catalyst.types.StructField
    Apply( // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=org.apache.spark.sql.catalyst.types.StructField
      DataTypeConversions.this."asScalaStructField" // def asScalaStructField(javaStructField: org.apache.spark.sql.api.java.StructField): org.apache.spark.sql.catalyst.types.StructField in object DataTypeConversions, tree.tpe=(javaStructField: org.apache.spark.sql.api.java.StructField)org.apache.spark.sql.catalyst.types.StructField
      "javaStructField" // javaStructField: org.apache.spark.sql.api.java.StructField, tree.tpe=org.apache.spark.sql.api.java.StructField
    )
  )
  DefDef( // final def apply(v1: Object): Object
    <method> final <bridge>
    "apply"
    []
    <snip>
        DataTypeConversions$$anonfun$asScalaDataType$1.super."<init>" // def <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1, tree.tpe=()scala.runtime.AbstractFunction1
        Nil
      )
      ()
    )
  )
)
== Expanded type of tree ==
ConstantType(
  value = Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
)
uncaught exception during compilation: java.lang.AssertionError
According to:

try running:

sbt clean

I resorted to recursively removing all traces of IntelliJ:

find . -name \*.iml | xargs rm -f

and then starting again from the pom.xml in the root/parent directory. Things worked again.

The IntelliJ .iml files appear to get into some strange/corrupted state.
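The fix above boils down to wiping IntelliJ's cached module metadata and re-importing the project from Maven. A minimal sketch of that cleanup, run from the project root (the `.iml` and `.idea` names are IntelliJ's standard project-file conventions, not details from the original post):

```shell
# Delete every IntelliJ module file in the tree; stale .iml files can
# carry corrupted module state that survives an ordinary rebuild.
find . -name '*.iml' -print0 | xargs -0 rm -f

# Optionally also remove the project-level .idea directory (more drastic,
# loses run configurations and local IDE settings):
# rm -rf .idea

# Then re-import the project in IntelliJ from the root pom.xml and
# rebuild from a clean slate:
# mvn clean
```

Using `-print0`/`-0` keeps the pipeline safe if any path contains spaces.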

Comments:

If you were using sbt your suggestion would be great, but I am using Maven (in IJ as well), and in any case I invoke Maven "clean" before building the project. It still gives the same error. Note: building with Maven (without clean) worked until about ten days ago.

Maybe just try sbt clean and then go back to Maven? That is all I know at this point. Thanks for trying here.

Note: when using SparkBuild.scala instead of Maven, running "sbt clean" is a good way to fix the above problem. I will upvote this as "useful" for anyone who looks at this question in the future.

You probably mean "find . -name \*.iml | xargs rm" (the xargs was missing). Yes ;) thanks, updated.

mvn clean worked in my case. @kk1957 then your case is different from mine:

mvn clean

had already been executed on the command line, and a

Build | Rebuild

inside IJ as well. Not sure why this was closed; it seems like a perfectly clear question to me.