
Apache Spark: How to use Spark testing jars in SBT

Tags: apache-spark, intellij-idea, sbt

I am creating a Spark 2.0.1 project and want to use the Spark testing jars in my SBT project.

build.sbt:

scalaVersion := "2.11.0"
val sparkVersion = "2.0.1"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion % "compile",
    "org.apache.spark" %% "spark-sql" % sparkVersion % "compile",
    "org.scalatest" %% "scalatest" % "2.2.6" % "test",
    "org.apache.spark" %% "spark-core" % sparkVersion % "test" classifier "tests",
    "org.apache.spark" %% "spark-sql" % sparkVersion % "test" classifier "tests",
    "org.apache.spark" %% "spark-catalyst" % sparkVersion % "test" classifier "tests"
)
My test code:

package facts

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions._
import org.apache.spark.sql.test.SharedSQLContext

class LoaderTest extends org.apache.spark.sql.QueryTest with SharedSQLContext {
  import testImplicits._

  test("function current_date") {
    val df1 = Seq((1, 2), (3, 1)).toDF("a", "b")
    // Rest of test code and assertion using checkAnswer method
  }
}
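
For readers unfamiliar with QueryTest: checkAnswer compares a DataFrame's result against a sequence of expected Rows. A hypothetical assertion for the snippet above, purely as an illustration and not the author's elided code, might look like this:

// Hypothetical sketch only; the author elided the real assertion.
// checkAnswer(df: => DataFrame, expectedAnswer: Seq[Row]) is inherited
// from org.apache.spark.sql.QueryTest.
import org.apache.spark.sql.Row

checkAnswer(
  df1.select($"a", $"b"),
  Seq(Row(1, 2), Row(3, 1))
)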
But when I try to run the tests with:

sbt clean test
it fails with the following errors:

[info] Compiling 1 Scala source to /tstprg/test/target/scala-2.11/test-classes...
[error] bad symbolic reference to org.apache.spark.sql.catalyst.expressions.PredicateHelper encountered in class file 'PlanTest.class'.
[error] Cannot access type PredicateHelper in package org.apache.spark.sql.catalyst.expressions. The current classpath may be
[error] missing a definition for org.apache.spark.sql.catalyst.expressions.PredicateHelper, or PlanTest.class may have been compiled against a version that's
[error] incompatible with the one found on the current classpath.
[error] /tstprg/test/src/test/scala/facts/LoaderTest.scala:7: illegal inheritance;
[error]  self-type facts.LoaderTest does not conform to org.apache.spark.sql.QueryTest's selftype org.apache.spark.sql.QueryTest
[error]     class LoaderTest extends org.apache.spark.sql.QueryTest with SharedSQLContext {
[error]                                                   ^
[error] /tstprg/test/src/test/scala/facts/LoaderTest.scala:7: illegal inheritance;
[error]  self-type facts.LoaderTest does not conform to org.apache.spark.sql.test.SharedSQLContext's selftype org.apache.spark.sql.test.SharedSQLContext
[error]     class LoaderTest extends org.apache.spark.sql.QueryTest with SharedSQLContext {
[error]                                                                  ^
[error] bad symbolic reference to org.apache.spark.sql.Encoder encountered in class file 'SQLImplicits.class'.
[error] Cannot access type Encoder in package org.apache.spark.sql. The current classpath may be
[error] missing a definition for org.apache.spark.sql.Encoder, or SQLImplicits.class may have been compiled against a version that's
[error] incompatible with the one found on the current classpath.
[error] /tstprg/test/src/test/scala/facts/LoaderTest.scala:11: bad symbolic reference to org.apache.spark.sql.catalyst.plans.logical encountered in class file 'SQLTestUtils.class'.
[error] Cannot access term logical in package org.apache.spark.sql.catalyst.plans. The current classpath may be
[error] missing a definition for org.apache.spark.sql.catalyst.plans.logical, or SQLTestUtils.class may have been compiled against a version that's
[error] incompatible with the one found on the current classpath.
[error]           val df1 = Seq((1, 2), (3, 1)).toDF("a", "b")
[error]                                         ^
[error] 5 errors found
[error] (test:compileIncremental) Compilation failed
Has anyone tried unit testing with SBT using Spark's testing jars and can help me figure out what I am missing?


Note: this test works fine when I run it from the IntelliJ IDE.

Try changing the scope of the dependencies marked as test, as shown below:

scalaVersion := "2.11.0"
val sparkVersion = "2.0.1"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.scalatest" %% "scalatest" % "2.2.6",
    "org.apache.spark" %% "spark-core" % sparkVersion ,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-catalyst" % sparkVersion
)

Or add "compile" as the scope.
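
A sketch of that "compile" variant, based on the question's module list rather than anything posted verbatim in this answer: keep the tests-classifier jars, but scope them to compile so they are visible on both the main and test classpaths.

scalaVersion := "2.11.0"
val sparkVersion = "2.0.1"

// Sketch: the tests-classifier jars pinned to the compile scope.
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.scalatest" %% "scalatest" % "2.2.6",
    "org.apache.spark" %% "spark-core" % sparkVersion % "compile" classifier "tests",
    "org.apache.spark" %% "spark-sql" % sparkVersion % "compile" classifier "tests",
    "org.apache.spark" %% "spark-catalyst" % sparkVersion % "compile" classifier "tests"
)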

Try using the scopes mentioned below:


version := "0.1"

scalaVersion := "2.11.11"
val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "test-sources",
  "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0",
  "org.scalatest" %% "scalatest" % "3.0.4" % "test",
  "org.typelevel" %% "cats-core" % "1.1.0",
  "org.typelevel" %% "cats-effect" % "1.0.0-RC2",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion % Provided exclude ("net.jpountz.lz4", "lz4"),
  "com.pusher" % "pusher-java-client" % "1.8.0") ```

Thanks. However, I tried setting the scope to "test" and "test->test" and still could not get it to run.

Added a detailed answer.

It does not work because we need the test jars of spark-core, spark-sql, and spark-catalyst.
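
Putting the thread together: a minimal combination for the original Spark 2.0.1 setup would be the second answer's scoping applied to the question's versions. This is a hedged synthesis of the two posts above, not something posted verbatim in the thread:

scalaVersion := "2.11.0"
val sparkVersion = "2.0.1"

// Synthesis sketch: main jars Provided, plus the tests-classifier jars of
// spark-core, spark-sql, AND spark-catalyst on the test classpath.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.scalatest" %% "scalatest" % "2.2.6" % Test
)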