Maven build error (Scala + Spark): object apache is not a member of package org


I have read several posts about this issue on StackOverflow and other resources, but unfortunately none of them helped. Almost all of them describe the same problem with SBT rather than Maven.

I also checked the Scala/Spark compatibility notes in the Spark documentation, and it seems I have the right versions (Scala 2.11.8 + Spark 2.2.0).

I will describe my entire workflow, because I am not sure which information will help to identify the root cause.

This is the code I am trying to build.

pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>SparkWordCount</artifactId>
    <version>1.0-SNAPSHOT</version>
    <name>SparkWordCount</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>

        <plugins>
            <!-- mixed scala/java compile -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.3.1</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>
            <!-- for fatjar -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.1.0</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <finalName>uber-SparkWordCount-1.0-SNAPSHOT</finalName>
                    <appendAssemblyId>false</appendAssemblyId>
                </configuration>
                <executions>
                    <execution>
                        <id>assemble-all</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.2.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <mainClass>fully.qualified.MainClass</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>


As long as I simply launch the main() method, everything works fine:
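(The screenshot of that version is not reproduced here; given the project name it was presumably a plain word count on spark-core. A minimal sketch of that kind of program, a reconstruction rather than the original code:)

import org.apache.spark.{SparkConf, SparkContext}

object SparkWordCount {
  def main(args: Array[String]): Unit = {
    // Local mode; "local[*]" uses all available cores.
    val conf = new SparkConf().setAppName("SparkWordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // "input.txt" is a hypothetical path, for illustration only.
    val counts = sc.textFile("input.txt")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    sc.stop()
  }
}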

Then I tried to use a Spark SQL DataFrame:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.log4j.Logger
import org.apache.log4j.Level

object SparkWordCount {
  def main(args: Array[String]) {

    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .appName("SparkSessionZipsExample")
      .master("local")
      .getOrCreate()

    val airports: DataFrame = spark.read.csv("C:\\Users\\Евгений\\Desktop\\DATALEARN\\airports.csv")
    airports.show()
  }
}
This code throws an error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Logger
    at SparkWordCount$.main(SparkWordCount.scala:10)
    at SparkWordCount.main(SparkWordCount.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

Process finished with exit code 1
I am not sure exactly how this works, but I solved the problem by changing the run configuration and ticking the "Include dependencies with Provided scope" checkbox.
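As far as I understand, that checkbox simply puts the provided-scope jars back on the runtime classpath when running from the IDE. A Maven-side equivalent (a sketch, not something from my actual pom) would be to drive the scope from a property and flip it in a profile:

<!-- in <properties>: -->
<spark.scope>provided</spark.scope>

<!-- in each Spark dependency: -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>${spark.scope}</scope>
</dependency>

<!-- activated with "mvn -Plocal ..." to keep Spark on the runtime classpath: -->
<profiles>
    <profile>
        <id>local</id>
        <properties>
            <spark.scope>compile</spark.scope>
        </properties>
    </profile>
</profiles>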

After that, my Spark SQL code works fine as well:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
+---------+--------------------+-------------+-----+-------+--------+----------+
|      _c0|                 _c1|          _c2|  _c3|    _c4|     _c5|       _c6|
+---------+--------------------+-------------+-----+-------+--------+----------+
|IATA_CODE|             AIRPORT|         CITY|STATE|COUNTRY|LATITUDE| LONGITUDE|
|      ABE|Lehigh Valley Int...|    Allentown|   PA|    USA|40.65236| -75.44040|
|      ABI|Abilene Regional ...|      Abilene|   TX|    USA|32.41132| -99.68190|
|      ABQ|Albuquerque Inter...|  Albuquerque|   NM|    USA|35.04022|-106.60919|
|      ABR|Aberdeen Regional...|     Aberdeen|   SD|    USA|45.44906| -98.42183|
|      ABY|Southwest Georgia...|       Albany|   GA|    USA|31.53552| -84.19447|
|      ACK|Nantucket Memoria...|    Nantucket|   MA|    USA|41.25305| -70.06018|
|      ACT|Waco Regional Air...|         Waco|   TX|    USA|31.61129| -97.23052|
|      ACV|      Arcata Airport|Arcata/Eureka|   CA|    USA|40.97812|-124.10862|
|      ACY|Atlantic City Int...|Atlantic City|   NJ|    USA|39.45758| -74.57717|
|      ADK|        Adak Airport|         Adak|   AK|    USA|51.87796|-176.64603|
|      ADQ|      Kodiak Airport|       Kodiak|   AK|    USA|57.74997|-152.49386|
|      AEX|Alexandria Intern...|   Alexandria|   LA|    USA|31.32737| -92.54856|
|      AGS|Augusta Regional ...|      Augusta|   GA|    USA|33.36996| -81.96450|
|      AKN| King Salmon Airport|  King Salmon|   AK|    USA|58.67680|-156.64922|
|      ALB|Albany Internatio...|       Albany|   NY|    USA|42.74812| -73.80298|
|      ALO|Waterloo Regional...|     Waterloo|   IA|    USA|42.55708| -92.40034|
|      AMA|Rick Husband Amar...|     Amarillo|   TX|    USA|35.21937|-101.70593|
|      ANC|Ted Stevens Ancho...|    Anchorage|   AK|    USA|61.17432|-149.99619|
|      APN|Alpena County Reg...|       Alpena|   MI|    USA|45.07807| -83.56029|
+---------+--------------------+-------------+-----+-------+--------+----------+
only showing top 20 rows
However, when I try to execute the Maven "clean -> package" command, I get several errors, all of them about the "org.apache" package:

D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:3: error: object apache is not a member of package org
import org.apache.spark.sql.{DataFrame, SparkSession}
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:4: error: object apache is not a member of package org
import org.apache.log4j.Logger
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:5: error: object apache is not a member of package org
import org.apache.log4j.Level
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:10: error: not found: value Logger
    Logger.getLogger("org").setLevel(Level.ERROR)
Here are the details of my environment; some of them may be relevant:

  • I use IntelliJ IDEA

  • Windows 10

  • Java 8 is installed

  • Scala 2.11.8 is installed and working

  • Spark 2.2.0 is installed and works through the spark-shell

  • I downloaded WinUtils, created a "bin" folder in it, and copied winutils.exe for hadoop-2.7.1 into that "bin"

  • These are my HADOOP_HOME and JAVA_HOME environment variables

  • I have also set a SPARK2_HOME environment variable (not sure whether I actually need it)

  • And this is my Path

Previously I had %HADOOP_HOME% in the Path, but I removed it following the advice of a related StackOverflow thread.

Thanks in advance for your help.

UPDATE 1. My project structure:

UPDATE 2

In case it matters: I do not have a MAVEN_HOME environment variable, and therefore there is no MAVEN_HOME in my Path. I execute "clean -> package" through the IntelliJ IDEA Maven interface.
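(IntelliJ IDEA ships with a bundled Maven, so a missing MAVEN_HOME should not matter for the build itself. With a standalone Maven on the PATH, the terminal equivalent would simply be mvn clean package.)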

UPDATE 3

The list of libraries in Project Structure:

UPDATE 4. Information about the Scala SDK:

Remove provided from the pom.xml file. Then go to the File tab, click the Invalidate Caches / Restart option, and try again.
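In other words, the dependency entries from the pom above would become just:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>

Note that without provided the Spark jars will also end up inside the fat jar built by the assembly plugin; that is harmless for local runs, though usually unwanted when submitting to a real cluster.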


For the Maven issue, try the following command:

mvn clean scala:compile -DdisplayCmd=true -DrecompileMode=all package
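Roughly the same flags can also be made permanent in the plugin configuration instead of being passed on the command line; a sketch, with parameter names as documented for the scala-maven-plugin (worth double-checking against version 3.3.1):

<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.3.1</version>
    <configuration>
        <!-- force a full recompilation on every build -->
        <recompileMode>all</recompileMode>
        <!-- print the underlying scalac command line, useful for debugging -->
        <displayCmd>true</displayCmd>
    </configuration>
</plugin>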

I tried to remove it from the spark-core and spark-sql dependencies. Exactly the same result.

Also change this: fully.qualified.MainClass, to your actual main class.

I created the package com.jd2050.example and moved my class SparkWordCount.scala there, as com.jd2050.example.SparkWordCount. Then I created a Maven run configuration in the IntelliJ interface: clean scala:compile -DdisplayCmd=true -DrecompileMode=all package. I also did "Invalidate Caches / Restart" several times. Same result :(

Can you also post a picture of your project structure?

@Srinivas Sure, see the update.

Go to File -> Project Structure -> click on Libraries. Is it showing this picture?

Sorry, that is not it. Go to the top bar: File -> Project Structure -> Project Settings -> Libraries. Your Scala library has a different version; can you try adding the same version?