How do I add Spark to a Maven project in Eclipse?

I want to start a project in Eclipse using Maven. I have installed m2eclipse, and I have a working HelloWorld Java application in my Maven project.

I want to use the Spark framework, and I am following its setup directions. I have added the Spark repository to my pom.xml:

<repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
</repository>
How can I resolve this? I don't want to download Spark's jar file and put it in a local repository.

Here is my pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.myproject</groupId>
  <artifactId>Spark1</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>Spark1</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
  </repository>

  <dependencies>
<!--     (...) -->

    <dependency>
      <groupId>spark</groupId>
      <artifactId>spark</artifactId>
      <version>0.9.9.4-SNAPSHOT</version>
    </dependency>

  </dependencies>

</project>


You need to wrap the repository block in a repositories block:

<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>


I ran into the same problem: I had initially used a different repository URL for Spark, and then changed the URL in order to use an earlier version. The change did not seem to take effect until I also changed the repository id, so try changing the repository id.

It may be a bug in Maven, because running Maven from the console also failed to resolve the dependency until the id was updated.
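A related workaround (my suggestion, not part of the answer above): Maven caches snapshot metadata and failed lookups in the local repository, so before renaming the repository id it may be worth forcing an update check. `-U` (`--update-snapshots`) is a standard Maven flag; the purge path below is derived from the question's `spark:spark` coordinates:

```shell
# Force Maven to re-check remote repositories for updated
# snapshots and previously failed artifact downloads.
mvn -U clean compile

# Or purge the locally cached metadata for the artifact so the
# next build re-resolves it from the remote repository.
rm -rf ~/.m2/repository/spark/spark
mvn clean compile
```

If the artifact still cannot be resolved after a forced update, the problem is on the repository side (for example, the snapshot version no longer exists) rather than in the local cache.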

Nowadays, no repository needs to be added in order to load the Spark library.

You only need to add the dependency:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.6.0</version>
</dependency>
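Once that dependency resolves, a minimal route is enough to verify the setup. This is a sketch based on the documented Spark (sparkjava) 2.x API; the class name and explicit port are illustrative:

```java
import static spark.Spark.get;
import static spark.Spark.port;

// Minimal Spark (sparkjava) application: starts the embedded
// Jetty server and maps GET /hello to a plain-text response.
public class HelloSpark {
    public static void main(String[] args) {
        port(4567); // Spark's default port, shown explicitly
        get("/hello", (request, response) -> "Hello World");
    }
}
```

Run the class and open http://localhost:4567/hello in a browser; the server should respond with "Hello World".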


You need to put the repository tag inside a repositories tag, like this:

<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>


The failure is because 0.9.9.4-SNAPSHOT is not available. Below is the list of available snapshots; use one of them as required:

0.9.8-SNAPSHOT/   Sat May 21 21:54:23 UTC 2011
0.9.9-SNAPSHOT/   Mon May 23 10:57:38 UTC 2011
0.9.9.1-SNAPSHOT/ Thu May 26 09:47:03 UTC 2011
0.9.9.3-SNAPSHOT/ Thu Sep 01 07:53:59 UTC 2011

Thanks, Sankara Reddy

Recent releases of Spark (2.1 and later) only require the dependency defined in the pom.xml file:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.1</version>
</dependency>


A repository definition is no longer required.

Use this more recent dependency instead:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
Use this, and make sure you also change the Spark library to version 2.11.x in the Eclipse project build path:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark-scala</groupId>
    <artifactId>spark-scala</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>${project.artifactId}</name>
    <description>Spark in Scala</description>
    <inceptionYear>2010</inceptionYear>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.tools.version>2.10</scala.tools.version>
        <!-- Put the Scala version of the cluster -->
        <scala.version>2.10.4</scala.version>
    </properties>

    <!-- repository to add org.apache.spark -->
    <repositories>
        <repository>
            <id>cloudera-repo-releases</id>
            <url>https://repository.cloudera.com/artifactory/repo/</url>
        </repository>
    </repositories>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.2.1</version>
        </dependency>
    </dependencies>
</project>
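One caveat about the pom above (my observation, not part of the original answer): it sets scala.tools.version to 2.10 but depends on spark-core_2.11, so the Scala version of the Spark artifact and of the compiler can silently drift apart. Keying the artifact suffix to the property keeps them in sync; the snippet below is a sketch of that change:

```xml
<properties>
    <!-- Single place to change the Scala binary version -->
    <scala.tools.version>2.11</scala.tools.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <!-- Suffix follows the property, so Spark's Scala build
             always matches the compiler configured above -->
        <artifactId>spark-core_${scala.tools.version}</artifactId>
        <version>1.2.1</version>
    </dependency>
</dependencies>
```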


Can you show us your entire pom.xml? You can leave out the other dependencies, but I'd like to see its structure. Also make sure your Eclipse project actually has Maven enabled (right-click the project => Configure => Convert to Maven Project). @André I have added the pom.xml file to the description.

This actually worked for me... all I did was change the repository id.