
Unable to resolve dependencies in Scala

Tags: scala, maven, pom.xml, mvn-repo

I'm having a hard time resolving this error:

[INFO] Scanning for projects...
[INFO] 
[INFO] -----------------< com.bosch.us.dm.test:isbn-encoder >------------------
[INFO] Building isbn-encoder 0.0.1-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/spark/spark-core_2.1/2.1.1/spark-core_2.1-2.1.1.pom
[WARNING] The POM for org.apache.spark:spark-core_2.1:jar:2.1.1 is missing, no dependency information available
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/spark/spark-sql_2.1/2.1.1/spark-sql_2.1-2.1.1.pom
[WARNING] The POM for org.apache.spark:spark-sql_2.1:jar:2.1.1 is missing, no dependency information available
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/spark/spark-core_2.1/2.1.1/spark-core_2.1-2.1.1.jar
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/spark/spark-sql_2.1/2.1.1/spark-sql_2.1-2.1.1.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  1.081 s
[INFO] Finished at: 2021-04-17T20:22:31-07:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project isbn-encoder: Could not resolve dependencies for project com.bosch.us.dm.test:isbn-encoder:jar:0.0.1-SNAPSHOT: The following artifacts could not be resolved: org.apache.spark:spark-core_2.1:jar:2.1.1, org.apache.spark:spark-sql_2.1:jar:2.1.1: Could not find artifact org.apache.spark:spark-core_2.1:jar:2.1.1 in central (https://repo.maven.apache.org/maven2) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
I'm compiling with Maven. I've tried changing the scala-maven-plugin version, but that didn't seem to help. I don't really understand the pom.xml file, so it may well be the source of my problem. The pom.xml is shown below:

<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <groupId>com.bosch.us.dm.test</groupId>
    <artifactId>isbn-encoder</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <spark.version>2.1.1</spark.version>
        <scala.dep.version>2.1</scala.dep.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.dep.version}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.dep.version}</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <!-- mixed scala/java compile -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.3.3</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.6.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

I've read some suggestions to delete the jar files, but I'm not sure where those files are located.

If memory serves, ~/.m2/repository is the default location of the dependency cache, so that is where downloaded jars end up. I see Scala 2.1 in there, but I think you want Scala 2.10 or Scala 2.11; if I'm not mistaken, those are the valid Scala versions for Spark 2.1.1. Maybe that solves the problem?
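For illustration, a minimal sketch of the fix, assuming the Scala 2.11 build of Spark 2.1.1 is the one wanted (only the scala.dep.version property needs to change; the dependency blocks can stay exactly as they are, since they already interpolate that property):

<!-- Sketch: point the Scala binary-version suffix at a build Spark 2.1.1
     was actually published for (2.11 assumed here; 2.10 would also work). -->
<properties>
    <spark.version>2.1.1</spark.version>
    <scala.dep.version>2.11</scala.dep.version>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

With that change the coordinates resolve to org.apache.spark:spark-core_2.11:2.1.1 and org.apache.spark:spark-sql_2.11:2.1.1, both of which exist on Maven Central. Any stale download attempts left under ~/.m2/repository/org/apache/spark/ can simply be deleted; Maven will re-fetch the artifacts on the next build.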