Apache Spark ClassNotFoundException: Failed to find data source
Tags: apache-spark, pyspark, apache-spark-sql, spark-structured-streaming

I am trying to configure a standalone Spark cluster to deploy some Python code that reads from a Kinesis stream.

I can start the cluster, browse to it locally, and even open a pyspark console. However, when I try to deploy the Python application through PyCharm, I get the following exception:
The jars for the packages stored in: /Users/mdebarros/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/3.0.0/libexec/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-streaming-kinesis-asl_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-87ba183e-afae-407d-b1f3-5e749ee2f279;1.0
confs: [default]
found org.apache.spark#spark-streaming-kinesis-asl_2.12;3.0.0 in central
found com.amazonaws#amazon-kinesis-client;1.12.0 in central
found com.amazonaws#aws-java-sdk-dynamodb;1.11.655 in central
found com.amazonaws#aws-java-sdk-s3;1.11.655 in central
found com.amazonaws#aws-java-sdk-kms;1.11.655 in central
found com.amazonaws#aws-java-sdk-core;1.11.655 in central
found commons-logging#commons-logging;1.1.3 in central
found org.apache.httpcomponents#httpclient;4.5.6 in central
found org.apache.httpcomponents#httpcore;4.4.12 in central
found commons-codec#commons-codec;1.10 in central
found software.amazon.ion#ion-java;1.0.2 in central
found com.fasterxml.jackson.core#jackson-databind;2.10.0 in central
found com.fasterxml.jackson.core#jackson-annotations;2.10.0 in central
found com.fasterxml.jackson.core#jackson-core;2.10.0 in central
found joda-time#joda-time;2.10.5 in central
found com.amazonaws#jmespath-java;1.11.655 in central
found com.amazonaws#aws-java-sdk-kinesis;1.11.655 in central
found com.amazonaws#aws-java-sdk-cloudwatch;1.11.655 in central
found com.google.protobuf#protobuf-java;2.5.0 in central
found org.apache.commons#commons-lang3;3.9 in central
found com.amazonaws#aws-java-sdk-sts;1.11.655 in central
found com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.10.0 in central
found org.spark-project.spark#unused;1.0.0 in central
:: resolution report :: resolve 400ms :: artifacts dl 10ms
:: modules in use:
com.amazonaws#amazon-kinesis-client;1.12.0 from central in [default]
com.amazonaws#aws-java-sdk-cloudwatch;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-core;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-dynamodb;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-kinesis;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-kms;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-s3;1.11.655 from central in [default]
com.amazonaws#aws-java-sdk-sts;1.11.655 from central in [default]
com.amazonaws#jmespath-java;1.11.655 from central in [default]
com.fasterxml.jackson.core#jackson-annotations;2.10.0 from central in [default]
com.fasterxml.jackson.core#jackson-core;2.10.0 from central in [default]
com.fasterxml.jackson.core#jackson-databind;2.10.0 from central in [default]
com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.10.0 from central in [default]
com.google.protobuf#protobuf-java;2.5.0 from central in [default]
commons-codec#commons-codec;1.10 from central in [default]
commons-logging#commons-logging;1.1.3 from central in [default]
joda-time#joda-time;2.10.5 from central in [default]
org.apache.commons#commons-lang3;3.9 from central in [default]
org.apache.httpcomponents#httpclient;4.5.6 from central in [default]
org.apache.httpcomponents#httpcore;4.4.12 from central in [default]
org.apache.spark#spark-streaming-kinesis-asl_2.12;3.0.0 from central in [default]
org.spark-project.spark#unused;1.0.0 from central in [default]
software.amazon.ion#ion-java;1.0.2 from central in [default]
:: evicted modules:
com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.6.7 by [com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.10.0] in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 24 | 0 | 0 | 1 || 23 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-87ba183e-afae-407d-b1f3-5e749ee2f279
confs: [default]
0 artifacts copied, 23 already retrieved (0kB/10ms)
20/08/27 14:35:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "/Users/croacoras/PycharmProjects/BODStreaming/app/structured_bods.py", line 175, in <module>
.option('awsSecretKey', '###############################')\
File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/pyspark/sql/streaming.py", line 420, in load
return self._df(self._jreader.load())
File "/Users/cracoras/.virtualenvs/BODStreaming/lib/python3.7/site-packages/py4j/java_gateway.py", line 1305, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/usr/local/Cellar/apache-spark/3.0.0/libexec/python/pyspark/sql/utils.py", line 131, in deco
return f(*a, **kw)
File "/Users/cracoras/.virtualenvs/BODStreaming/lib/python3.7/site-packages/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o39.load.
: java.lang.ClassNotFoundException: Failed to find data source: kinesis. Please find packages at http://spark.apache.org/third-party-projects.html
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:674)
at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: kinesis.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$5(DataSource.scala:648)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$4(DataSource.scala:648)
at scala.util.Failure.orElse(Try.scala:224)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:648)
... 12 more
Process finished with exit code 1
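The final "Caused by" frame hints at how Spark resolves the format name: when the short name "kinesis" is not registered by any DataSourceRegister on the classpath, lookupDataSource falls back to loading a class literally named kinesis.DefaultSource, which does not exist. The sketch below mimics that fallback in plain Python (the registry contents here are illustrative, not Spark's actual list):

```python
# Illustrative sketch of Spark's DataSource.lookupDataSource fallback.
# In Spark, built-in and connector short names are registered via
# META-INF/services (DataSourceRegister); unknown names fall back to
# trying to load a class called "<name>.DefaultSource".

REGISTERED_SOURCES = {"kafka", "rate", "socket", "parquet", "json", "csv"}

def lookup_data_source(name: str) -> str:
    if name in REGISTERED_SOURCES:
        return f"built-in provider for '{name}'"
    # Fallback: treat the name as a package and look for DefaultSource.
    fallback_class = f"{name}.DefaultSource"
    raise ModuleNotFoundError(
        f"Failed to find data source: {name} "
        f"(tried to load class {fallback_class})"
    )

# lookup_data_source("rate")     -> resolves to a registered provider
# lookup_data_source("kinesis")  -> raises, mirroring the traceback above
```

So the package resolving successfully in ivy does not by itself make a "kinesis" format available: as far as the 3.0.0 release goes, spark-streaming-kinesis-asl exposes the DStream API (KinesisInputDStream) rather than registering a Structured Streaming source named "kinesis", which would explain why the jar downloads fine but `format("kinesis")` still fails.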
In my Python project I have also installed pyspark 3.0.0.

The overall configuration is:

Spark: 3.0.0
Python: 3.7.6
Scala: 2.12
Thank you for your help.

Could you pass --packages org.apache.spark:spark-streaming-kinesis-asl_2.12:3.0.0-preview?

Hi @prabhakarredy, as mentioned, my existing configuration already includes the same package without the "preview" tag. However, I replaced it with the -preview version you suggested and got the same error.
/external/spark-streaming-kinesis-asl_2.12-3.0.0.jar
spark.jars.packages org.apache.spark:spark-streaming-kinesis-asl_2.12:3.0.0
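For reference, these are the two equivalent ways to attach the package (the spark.jars.packages line mirrors the one in the question; the spark-submit form is a sketch, with the script name taken from the traceback):

```
# spark-defaults.conf (as in the question)
spark.jars.packages  org.apache.spark:spark-streaming-kinesis-asl_2.12:3.0.0

# or, equivalently, on the command line:
# spark-submit \
#   --packages org.apache.spark:spark-streaming-kinesis-asl_2.12:3.0.0 \
#   structured_bods.py
```

Either way the dependency is resolved at submit time, so a successful ivy download (as shown in the log above) only confirms that the jar is on the classpath, not that it registers the data source name being requested.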