MongoDB: Can't connect to MongoDB through Spark


I am trying to read data from MongoDB through an Apache Spark master.

For this I am using three machines:

  • M1 - has a MongoDB instance running on it
  • M2 - runs a Spark master with the Mongo connector
  • M3 - runs a Python application that connects to the Spark master on M2
The application (M3) connects to the Spark master like this:

_sparkSession = SparkSession.builder.master(masterPath).appName(appName)\
    .config("spark.mongodb.input.uri", "mongodb://10.0.3.150/db1.data.coll")\
    .config("spark.mongodb.output.uri", "mongodb://10.0.3.150/db1.data.coll").getOrCreate()
The application (M3) then tries to read data from the database:

sqlContext = SQLContext(_sparkSession.sparkContext)
df = sqlContext.read.format("com.mongodb.spark.sql.DefaultSource").option("uri","mongodb://user:pass@10.0.3.150/db1.data?readPreference=primaryPreferred").load()
But it fails with this exception:

    py4j.protocol.Py4JJavaError: An error occurred while calling o56.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.mongodb.spark.sql.DefaultSource. Please find packages at http://spark.apache.org/third-party-projects.html
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:594)
        at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
        at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.mongodb.spark.sql.DefaultSource.DefaultSource
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579)
        at scala.util.Try.orElse(Try.scala:84)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:579)
        ... 16 more

Spark cannot find the com.mongodb.spark.sql.DefaultSource package, hence the error message.

Everything else looks fine; you just need to include the Mongo Spark connector package:

> $SPARK_HOME/bin/pyspark --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0
or make sure the jar files are on the right path (a sketch of that approach follows below).


Please make sure to check which Mongo Spark package version is required for your Spark version.
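
If you go the jar-file route instead of --packages, here is a minimal sketch of what that can look like; the master URL, jar paths, and versions are placeholders, and note that the connector's own dependency (the MongoDB Java driver jar) also has to be on the path:

from pyspark.sql import SparkSession

# Placeholder values - adjust to your cluster and to the jars you downloaded
masterPath = "spark://<M2-host>:7077"
jars = ",".join([
    "/opt/jars/mongo-spark-connector_2.11-2.2.0.jar",  # connector matching your Spark/Scala build
    "/opt/jars/mongo-java-driver-3.4.2.jar",           # MongoDB Java driver the connector depends on
])

_sparkSession = SparkSession.builder.master(masterPath).appName("myApp")\
    .config("spark.jars", jars)\
    .config("spark.mongodb.input.uri", "mongodb://10.0.3.150/db1.data.coll")\
    .config("spark.mongodb.output.uri", "mongodb://10.0.3.150/db1.data.coll")\
    .getOrCreate()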

I'm a pyspark user, and here is my code; it works:

MongoDB connection configuration in pyspark:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .master("local") \
    .config('spark.mongodb.input.uri', 'mongodb://user:password@ip.x.x.x:27017/database01.data.coll') \
    .config('spark.mongodb.output.uri', 'mongodb://user:password@ip.x.x.x:27017/database01.data.coll') \
    .config('spark.jars.packages', 'org.mongodb.spark:mongo-spark-connector_2.11:2.3.1') \
    .getOrCreate()

Read from MongoDB:

df01 = spark.read \
    .format("com.mongodb.spark.sql.DefaultSource") \
    .option("database", "database01") \
    .option("collection", "collection01") \
    .load()

Write to MongoDB:

df01.write.format("com.mongodb.spark.sql.DefaultSource") \
    .mode("overwrite") \
    .option("database", "database01") \
    .option("collection", "collection02") \
    .save()
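
As a quick sanity check after the read above (plain DataFrame calls, nothing connector-specific):

df01.printSchema()            # schema inferred from the MongoDB documents
print(df01.count())           # number of documents read from collection01
df01.show(5, truncate=False)  # preview the first few rows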

I had quite a hard time configuring the connection between Spark and Cosmos DB (MongoDB API), so I decided to post the code that worked for me as a contribution.

I am using Spark 2.4.0 through a Databricks notebook.

from pyspark.sql import SparkSession

# Connect to CosmosDB to write on the collection
userName = "userName"
primaryKey = "myReadAndWritePrimaryKey"
host = "ipAddress"
port = "10255"
database = "dbName"
collection = "collectionName"

# Structure the connection
connectionString = "mongodb://{0}:{1}@{2}:{3}/{4}.{5}?ssl=true&replicaSet=globaldb".format(userName, primaryKey, host, port, database, collection)

spark = SparkSession\
    .builder\
    .config('spark.mongodb.input.uri', connectionString)\
    .config('spark.mongodb.output.uri', connectionString)\
    .config('spark.jars.packages', 'org.mongodb.spark:mongo-spark-connector_2.11:2.3.1')\
    .getOrCreate()

# Reading from CosmosDB
df = spark.read\
    .format("com.mongodb.spark.sql.DefaultSource")\
    .option("uri", connectionString)\
    .option("database", database)\
    .option("collection", collection)\
    .load()

# Writing on CosmosDB (Appending new information without replacing documents)
dfToAppendOnCosmosDB.write.format("com.mongodb.spark.sql.DefaultSource")\
    .mode("append")\
    .option("uri", connectionString)\
    .option("replaceDocument", False)\
    .option("maxBatchSize", 100)\
    .option("database", database)\
    .option("collection", collection)\
    .save()

I found the options for configuring the connector here.

Thanks for the answer. As I mentioned, I run the application from a remote Python application, not through the PySpark shell. So, as a noob Python developer, I'll ask again: how do I run my application with the connector package? Or do I need to start the Spark master with the package?

Please update the question with more information on how you submit the Spark job and I'll update my answer.

I changed the way I use the Spark master: I start the master and its slaves, and then run spark-submit with the mongo-spark-connector package and the Python script. I think that is the recommended approach. Thanks all.

@Ross I have the same problem and can't seem to solve it. Any ideas?

When I ran into this problem, mongo-spark-connector_2.11-2.2.3.jar was missing from my $SPARK_HOME/jars (e.g. /usr/local/spark-2.2.2-bin-hadoop2.7/jars).

Check out this solution implemented in a Jupyter notebook: this should be the accepted answer.
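
For reference, this is roughly what the spark-submit invocation described in the comments above looks like; the master URL, connector version, and script name are placeholders:

> $SPARK_HOME/bin/spark-submit --master spark://<M2-host>:7077 --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0 my_app.py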
The spark.jars.packages option is documented in the Spark configuration documentation.