
Mongodb sqlContext.read.option(…).mongo() fails when authentication is required

I'm using SparkSession, SQLContext, and DataFrame. This code works:

import org.apache.spark.sql.{DataFrame, SQLContext, SparkSession}
import com.mongodb.spark.sql._ // adds the .mongo() method to DataFrameReader

val sparkSession: SparkSession = SparkSession.builder()
  .master("local")
  .appName("MongoSparkConnectorIntro")
  .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/nasa.eva2")
  .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/nasa.astronautTotals")
  .getOrCreate()
val sqlContext: SQLContext = sparkSession.sqlContext

val evadf: DataFrame = sqlContext.read.option("collection", "eva2").mongo()
evadf.printSchema()
and it prints the schema.

But when I point the same code, with a different URI, at a MongoDB deployment that requires authentication, it fails:

val sparkSession: SparkSession = SparkSession.builder()
  .master("local")
  .appName("MongoSparkConnectorIntro")
  .config("spark.mongodb.input.uri", "mongodb://m103-admin:m103-pass@192.168.103.100:27000/nasa.eva2?authSource=admin")
  .config("spark.mongodb.output.uri", "mongodb://m103-admin:m103-pass@192.168.103.100:27000/nasa.astronautTotals?authSource=admin")
  .getOrCreate()
val sqlContext: SQLContext = sparkSession.sqlContext

val evadf: DataFrame = sqlContext.read.option("collection", "eva2").mongo()
evadf.printSchema()
Why?
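For what it's worth, here is a minimal sketch of a workaround I would try, passing the full authenticated URI explicitly per read through the connector's ReadConfig and MongoSpark.load, instead of relying on the session-level spark.mongodb.input.uri. The exact option keys ("uri", "database", "collection") are my assumption based on the connector's documented configuration names:

import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.ReadConfig
import org.apache.spark.sql.DataFrame

// Sketch: supply the credentials and authSource for this read only.
// Assumes ReadConfig accepts the connection string via the "uri" option
// together with explicit "database" and "collection" keys.
val readConfig: ReadConfig = ReadConfig(Map(
  "uri" -> "mongodb://m103-admin:m103-pass@192.168.103.100:27000/?authSource=admin",
  "database" -> "nasa",
  "collection" -> "eva2"
))
val evadf2: DataFrame = MongoSpark.load(sparkSession, readConfig)
evadf2.printSchema()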

I'm using the following dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver</artifactId>
    <version>3.6.3</version>
</dependency>
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.1</version>
</dependency>
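Since mongodb-driver 3.6.3 is already on the classpath, a quick sanity check outside Spark can confirm whether the credentials and authSource=admin are accepted at all. This is only a sketch; the ping command is my choice of a cheap round trip:

import com.mongodb.{MongoClient, MongoClientURI}
import org.bson.Document

// Connect with the same credentials and authSource, then force a round trip;
// an authentication problem should surface here (e.g. as a MongoSecurityException).
val uri = new MongoClientURI(
  "mongodb://m103-admin:m103-pass@192.168.103.100:27000/?authSource=admin")
val client = new MongoClient(uri)
try {
  println(client.getDatabase("nasa").runCommand(new Document("ping", 1)))
} finally {
  client.close()
}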


How does it fail? What error do you get?

I must be going mad. The code ran fine this morning.

Glad to hear you got it working.