
How to include the MySQL connector jar


I set up a database with MySQL, and now I want to connect it to Spark. I ran bin/sparkling-shell --jars mysql:mysql-connector-java:5.1.36 and got: Warning: Skip remote jar mysql:mysql-connector-java:5.1.36.

I downloaded mysql-connector-java-5.1.36.tar.gz and put it in /home/tong/sparkling-water, but it still doesn't work.

How do I include the JDBC jar file? I'm using Spark 1.4.1. Is there another way to connect MySQL and Spark?
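One thing worth noting before the answer: --jars expects a path to an actual jar file, while Maven coordinates such as mysql:mysql-connector-java:5.1.36 belong to the --packages flag, which resolves them from Maven Central. A minimal invocation along those lines (a sketch, not taken from the original thread):

bin/spark-shell --packages mysql:mysql-connector-java:5.1.36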

bin/spark-shell --driver-class-path /home/tg/sparkling-water/mysql-connector-java-5.1.36-bin.jar


val jdbcDF = sqlContext.load("jdbc", Map("url" -> "jdbc:mysql://localhost:3306/employee?user=tg&password=*******", "dbtable" -> "employees"))
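For what it's worth, Spark 1.4 also ships the DataFrameReader API, which expresses the same read as the load(...) call above. A minimal sketch, assuming the same employee database and the masked password from the answer:

// Equivalent to sqlContext.load("jdbc", Map(...)) above, using the
// DataFrameReader API introduced in Spark 1.4.
val jdbcDF = sqlContext.read
  .format("jdbc")  // the JDBC data source
  .option("url", "jdbc:mysql://localhost:3306/employee?user=tg&password=*******")
  .option("dbtable", "employees")  // table to expose as a DataFrame
  .load()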

Try

--jars path/to/jar/file

Tried it.. it doesn't work.

What do you mean by "it doesn't work"? It would help if you included your code and any error messages you see.

bin/sparkling-shell --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
tong@tong-VirtualBox:/usr/local/spark$ bin/sparkling-shell --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
bash: bin/sparkling-shell: No such file or directory
tong@tong-VirtualBox:/usr/local/spark$ bin/spark-shell --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/17 11:19:27 INFO SecurityManager: Changing view acls to: tong
15/09/17 11:19:27 INFO SecurityManager: Changing modify acls to: tong
15/09/17 11:19:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tong); users with modify permissions: Set(tong)
15/09/17 11:19:29 INFO HttpServer: Starting HTTP Server
15/09/17 11:19:29 INFO Utils: Successfully started service 'HTTP class server' on port 43333.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/17 11:19:43 WARN Utils: Your hostname, tong-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.23.36.82 instead (on interface eth0)
15/09/17 11:19:43 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/09/17 11:19:43 INFO SparkContext: Running Spark version 1.4.1
15/09/17 11:19:43 INFO SecurityManager: Changing view acls to: tong
15/09/17 11:19:43 INFO SecurityManager: Changing modify acls to: tong
15/09/17 11:19:43 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tong); users with modify permissions: Set(tong)
15/09/17 11:19:45 INFO Slf4jLogger: Slf4jLogger started
15/09/17 11:19:45 INFO Remoting: Starting remoting
15/09/17 11:19:46 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.23.36.82:35469]
15/09/17 11:19:46 INFO Utils: Successfully started service 'sparkDriver' on port 35469.
15/09/17 11:19:46 INFO SparkEnv: Registering MapOutputTracker
15/09/17 11:19:47 INFO SparkEnv: Registering BlockManagerMaster
15/09/17 11:19:47 INFO DiskBlockManager: Created local directory at /tmp/spark-f8f4de26-e607-416f-9fed-f37440bd3878/blockmgr-ed450a0c-5719-4721-98b7-fd6e4664a7d4
15/09/17 11:19:47 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/09/17 11:19:47 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f8f4de26-e607-416f-9fed-f37440bd3878/httpd-99f7d1e3-e6d8-4a06-8ae0-65d0fb76a038
15/09/17 11:19:47 INFO HttpServer: Starting HTTP Server
15/09/17 11:19:47 INFO Utils: Successfully started service 'HTTP file server' on port 51511.
15/09/17 11:19:47 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/17 11:19:48 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/17 11:19:48 INFO SparkUI: Started SparkUI at http://10.23.36.82:4040
15/09/17 11:19:48 INFO SparkContext: Added JAR file:/home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar at http://10.23.36.82:51511/jars/mysql-connector-java-5.1.36-bin.jar with timestamp 1442506788779
15/09/17 11:19:49 INFO Executor: Starting executor ID driver on host localhost
15/09/17 11:19:49 INFO Executor: Using REPL class URI: http://10.23.36.82:43333
15/09/17 11:19:50 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34826.
15/09/17 11:19:50 INFO NettyBlockTransferService: Server created on 34826
15/09/17 11:19:50 INFO BlockManagerMaster: Trying to register BlockManager
15/09/17 11:19:50 INFO BlockManagerMasterEndpoint: Registering block manager localhost:34826 with 267.3 MB RAM, BlockManagerId(driver, localhost, 34826)
15/09/17 11:19:50 INFO BlockManagerMaster: Registered BlockManager
15/09/17 11:19:51 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/09/17 11:19:54 INFO HiveContext: Initializing execution hive, version 0.13.1
15/09/17 11:19:55 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/09/17 11:19:55 INFO ObjectStore: ObjectStore, initialize called
15/09/17 11:19:56 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/09/17 11:19:56 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/09/17 11:19:56 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/17 11:19:57 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/17 11:20:04 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/09/17 11:20:04 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
15/09/17 11:20:06 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:06 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:14 INFO ObjectStore: Initialized ObjectStore
15/09/17 11:20:15 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/09/17 11:20:16 INFO HiveMetaStore: Added admin role in metastore
15/09/17 11:20:16 INFO HiveMetaStore: Added public role in metastore
15/09/17 11:20:17 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/09/17 11:20:18 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/09/17 11:20:18 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.