
Hadoop Flume fails to put files into an S3 bucket


I'm running flume in single-node (test) mode; flume pulls messages from RabbitMQ and puts them into an Amazon S3 bucket.

The problem: Flume does pull from RabbitMQ, but the files never show up in the S3 bucket.

Technical details: I start flume as follows:

flume node -1 -c $FQDN':amqp("exchangeName=[exchange name]", "bindings=[binding name]", "host=127.0.0.1", "port=5672", "userName=[user]", "password=[pass]", "exchangeType=direct", "durableExchange=false", "queueName=[queue name]", "durableQueue=true", "exclusiveQueue=false", "autoDeleteQueue=false", "useMessageTimestamp=true")|collectorSink("s3n://[Amazon key]:[Amazon secret]@[path at S3]","server");' -s "$@" > "$log" 2>&1
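One quick way to rule out a credentials or bucket problem is to try the same key, secret, and path outside Flume. A minimal sketch using boto3 purely for illustration (the bucket name and object key below are placeholders, not values from the question):

    import boto3

    # Placeholder credentials and bucket -- substitute the same values used
    # in the collectorSink URL above.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="[Amazon key]",
        aws_secret_access_key="[Amazon secret]",
    )

    # If this PutObject succeeds, the credentials and bucket are fine and the
    # problem is on the Flume side.
    s3.put_object(
        Bucket="[bucket name]",
        Key="flume-test/hello.txt",
        Body=b"hello from a connectivity test",
    )
    print("upload succeeded")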
Flume log: after restarting flume and sending something through the relevant exchange and queue, the following lines appear in the flume log:

INFO com.cloudera.flume.handlers.hdfs.EscapedCustomDfsSink: Opening s3n://[key]:[secret]@[path at S3]

WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library not loaded

You won't believe it.

Flume can't handle Amazon keys that contain a "/", and the key I was using had one. Flume trips over it internally and simply never calls Amazon.
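This is consistent with how a "/" inside the secret would break the s3n:// URI that collectorSink is given: the slash ends the authority part of the URI early, so the credentials and bucket come out mangled. A small illustration using Python's standard URL parser as an analogy (this is not the actual Hadoop/Flume parsing code, and the keys below are made up):

    from urllib.parse import urlparse

    # Made-up credentials purely for illustration.
    access_key = "AKIAEXAMPLEKEY"
    secret_with_slash = "abc/defEXAMPLESECRET"  # note the '/'

    url = f"s3n://{access_key}:{secret_with_slash}@my-bucket/flume/events"
    parsed = urlparse(url)

    # The '/' inside the secret is treated as the start of the path, so the
    # authority (user:pass@host) is cut short and the bucket lands in the path:
    print(parsed.netloc)  # AKIAEXAMPLEKEY:abc
    print(parsed.path)    # /defEXAMPLESECRET@my-bucket/flume/events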

Solution:
Regenerate the Amazon key until you get one without a "/" in it.

Does Amazon keep a log of S3 operations, and can you turn the logging up to a more verbose level (debug, if possible) to see a more detailed error message?

Yes, Amazon has S3 logging and it is enabled. Nothing suspicious in there...

So do you actually see Flume's activity in those logs? Or does nothing show up at all? Because that would suggest it never even connects to S3.