
Ruby on Rails: can't convert Array into String (tags: ruby-on-rails, ruby, json, chef-infra)


I have the following in my recipes/default.rb:

 data = data_bag_item('config','info')

 # download and unpack Hadoop 1.2.1
 bash "setup_hadoop" do
   user "#{node[:cluster][:user]}"
   group "#{node[:cluster][:user]}"
   cwd "#{node[:cluster][:home]}"
   code <<-EOF
     wget http://apache.mirror.gtcomm.net/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
     tar -xzvf hadoop-1.2.1.tar.gz
     mv hadoop-1.2.1 hadoop
   EOF
 end

 # render hadoop-env.sh with the cluster size read from the data bag
 template "#{node[:cluster][:hadoop]}/hadoop-env.sh" do
   source "hadoop-env.sh.erb"
   mode 0755
   owner "#{node[:cluster][:user]}"
   group "#{node[:cluster][:user]}"
   variables( :cluster_size => data['cluster_size'] )
 end
However, when I try to use the data bag, I get the following error:

[2014-05-05T14:35:45+00:00] INFO: Running start handlers
[2014-05-05T14:35:45+00:00] INFO: Start handlers complete.
[2014-05-05T14:35:45+00:00] ERROR: Failed to load data bag item: "config" "info"

================================================================================
Recipe Compile Error in /tmp/vagrant-chef-6/chef-solo-1/cookbooks/cluster/recipes/default.rb
================================================================================


TypeError
---------
can't convert Array into String


Cookbook Trace:
---------------
  /tmp/vagrant-chef-6/chef-solo-1/cookbooks/cluster/recipes/default.rb:10:in `from_file'


Relevant File Content:
----------------------
/tmp/vagrant-chef-6/chef-solo-1/cookbooks/cluster/recipes/default.rb:
    TypeError
    ---------
can't convert Array into String
 10>> data = data_bag_item('config','info')
 11:  
 12:  bash "setup_hadoop" do
 13:     user "#{node[:cluster][:user]}"
 14:     group "#{node[:cluster][:user]}"
 15:     cwd "#{node[:cluster][:home]}"
 16:     code <<-EOF
 17:     wget http://apache.mirror.gtcomm.net/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
 18:     tar -xzvf hadoop-1.2.1.tar.gz
 19:     mv hadoop-1.2.1 hadoop
 20:     variables( :cluster_size => data['cluster_size'] )

This looks correct to me. Can anyone help? I followed a tutorial to create the data bag.
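
For context, the item addressed by data_bag_item('config', 'info') is normally a plain JSON file. Under the conventional chef-solo layout that would be data_bags/config/info.json (the path here is an assumption); the values below simply mirror the knife output quoted at the end of the thread:

    {
      "id": "info",
      "cluster_size": 2
    }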

Why are you writing "#{node[:cluster][:user]}" instead of just node[:cluster][:user]? And what exactly did you run to trigger this error? @JustinWood I'm running vagrant up --provider=aws, but that's the code... As for your first question, I'm interpolating a variable into a string. Could you add the relevant part of the stack trace? It all stays on StackOverflow anyway, doesn't it? @Patru I'll do that right away.
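
As an aside to the first comment: the interpolation is harmless but redundant when the attribute is passed through unchanged. A minimal sketch of the two equivalent forms, using the node attributes from the question:

    bash "setup_hadoop" do
      # direct attribute reference; interpolation is only needed when the value
      # is embedded in a larger string, e.g. "#{node[:cluster][:home]}/hadoop"
      user  node[:cluster][:user]
      group node[:cluster][:user]
      cwd   node[:cluster][:home]
      code "echo running as #{node[:cluster][:user]}"
    end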
knife data bag show config info
cluster_size: 2
id:           info
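
Not necessarily the cause of the failure in this thread, but with chef-solo under Vagrant the data bag directory has to be wired up explicitly, otherwise data_bag_item cannot load the item from disk. A minimal sketch, assuming a data_bags/ directory next to the Vagrantfile (all paths and the recipe name are illustrative):

    # Vagrantfile fragment for the chef_solo provisioner
    Vagrant.configure("2") do |config|
      config.vm.provision "chef_solo" do |chef|
        chef.cookbooks_path = "cookbooks"
        chef.data_bags_path = "data_bags"   # directory containing config/info.json
        chef.add_recipe "cluster"
      end
    end

    # equivalent setting when invoking chef-solo directly, in solo.rb:
    # data_bag_path "/full/path/to/data_bags"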