Ruby: nesting of 100 is too deep when converting CSV to JSON (JSON::NestingError)
I have read an article about this, but it didn't help. I have a 500,000-row CSV file that looks like this:
contract-number,amendment-number,award-date,contract-value,supplier-name,contracting-entity
W8486,0,2014-04-14,14326000,"COMPANY A","Office of Llama Supplies"
W8487,0,2014-04-10,150000,"COMPANY B","Foo Bar Dept"
W8488,2,2014-03-24,146000,"COMPANY C","Armed Forces"
W8488,1,2014-03-03,68000,"COMPANY C","Armed Forces"
W8488,0,2014-02-17,27760,"COMPANY C","Armed Forces"
W8489,0,2014-02-14,51000000,"COMPANY B","Dept of Magical Affairs"
Many contracts appear more than once. I want to write a Ruby script that converts my data into a JSON file, nesting contracts that share the same contract number under the same node, like this:
[{"W8486":
{0:
{
"award-date": 2014-04-14,
"contract-value": 14326000,
"supplier-name": "COMPANY A",
"contracting-entity": "Office of Llama Supplies"
}
}
},
{"W8487":
{0:
{
"award-date": 2014-04-10,
"contract-value": 150000,
"supplier-name": "COMPANY B",
"contracting-entity": "Foo Bar Dept"
}
}
},
{"W8488":
{2:
{
"award-date": 2014-03-24,
"contract-value": 146000,
"supplier-name": "COMPANY C",
"contracting-entity": "Armed Forces"
},
1:
{
"award-date": 2014-03-03,
"contract-value": 68000,
"supplier-name": "COMPANY C",
"contracting-entity": "Armed Forces"
},
0:
{
"award-date": 2014-02-17,
"contract-value": 27760,
"supplier-name": "COMPANY C",
"contracting-entity": "Armed Forces"
}
}
},
{"W8489":
{0:
{
"award-date": 2014-02-14,
"contract-value": 51000000,
"supplier-name": "COMPANY B",
"contracting-entity": "Dept of Magical Affairs"
}
}
}]
So far I have managed to iterate over the CSV with `CSV.foreach do |line|`, putting each row into a hash, and to check whether `line[0] == previousContractNumber`. But every time I write the JSON file, I get this error:
nesting of 100 is too deep (JSON::NestingError)
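(For context: the stdlib `json` generator raises this error whenever the structure being serialized is nested more than 100 levels deep, the default `max_nesting`. A flat hash of contracts should never get anywhere near that, so the error usually means the structure was built recursively by accident. A minimal reproduction, assuming only the stdlib `json` gem:)

```ruby
require "json"

# Build an array nested 101 levels deep - one past the default limit of 100.
deep = nil
101.times { deep = [deep] }

begin
  JSON.generate(deep)          # default max_nesting is 100, so this raises
rescue JSON::NestingError => e
  puts e.message
end

# The limit can be lifted, but a structure this deep is usually a bug:
JSON.generate(deep, max_nesting: false)  # succeeds
```

Passing `max_nesting: false` silences the error, but if the data is only contracts and amendments, the real fix is to stop the structure from growing deeper with every row.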
How can I get around this?
Thanks in advance.

Here is some code that should work:
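(The answer's code block was not preserved in this copy. The following is a reconstruction, not the accepted code: a minimal sketch of the grouping approach, using the sample data from the question inline. The point is that each contract number becomes one top-level key and each amendment nests exactly one level below it, so the structure is only two levels deep no matter how many rows there are:)

```ruby
require "csv"
require "json"

# Sample rows from the question; the real script would instead use
# CSV.foreach("contracts.csv", headers: true) over the 500K-row file.
csv_text = <<~ROWS
  contract-number,amendment-number,award-date,contract-value,supplier-name,contracting-entity
  W8486,0,2014-04-14,14326000,"COMPANY A","Office of Llama Supplies"
  W8487,0,2014-04-10,150000,"COMPANY B","Foo Bar Dept"
  W8488,2,2014-03-24,146000,"COMPANY C","Armed Forces"
  W8488,1,2014-03-03,68000,"COMPANY C","Armed Forces"
  W8488,0,2014-02-17,27760,"COMPANY C","Armed Forces"
  W8489,0,2014-02-14,51000000,"COMPANY B","Dept of Magical Affairs"
ROWS

# Hash with an auto-vivifying default: the first time a contract number
# is seen, it gets an empty hash to hold its amendments.
contracts = Hash.new { |h, k| h[k] = {} }

CSV.parse(csv_text, headers: true) do |row|
  contracts[row["contract-number"]][row["amendment-number"]] = {
    "award-date"         => row["award-date"],
    "contract-value"     => row["contract-value"].to_i,
    "supplier-name"      => row["supplier-name"],
    "contracting-entity" => row["contracting-entity"]
  }
end

puts JSON.pretty_generate(contracts)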
There may be a bug in the way you build your output; it would help if you could show your code. There are also significant scalability issues waiting for you. CSV is read line by line and scales well; JSON is usually read in as a single string, then parsed into objects and processed. Turning 500K rows into 500K objects can consume a lot of memory, and the script will slow down as memory is allocated, shuffled around, and possibly paged. There are SAX-like JSON parsers, so hopefully whoever consumes this output will use one to handle the incoming JSON stream.

Thanks so much, everyone! @Uri Agassi's solution worked. Your help will go a long way toward cleaning up this data! In the end I had 262K records, 128K of which share a contract number with another record. Running the script and writing the JSON file took 73 seconds.