Neo4j LOAD CSV is very slow
I'm trying to load a CSV file into a Neo4j database hosted on GrapheneDB. It worked fine on the first file, which has 5,000 rows; that file took about 16 seconds to finish. I'm now importing a second file with the same schema and the same number of rows, but completely different data. The Cypher query has been running for over 30 minutes and still hasn't finished. I have no idea what it's doing or why it's so slow. Here is my Cypher:
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM 'http://example.com/some.csv' AS line
MATCH (c:Customer {customerID: line.CustomerID})
MERGE (c)<-[r:DEPT_OF]-(dept:Dept { name: line.Category})
ON CREATE
SET dept.name = line.Category, dept.deptID=line.DeptID, dept.createdDTS=1453478149463
MERGE (dept)<-[r1:IN_DEPT]-(pt:ProductType {name: dept.name})
ON CREATE
SET pt.name = dept.name, pt.packQty = line.PackQty, pt.createdDTS = 1453478149463,
pt.productTypeID = line.ProductTypeID
MERGE (pt)<-[r2:OF_TYPE]-(st:Style {name: line.Style})
ON CREATE
SET st.name = line.Style, st.styleID = line.StyleID, st.styleNum = line.StyleNo, st.price = line.Price
MERGE (st)<-[r3:OF_STYLE]-(p:Product {productNum: line.UPC})
ON CREATE
SET p.floorMin = line.MinFloor, p.floorMax = line.FloorMax, p.color = line.Color, p.createdDTS = 1453478149463,
p.size = line.Size, p.productID = line.ProductID;
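The most likely culprit is that every `MERGE` on an unindexed label/property pair forces a full label scan for each CSV row. A minimal fix would be to create schema indexes on each property the query looks up before re-running the import. This is a sketch using the Neo4j 2.x `CREATE INDEX ON` syntax (matching the era of this question); the labels and properties are taken from the query above:

```cypher
// One index per label/property used in a MATCH or MERGE lookup
CREATE INDEX ON :Customer(customerID);
CREATE INDEX ON :Dept(name);
CREATE INDEX ON :ProductType(name);
CREATE INDEX ON :Style(name);
CREATE INDEX ON :Product(productNum);
```

With these in place, each per-row lookup becomes an index seek instead of a scan over every node with that label.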
Update 1:
Following Nicole's answer, I added indexes. Here is my updated query:
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM '...' AS line
MATCH (c:Customer {customerID: line.CustomerID})
MERGE (c)<-[r:DEPT_OF]-(dept:Dept { name: line.Category })
ON CREATE
SET dept.name = dept.name, dept.deptID=line.categoryID, dept.createdDTS=1453742532269, dept.modifiedDTS = 1453742532269
MERGE (c)<-[r22:DEPT_OF]-(dept)
MERGE (dept)<-[r1:IN_DEPT]-(pt:ProductType {name: dept.name})
ON CREATE
SET pt.name = dept.name, pt.packQty = line.PackQty, pt.createdDTS = 1453742532269, pt.productTypeID = line.ProductTypeID, pt.modifiedDTS = 1453742532269
MERGE (c)<-[r2:DEPT_OF]-(dept)
MERGE (dept)<-[r3:IN_DEPT]-(pt)
MERGE (pt)<-[r4:OF_TYPE]-(st:Style {name: line.Style})
ON CREATE
SET st.name = line.Style, st.styleID = line.StyleID, st.styleNum = line.StyleNo, st.price = line.Price, st.modifiedDTS = 1453742532269, st.createdDTS = 1453742532269
MERGE (c)<-[r5:DEPT_OF]-(dept)
MERGE (dept)<-[r6:IN_DEPT]-(pt)
MERGE (pt)<-[r7:OF_TYPE]-(st)
MERGE (st)<-[r8:OF_STYLE]-(p:Product {productNum: line.UPC})
ON CREATE
SET p.floorMin = line.MinFloor, p.floorMax = line.FloorMax, p.color = line.Color, p.createdDTS = 1453742532269,p.modifiedDTS = 1453742532269, p.size = line.Size, p.productID = line.ProductID;
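Note that the updated query repeats several `MERGE` clauses (`DEPT_OF` appears four times, `IN_DEPT` three times, `OF_TYPE` twice), and some `ON CREATE SET` lines re-assign properties the `MERGE` pattern already set (e.g. `dept.name = dept.name`). `MERGE` is idempotent, so the repeats add no data, but they do add matching work on every row. A possible trimmed version, keeping one `MERGE` per relationship and dropping the redundant assignments (otherwise unchanged from the query above), would be:

```cypher
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM '...' AS line
MATCH (c:Customer {customerID: line.CustomerID})
MERGE (c)<-[:DEPT_OF]-(dept:Dept {name: line.Category})
  ON CREATE SET dept.deptID = line.categoryID,
                dept.createdDTS = 1453742532269, dept.modifiedDTS = 1453742532269
MERGE (dept)<-[:IN_DEPT]-(pt:ProductType {name: dept.name})
  ON CREATE SET pt.packQty = line.PackQty, pt.productTypeID = line.ProductTypeID,
                pt.createdDTS = 1453742532269, pt.modifiedDTS = 1453742532269
MERGE (pt)<-[:OF_TYPE]-(st:Style {name: line.Style})
  ON CREATE SET st.styleID = line.StyleID, st.styleNum = line.StyleNo,
                st.price = line.Price,
                st.createdDTS = 1453742532269, st.modifiedDTS = 1453742532269
MERGE (st)<-[:OF_STYLE]-(p:Product {productNum: line.UPC})
  ON CREATE SET p.floorMin = line.MinFloor, p.floorMax = line.FloorMax,
                p.color = line.Color, p.size = line.Size, p.productID = line.ProductID,
                p.createdDTS = 1453742532269, p.modifiedDTS = 1453742532269;
```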
Comments:

- "Do you have any indexes?"
- "No. How do I add an index? I'm not sure which nodes I should add indexes on. Can you recommend some based on the Cypher?"
- "Hi @skone, I'm the CEO of GrapheneDB. Besides the obvious (indexes, rows per commit), performance also depends on which GrapheneDB plan you're using. Which plan are you currently on? By the way, if you run into any issues, don't hesitate to contact our support team."
- "Why do your depts have different IDs but the same name? Do you want them to be the same node?"
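To see where the time actually goes, and whether index seeks (rather than label scans) are being used, it can help to profile a one-row sample of the import. This is a diagnostic sketch, reusing the URL and the first lookup from the question; it only exercises the initial `MATCH` and one `MERGE`:

```cypher
// Profile a single-row sample: look for NodeIndexSeek operators
// in the plan instead of NodeByLabelScan.
PROFILE
LOAD CSV WITH HEADERS FROM 'http://example.com/some.csv' AS line
WITH line LIMIT 1
MATCH (c:Customer {customerID: line.CustomerID})
MERGE (c)<-[:DEPT_OF]-(dept:Dept {name: line.Category})
RETURN c, dept;
```

(`USING PERIODIC COMMIT` is omitted here since it isn't needed for a one-row sample.)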