
Spring Boot JPA batch insert


I have three entities: Parent, Child, and SubChild. Parent is the parent of Child, and Child is the parent of SubChild. I need to insert about 700 Parent objects; a Parent can have 50 Child objects, and a Child can have 50 SubChild objects.

I first tried a plain repository.save(ListoObjects), which took about 4 minutes.

Then I tried the entity manager's persist, flush and clear, flushing on a batch size of 500. That also took about 4 minutes. There is not much difference in performance. Please suggest an efficient way to insert this large amount of data.
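For reference, the first approach amounts to something like the following minimal sketch; the repository and service types are hypothetical names for illustration and do not appear in the original question.

import java.util.List;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;

// Hypothetical repository and service illustrating the "plain repository.save" approach described above.
interface ParentRepository extends JpaRepository<Parent, Long> {
}

@Service
class ParentSaveService {

    private final ParentRepository repository;

    ParentSaveService(ParentRepository repository) {
        this.repository = repository;
    }

    void saveAll(List<Parent> parents) {
        // One call persists all ~700 parents; with cascade = CascadeType.ALL on Parent.childs the
        // Child rows (and, if Child cascades to its own collection, the SubChild rows) are saved too.
        // In older Spring Data versions this method is save(Iterable) rather than saveAll(Iterable).
        repository.saveAll(parents);
    }
}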

The Parent entity:

@Entity
public class Parent {

    @Id @GeneratedValue(strategy = GenerationType.AUTO)
    private Long parentId;
    private String aaa;
    private String bbb;
    private String ccc;

    @Version
    private Long version;

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true, mappedBy = "parent", fetch = FetchType.LAZY)
    @JoinColumnsOrFormulas({
        @JoinColumnOrFormula(column = @JoinColumn(name = "parentId", referencedColumnName = "parentId", nullable = false))})
    private List<Child> childs = new ArrayList<>();

    public Long getParentId() {
        return parentId;
    }
    public void setParentId(Long parentId) {
        this.parentId = parentId;
    }
    public String getAaa() {
        return aaa;
    }
    public void setAaa(String aaa) {
        this.aaa = aaa;
    }
    public String getBbb() {
        return bbb;
    }
    public void setBbb(String bbb) {
        this.bbb = bbb;
    }
    public String getCcc() {
        return ccc;
    }
    public void setCcc(String ccc) {
        this.ccc = ccc;
    }
    public Long getVersion() {
        return version;
    }
    public void setVersion(Long version) {
        this.version = version;
    }
    public List<Child> getChilds() {
        return childs;
    }
    public void setChilds(List<Child> childs) {
        this.childs = childs;
    }
}
Repository method used to persist the list of parent entities:

@Value("${spring.jpa.hibernate.jdbc.batch_size}")
private int batchSize;

public <T extends Parent> Collection<T> bulkSave(Collection<T> entities) {
    final List<T> savedEntities = new ArrayList<T>(entities.size());
    int i = 0;
    for (T t : entities) {
        savedEntities.add(persistOrMerge(t));
        i++;
        if (i % batchSize == 0) {
            // Flush a batch of inserts and release memory.
            entityManager.flush();
            entityManager.clear();
        }
    }
    return savedEntities;
}
private <T extends Parent> T persistOrMerge(T t) {
    if (t.getParentId() == null) {
        entityManager.persist(t);
        return t;
    } else {
        return entityManager.merge(t);
    }
}

To enable batch inserts, you need the batch_size property in your configuration.

Also, since a JDBC batch can only target one table, you need the spring.jpa.hibernate.order_inserts=true property to order the inserts across parent and children; otherwise the statements remain unordered and you will only see partial batches (a new batch is started as soon as an insert against a different table is issued).
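A minimal sketch of that configuration, in the application.yml style already used in the question; here the Hibernate settings are passed through via the generic spring.jpa.properties.* prefix, which is a common way to hand arbitrary properties to Hibernate (the exact key layout is an illustration, not copied from the original post):

spring:
  jpa:
    properties:
      hibernate:
        jdbc:
          # number of statements Hibernate groups into a single JDBC batch
          batch_size: 100
        # reorder queued inserts so statements for the same table are grouped together,
        # instead of starting a new batch each time the target table changes
        order_inserts: true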

When you flush, do you get 50 inserts or one?

Correct me if I am wrong: since the transaction is handled by the @Transactional annotation, I think the commit happens at the very end, so I don't think any records are inserted when we flush.

Sorry, I assumed you were committing after each batch. So at commit time the log shows an insert per object? So 700?

@MaciejKowalski It inserts 700 objects at commit.
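To make the flush-versus-commit point above concrete, here is a minimal sketch of how bulkSave might be driven from a transactional service; ParentImportService and ParentBulkRepository are hypothetical names for illustration and are not part of the question.

import java.util.Collection;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ParentImportService {

    // Hypothetical bean that holds the bulkSave/persistOrMerge methods shown above.
    private final ParentBulkRepository parentBulkRepository;

    public ParentImportService(ParentBulkRepository parentBulkRepository) {
        this.parentBulkRepository = parentBulkRepository;
    }

    @Transactional
    public void importParents(Collection<Parent> parents) {
        // Each entityManager.flush() inside bulkSave pushes the pending (batched) INSERT statements
        // to the JDBC driver, but nothing is committed yet: the transaction commits only when this
        // @Transactional method returns, which is why all ~700 parents become visible at commit.
        parentBulkRepository.bulkSave(parents);
    }
}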
The SubChild entity:

@Entity
public class SubChild {

    @Id @GeneratedValue(strategy = GenerationType.AUTO)
    private Long subChildId;
    private String fff;
    private String ggg;
    private Integer hhh;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumnsOrFormulas({
        @JoinColumnOrFormula(column = @JoinColumn(name = "childId", referencedColumnName = "childId", nullable = false))
    })
    private Child child;

    public Long getSubChildId() {
        return subChildId;
    }
    public void setSubChildId(Long subChildId) {
        this.subChildId = subChildId;
    }
    public String getFff() {
        return fff;
    }
    public void setFff(String fff) {
        this.fff = fff;
    }
    public String getGgg() {
        return ggg;
    }
    public void setGgg(String ggg) {
        this.ggg = ggg;
    }
    public Integer getHhh() {
        return hhh;
    }
    public void setHhh(Integer hhh) {
        this.hhh = hhh;
    }
    public Child getChild() {
        return child;
    }
    public void setChild(Child child) {
        this.child = child;
    }
}

The application configuration:
spring:
  application:
    name: sample-service
  jpa:
    database: MYSQL
    show-sql: true
    hibernate:
      ddl-auto: update
      dialect: org.hibernate.dialect.MySQL5Dialect
      naming_strategy: org.hibernate.cfg.ImprovedNamingStrategy
      jdbc:
        batch_size: 100
  jackson:
    date-format: dd/MM/yyyy
  thymeleaf:
    cache: false
spring.datasource.url : jdbc:mysql://${dbhost}/sample?createDatabaseIfNotExist=true
spring.datasource.username : root
spring.datasource.password : root
spring.datasource.driver-class-name : com.mysql.cj.jdbc.Driver