Import 100000+ records from JSON - slow hit to AT_ENTITY

Tags: json, moqui, bitronix

There are roughly 100,000 records in the JSON file. I am trying to write all of them into the mantle.product.Product entity.

The process starts and, at around 35,000 records, begins to degrade with the warning "Slow hit to AT_ENTITY:create:mantle.product.Product". It then reliably stops with a "java.lang.OutOfMemoryError: GC overhead limit exceeded" error. This is the behavior on my machine.

Any tips are welcome.

Here is the code:

void processJson2(String filePath) {
    //def json = new JsonSlurper().parseText(new BufferedReader(new InputStreamReader(this.getFileIO().openStream(), "UTF-8")))

    //will initialize class manually
    def docReadReference = this.executionContext.resource.getLocationReference(filePath)

    if (docReadReference.isFile()) {
        //inputstream
        InputStream inputFile = docReadReference.openStream()
        TransactionFacade trxFacade = this.executionContext.getTransaction()

        this.executionContext.artifactExecution.disableTarpit()
        this.executionContext.artifactExecution.disableEntityEca()
        this.executionContext.artifactExecution.disableAuthz()

        trxFacade.runRequireNew(50000, "Error loading entity JSON data", {

            try {
                logMachine.info("Opening file ${docReadReference.isFile()}")

                JsonSlurper slurper = new JsonSlurper().setType(JsonParserType.CHARACTER_SOURCE)
                def json = slurper.parse(new BufferedReader(new InputStreamReader(inputFile, "UTF-8")))

                //writer
                Long counter = 1

                json.each {
                    this.executionContext.service.sync().name("create", "mantle.product.Product").parameters([productId: it.sourceFileReference]).call()

                    //display thousands
                    if (counter % 1000 == 0) {
                        logMachine.info("JSON rows processed ${counter} > ${it.sourceFileReference}")
                    }

                    //move counter
                    counter += 1
                }

                //log
                logMachine.info("File processed.")

            } catch (Throwable t) {
                trxFacade.rollback("Error while processing JSON", t);

                //log as warning
                logMachine.warn("Incorrectly handled JSON parsing ${t.message}.")
            } finally {
                if (trxFacade.isTransactionInPlace()) trxFacade.commit();

                inputFile.close()

                this.executionContext.artifactExecution.enableTarpit()
                this.executionContext.artifactExecution.enableEntityEca()
                this.executionContext.artifactExecution.enableAuthz()
            }
        })
    }
}
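
One generic way to attack the OutOfMemoryError itself (not the fix I ended up using, which is described below) is to commit in chunks, so that no single transaction holds all 100,000 creates. This is only a minimal sketch, assuming the same script bindings and imports as the snippet above and using the hypothetical helper name processJsonInChunks:

    //hypothetical chunked-commit helper: one transaction per batch
    void processJsonInChunks(String filePath, int batchSize) {
        def docReadReference = this.executionContext.resource.getLocationReference(filePath)
        if (!docReadReference.isFile()) return

        InputStream inputFile = docReadReference.openStream()
        TransactionFacade trxFacade = this.executionContext.getTransaction()
        try {
            //assumes the top-level JSON element is an array of records
            def json = new JsonSlurper().parse(inputFile)

            //Groovy's List.collate splits the list into batchSize-sized chunks;
            //each chunk is created and committed in its own transaction
            json.collate(batchSize).each { batch ->
                trxFacade.runRequireNew(600, "Error loading entity JSON batch", {
                    batch.each {
                        this.executionContext.service.sync().name("create", "mantle.product.Product").parameters([productId: it.sourceFileReference]).call()
                    }
                })
            }
        } finally {
            inputFile.close()
        }
    }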

This seems to work well now, so it may help anyone who runs into a similar problem:

  • Since I am using MoquiDevConf, the first thing to do about the slow-hit warnings was to remove the configuration entry for type AT_ENTITY (a hedged sketch of that entry follows the code below).
  • Next, the BufferedReader is not the most efficient way to read the data; I used the InputStream directly to initialize the json ArrayList.
  • This is the result:

    InputStream inputFile = docReadReference.openStream()
    TransactionFacade trxFacade = this.executionContext.getTransaction()

    JsonSlurper slurper = new JsonSlurper().setType(JsonParserType.INDEX_OVERLAY)
    //ArrayList<Object> json = slurper.parse(new BufferedReader(new InputStreamReader(inputFile, "UTF-8")))
    ArrayList<Object> json = slurper.parse(inputFile)
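
For the first bullet, the slow-hit warnings come from Moqui's artifact hit tracking. The following is only a sketch of the kind of conf entry involved, assuming an artifact-stats element under server-stats in MoquiDevConf.xml (the exact element and attribute names in your conf file may differ); removing or commenting out the AT_ENTITY line is what disables the per-entity tracking:

    <!-- hypothetical MoquiDevConf.xml excerpt; attributes may differ in your conf -->
    <server-stats>
        <!-- per-entity hit tracking behind the "Slow hit to AT_ENTITY" warnings -->
        <artifact-stats type="AT_ENTITY" persist-hit="true" persist-bin="true"/>
    </server-stats>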
    
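On the parser choice in the second bullet: Groovy's JsonParserType.INDEX_OVERLAY is documented as the fastest JsonSlurper parser type, working over a single in-memory buffer, while CHARACTER_SOURCE is the chunked parser intended for very large (2MB+) documents. In my case, switching the parser type and handing the InputStream straight to parse() is what let the full file load.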