Google App Engine: Error saving strings longer than 1500 bytes to Datastore from the Dataflow API

When I try to save a very long string, the Dataflow job throws this error message: The value of property "myProperty" is longer than 1500 bytes., code=INVALID_ARGUMENT

The error occurs when following Google's example and saving a string longer than 1500 bytes.

I know that when using the Datastore API directly, I can save strings longer than 1500 bytes by storing the property as a Text value. However, nothing in the example documentation or the class documentation indicates that this type is supported.

Is there a way to save such long strings with this API so that they can be read back as Text?

The full error message looks like this:

java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: java.lang.RuntimeException: com.google.cloud.dataflow.sdk.util.UserCodeException: com.google.datastore.v1.client.DatastoreException: The value of property "dalekTestExecutions" is longer than 1500 bytes., code=INVALID_ARGUMENT
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:162)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.outputWindowedValue(DoFnRunnerBase.java:288)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnContext.outputWindowedValue(DoFnRunnerBase.java:284)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase$DoFnProcessContext$1.outputWindowedValue(DoFnRunnerBase.java:508)
    at com.google.cloud.dataflow.sdk.util.GroupAlsoByWindowsAndCombineDoFn.closeWindow(GroupAlsoByWindowsAndCombineDoFn.java:205)
    at com.google.cloud.dataflow.sdk.util.GroupAlsoByWindowsAndCombineDoFn.processElement(GroupAlsoByWindowsAndCombineDoFn.java:192)
    at com.google.cloud.dataflow.sdk.util.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:49)
    at com.google.cloud.dataflow.sdk.util.DoFnRunnerBase.processElement(DoFnRunnerBase.java:139)
    at com.google.cloud.dataflow.sdk.runners.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:190)
    at com.google.cloud.dataflow.sdk.runners.worker.ForwardingParDoFn.processElement(ForwardingParDoFn.java:42)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerLoggingParDoFn.processElement(DataflowWorkerLoggingParDoFn.java:47)
    at com.google.cloud.dataflow.sdk.util.common.worker.ParDoOperation.process(ParDoOperation.java:55)
    at com.google.cloud.dataflow.sdk.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
    at com.google.cloud.dataflow.sdk.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:224)
    at com.google.cloud.dataflow.sdk.util.common.worker.ReadOperation.start(ReadOperation.java:185)
    at com.google.cloud.dataflow.sdk.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:72)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.executeWork(DataflowWorker.java:287)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:223)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:173)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.doWork(DataflowWorkerHarness.java:193)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:173)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:160)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

You can save strings longer than 1500 bytes by excluding the value from indexes:

Value longString = Value.newBuilder()
    .setStringValue(...)
    .setExcludeFromIndexes(true)
    .build();
If you need compatibility with App Engine's com.google.appengine.api.datastore.Text type, you will also need to set the meaning to 15:

Value longString = Value.newBuilder()
    .setStringValue(...)
    .setExcludeFromIndexes(true)
    .setMeaning(15)
    .build();
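
For reference, here is a minimal sketch of how such a value might be attached to a full com.google.datastore.v1.Entity before it is written from a Dataflow pipeline. The kind, key name, and helper method below are illustrative assumptions, not part of the original answer; the property name "myProperty" is taken from the question:

import com.google.datastore.v1.Entity;
import com.google.datastore.v1.Key;
import com.google.datastore.v1.Value;

// Minimal sketch; "MyKind" and "some-id" are placeholder names.
static Entity buildEntityWithLongString(String veryLongString) {
    Key key = Key.newBuilder()
        .addPath(Key.PathElement.newBuilder().setKind("MyKind").setName("some-id"))
        .build();

    Value longString = Value.newBuilder()
        .setStringValue(veryLongString)   // may exceed 1500 bytes
        .setExcludeFromIndexes(true)      // unindexed values are not subject to the limit
        .setMeaning(15)                   // readable as App Engine Text
        .build();

    return Entity.newBuilder()
        .setKey(key)
        .putProperties("myProperty", longString)
        .build();
}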
To be precise:

StringValue.newBuilder(yourString).setExcludeFromIndexes(true).build()

Datastore creates an index for every property, which is why indexed properties are limited to 1500 bytes by default. If you need to store something larger, such as a big JSON payload, you can mark the property as not needing an index:

Entity newEntity = Entity.newBuilder(key)
    .set("time", Timestamp.parseTimestamp("1970-01-01T00:00:00Z"))
    .set("message", StringValue.newBuilder(JSON).setExcludeFromIndexes(true).build())
    .build();

This way you can save data larger than the default 1500-byte limit.
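
As a quick sanity check, here is a hedged sketch of writing such an entity with the com.google.cloud.datastore client and reading the full string back; the kind name, key name, payload, and default-credentials setup are assumptions for illustration:

import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;
import com.google.cloud.datastore.StringValue;

// Placeholder payload standing in for a string longer than 1500 bytes.
String largeJson = "{\"data\": \"...\"}";

Datastore datastore = DatastoreOptions.getDefaultInstance().getService();
Key key = datastore.newKeyFactory().setKind("Task").newKey("task-1");

Entity entity = Entity.newBuilder(key)
    .set("message", StringValue.newBuilder(largeJson).setExcludeFromIndexes(true).build())
    .build();
datastore.put(entity);

// Reads are unaffected by indexing: the full string comes back intact.
String message = datastore.get(key).getString("message");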

Please add more description to your answer.