Python elegant coding: try/except chains


I have some code using concurrent.futures that normally works with chunksize=1. Occasionally, however, I hit a large dataset that needs a bigger chunk size. So far I have worked around this with the following code:

for i in datasets:
    try:
        with concurrent.futures.ProcessPoolExecutor() as executor:
            results = tuple(executor.map(do_something, parameters, chunksize=1))
    except concurrent.futures.process.BrokenProcessPool:
        try:
            with concurrent.futures.ProcessPoolExecutor() as executor:
                results = tuple(executor.map(do_something, parameters, chunksize=2))
        except concurrent.futures.process.BrokenProcessPool:
            try:
                with concurrent.futures.ProcessPoolExecutor() as executor:
                    results = tuple(executor.map(do_something, parameters, chunksize=4))
            # etc. etc. etc....
            except concurrent.futures.process.BrokenProcessPool:
                print('code failed')

This works, but it is obviously inelegant and ugly. Is there a simpler way to do it?

You can use a for loop to iterate over the chunk sizes, break out of the loop if one succeeds, or fall through to the else block to print an error message:

for i in datasets:
    for chunksize in 1, 2, 4:
        try:
            with concurrent.futures.ProcessPoolExecutor() as executor:
                results = tuple(executor.map(do_something, parameters, chunksize=chunksize))
            break
        except concurrent.futures.process.BrokenProcessPool:
            pass
    else:
        print('code failed')
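The for-else mechanics can be seen in isolation with a stand-in task; here `attempt` and `RuntimeError` are hypothetical stand-ins for the real work and for BrokenProcessPool:

```python
def attempt(chunksize):
    # Hypothetical task: pretend the pool breaks for small chunk sizes.
    if chunksize < 4:
        raise RuntimeError("broken pool")
    return f"succeeded with chunksize={chunksize}"

result = None
for chunksize in 1, 2, 4:
    try:
        result = attempt(chunksize)
        break          # success: the else clause is skipped
    except RuntimeError:
        pass           # failure: try the next chunk size
else:
    result = "code failed"  # runs only if the loop never hit break

print(result)  # → succeeded with chunksize=4
```

The else clause belongs to the for loop, not the try: it executes exactly when the loop runs to exhaustion without a break, which is what makes it a natural place for the "everything failed" branch.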

You should use another loop for this. The code executed on each attempt is the same; only the chunk size differs. Here is an example using a for loop:

for i in datasets:
    for chunksize in [1, 2, 4]:
        try:
            with concurrent.futures.ProcessPoolExecutor() as executor:
                results = tuple(executor.map(do_something, parameters, chunksize=chunksize))
            break
        except concurrent.futures.process.BrokenProcessPool:
            if chunksize == 4:
                raise SomeErrorForMaxRetries
            else:
                continue

Is it necessary to create a new executor every time? I imagine you could put executor = concurrent.futures.ProcessPoolExecutor() before the loop and just use with executor: inside it.

You could, but the OP creates a new one each time, so I followed that logic. You could also put raise SomeErrorForMaxRetries in the else clause of the loop itself, rather than checking the current chunk size every time the exception is caught; the else clause runs only if break never executed. (Just as blhsing did, as I would have noticed had I scrolled down sooner.)

Yes, that would work too, but I see someone else already mentioned it in their answer, and in my view there is no need to repeat it, especially for a purely stylistic change.