Python Numba functions and pickle
I'm running into a problem when executing a recursive Numba function in parallel with joblib. When I apply Numba's jit to a recursive function and then try to use joblib on it, I get an error (reproduced at the end). Can you think of a workaround? The only thing I can come up with is rewriting the function non-recursively. I would normally report this on GitHub, but I don't know whose problem it is, cloudpickle's or Numba's. What do you think? Thanks. This code reproduces the problem:
from joblib import Parallel, delayed
from numba import njit, int64

@njit(int64(int64))
def df(n):
    # minimal recursive body; any self-call triggers the pickling error
    if n <= 1:
        return 1
    return n * df(n - 1)

Parallel(n_jobs=2)(delayed(df)(i) for i in range(2))

This is a bug that will be fixed in Numba (in version 0.41, I think). It looks more like a Numba problem, though there is also a bug in the built-in pickle. The reproduced error:
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/pickle.py", line 504, in save
f(self, obj) # Call unbound method with explicit self
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/pickle.py", line 856, in save_dict
self._batch_setitems(obj.items())
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/pickle.py", line 882, in _batch_setitems
save(v)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/pickle.py", line 524, in save
rv = reduce(self.proto)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/numba/dispatcher.py", line 585, in __reduce__
globs = self._compiler.get_globals_for_reduction()
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/numba/dispatcher.py", line 89, in get_globals_for_reduction
return serialize._get_function_globals_for_reduction(self.py_func)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/numba/serialize.py", line 55, in _get_function_globals_for_reduction
func_id = bytecode.FunctionIdentity.from_function(func)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/numba/bytecode.py", line 291, in from_function
func = get_function_object(pyfunc)
RecursionError: maximum recursion depth exceeded
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/externals/loky/backend/queues.py", line 151, in _feed
obj, reducers=reducers)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/externals/loky/backend/reduction.py", line 145, in dumps
p.dump(obj)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 290, in __getstate__
for func, args, kwargs in self.items]
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 290, in <listcomp>
for func, args, kwargs in self.items]
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 278, in _wrap_non_picklable_objects
wrapped_obj = CloudpickledObjectWrapper(obj)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 208, in __init__
self.pickled_obj = dumps(obj)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/externals/cloudpickle/cloudpickle.py", line 918, in dumps
cp.dump(obj)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/externals/cloudpickle/cloudpickle.py", line 272, in dump
raise pickle.PicklingError(msg)
_pickle.PicklingError: Could not pickle object as excessively deep recursion required.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 996, in __call__
self.retrieve()
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/parallel.py", line 899, in retrieve
self._output.extend(job.get(timeout=self.timeout))
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/site-packages/joblib/_parallel_backends.py", line 517, in wrap_future_result
return future.result(timeout=timeout)
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/concurrent/futures/_base.py", line 432, in result
return self.__get_result()
File "/home/.../anaconda3/envs/test_bug/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
raise self._exception
_pickle.PicklingError: Could not pickle the task to send it to the workers