
Python: Reducing memory complexity in a bootstrap algorithm


I wrote this function for bootstrap resampling:

from __future__ import division
import numpy as np


def bootstrap(array):
    """
    :type array: np.ndarray
    :param array: a 1D NumPy array, where array[i] is the count of the
                  i-th item.
    :rtype: np.ndarray
    """
    size = int(array.sum())
    probabilities = array / size
    values = np.arange(array.shape[0])
    bins = np.cumsum(probabilities)
    result = np.zeros(array.shape[0])

    for val in values[np.digitize(np.random.random_sample(size), bins)]:
        result[val] += 1

    return result
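For reference, the sampling line works by inverse-CDF lookup; a minimal self-contained sketch of the same steps, using made-up example counts:

```python
import numpy as np

# Made-up example counts: array[i] is the count of the i-th item.
array = np.array([3.0, 1.0, 6.0])
size = int(array.sum())              # 10 resampling draws in total
bins = np.cumsum(array / size)       # empirical CDF over the items
u = np.random.random_sample(size)    # O(size) memory -- the pain point
idx = np.digitize(u, bins)           # inverse-CDF lookup: uniform draw -> item index
result = np.zeros(array.shape[0])
for val in idx:                      # tally draws back into per-item counts
    result[val] += 1
```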
I don't like the fact that the line

values[np.digitize(np.random.random_sample(size), bins)]

creates an array of size array.sum(). I would like to keep the memory complexity at O(n), where n is the length of the array. How can I make this lazier without sacrificing speed?

P.S. Before you point out the slow for loop: the real version I use already eliminates the slow Python loop.
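For comparison, one direction sometimes suggested for this kind of count resampling (a sketch, not the asker's solution): np.random.multinomial draws all per-item counts in a single call, allocating only a length-n output, so the size-length intermediate array never materializes.

```python
import numpy as np

def bootstrap_multinomial(array):
    """Sketch of an O(n)-memory alternative (assumed, not from the post):
    a single multinomial draw returns the resampled counts directly,
    instead of materializing array.sum() raw samples and binning them."""
    size = int(array.sum())
    probabilities = array / size     # sums to 1 by construction
    return np.random.multinomial(size, probabilities).astype(float)

# Hypothetical example input: counts for three items.
counts = np.array([3.0, 1.0, 6.0])
resampled = bootstrap_multinomial(counts)
```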