
Maximum numpy.memmap array size on a Python x32 machine?

Tags: python, arrays, memory, out-of-memory, bigdata

I am using Python x32 on 32-bit Windows XP.

Sometimes the program fails at the line

fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))
with an error in memmap.py:

Traceback (most recent call last):
  fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))
  File "C:\Python27\lib\site-packages\numpy\core\memmap.py", line 253, in __new__
    mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
OverflowError: cannot fit 'long' into an index-sized integer
So I assume there is a limit on the array size. What is the maximum array size maxN = rows*cols?

The same question applies to: 1. Python x32 on Win x64, and 2. Python x64 on Win x64.

Update:

# create the array
import numpy as np

rows = 250000
cols = 1000
fA = np.memmap('A.npy', dtype='float32', mode='w+', shape=(rows, cols))
# fA1 = np.memmap('A1.npy', dtype='float32', mode='w+', shape=(rows, cols))  # can't create another big memmap
print fA.nbytes/1024/1024  # 953 MB

So it seems there is another limitation, not just the file size. There is some discussion of this topic here: and

For the tests below I used the following code:

import numpy

baseNumber = 3000000L

for powers in numpy.arange(1, 7):
  l1 = baseNumber*10**powers
  print('working with %d elements'%(l1))
  print('number bytes required %f GB'%(l1*8/1e9))
  try:
    fp = numpy.memmap('test.map', dtype='float64', mode='w+', shape=(1, l1))
    # works
    print('works')
    del fp
  except Exception as e:
    print(repr(e))
python x32 on windows x32

For 32-bit Windows the file-size limit is about 2-3 GB, so the OS cannot create any file larger than that. I don't have access to a 32-bit machine, but the command would fail once the file-size limit is reached.
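The "index-sized integer" in the OverflowError is Py_ssize_t, which has 31 usable bits on a 32-bit Python. A rough upper bound on a memmap's size can be computed from the interpreter's pointer width. This is only a back-of-envelope sketch: the real ceiling is usually lower, since the mapping must also fit in one contiguous block of free address space.

```python
import struct

# Pointer width of the running interpreter: 32 or 64 bits.
pointer_bits = struct.calcsize('P') * 8

# mmap lengths are passed as Py_ssize_t, so the hard cap is 2**(bits-1) - 1 bytes.
max_mmap_bytes = 2 ** (pointer_bits - 1) - 1

# Upper bound on the element count for a float32 (4-byte) memmap.
max_float32_elements = max_mmap_bytes // 4

print(pointer_bits, max_mmap_bytes, max_float32_elements)
```

On a 32-bit Python this gives roughly 2 GB, which is consistent with the question's ~953 MB array succeeding while a second one of the same size fails once the address space is fragmented.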

python x32 on windows x64

In this case, because Python itself is 32-bit, we cannot reach the file sizes that Win64 allows:
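The limiting factor here is the interpreter, not the OS. A quick way to confirm which build of Python is running (a small sketch, independent of the test script above):

```python
import sys
import struct

# sys.maxsize is the largest Py_ssize_t: 2**31 - 1 on a 32-bit build,
# 2**63 - 1 on a 64-bit build, regardless of the Windows version underneath.
is_64bit = sys.maxsize > 2**32

# Cross-check against the pointer size.
assert is_64bit == (struct.calcsize('P') == 8)

print('64-bit Python' if is_64bit else '32-bit Python')
```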

%run -i scratch.py

python x32 win x64
working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 3000000000 elements
number bytes required 24.000000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')
python x64 on windows x64

In this case we are initially limited by the disk size, but once the array/byte size gets large enough some overflow appears:

%run -i scratch.py
working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
works
working with 3000000000 elements
number bytes required 24.000000 GB
works
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')
In summary: the exact point at which array creation fails on Windows x64 depends on the initial disk size.

python x32, windows x64: initially we get the error you saw, then the disk-size limit, but at some point an 'Invalid argument' error appears.

python x64, windows x64: initially we hit the disk-size limit, but at some point other errors appear.
Interestingly, these errors do not seem related to the 2**64 size limit, since 3000000000*8 < 2**64, and the errors show up the same way on win32.


If the disk were large enough we would not see the 'Invalid argument' error at all, and we could presumably go up to the 2**64 limit, though I don't have a disk big enough to test that :)
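A quick sanity check of the arithmetic above (illustrative only):

```python
# Largest request in the test runs: 3 * 10**12 float64 elements.
requested_bytes = 3000000000000 * 8

# 2**64 bytes is ~18.4 exabytes; the failing request is far below it,
# so the 'Invalid argument' error cannot be a 64-bit size overflow.
print(requested_bytes)          # 24000000000000 (24 TB)
print(requested_bytes < 2**64)  # True
```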

"The size of a file view is limited to the largest available contiguous block of unreserved virtual memory. That is at most 2 GB minus the virtual memory already reserved by the process." So how can we create a large memmap file and then view it in small chunks?

Yes, you are right. In that case you have to create your own mapping and work through the file one ~2 GB chunk at a time: supply a start position and offset, do some processing, then move on to the next chunk. The second linked answer covers this, but it definitely requires some manual bookkeeping.

The problem is that I don't know how to create the large file without using the memmap constructor.

On some versions of Win32 (non-NT) you can't, because the OS itself limits files to 2 GB. If you need a 10 GB file, you would have to create five smaller files and manage all the storage yourself, since even Python cannot express such a mapping request. If the file is reachable through a network share drive, you may be able to work around this.

Windows does not support the x32 ABI; I believe that is Linux-specific. You probably mean x86/i386/i686/win32, not x32.
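The chunking approach discussed in the comments can be sketched roughly as follows. This is an illustrative sketch, not the poster's code: the file name, the small sizes, and the `+= 1.0` processing step are made up, and on a 32-bit Python the window size must stay well under 2 GB. The file is pre-sized with a seek-and-write, so the memmap constructor never has to map it in one piece.

```python
import numpy as np

# Small illustrative sizes; scale up as needed.
rows, cols = 8000, 100          # full array: 8000 x 100 float32 (~3.2 MB)
chunk_rows = 2000               # rows mapped per window
dtype = np.dtype('float32')

# Create the full-size file up front without mapping it:
# seek to the last byte and write a single zero byte.
with open('big.dat', 'wb') as f:
    f.seek(rows * cols * dtype.itemsize - 1)
    f.write(b'\x00')

# Map and process one window at a time via the `offset` argument.
for start in range(0, rows, chunk_rows):
    n = min(chunk_rows, rows - start)
    window = np.memmap('big.dat', dtype=dtype, mode='r+',
                       offset=start * cols * dtype.itemsize,
                       shape=(n, cols))
    window[:] += 1.0            # placeholder for real per-chunk processing
    window.flush()
    del window                  # unmap before opening the next window
```

numpy's memmap accepts arbitrary byte offsets and handles the OS allocation-granularity alignment internally, which is what makes this windowed scheme straightforward compared to using the raw `mmap` module.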