
Error when using Python's multiprocessing module with a Boggle solver


I have the following program, which is a Boggle solver:

import logging
import multiprocessing
from random import choice
from string import ascii_uppercase


def get_grid(size=None, letters=None):
    if size:
        grid = {(x, y): choice(ascii_uppercase) for x in range(size[0]) for y in
                range(size[1])}
    elif letters:
        grid = {}
        rows = letters.split()
        for y, row in enumerate(rows):
            for x, letter in enumerate(row):
                grid[x, y] = letter

    return grid


def print_grid(grid):
    s = ''
    for y in range(size[1]):
        for x in range(size[0]):
            s += grid[x, y] + ' '
        s += '\n'

    print(s)


def get_neighbours():
    neighbours = {}
    for position in grid:
        x, y = position
        positions = [(x - 1, y - 1), (x, y - 1), (x + 1, y - 1), (x + 1, y),
                     (x + 1, y + 1), (x, y + 1), (x - 1, y + 1), (x - 1, y)]
        neighbours[position] = [p for p in positions if
                                0 <= p[0] < size[0] and 0 <= p[1] < size[1]]
    return neighbours


def get_wordlist():
    stems = set()
    wordlist = set()

    with open('words.txt') as f:
        for word in f:
            word = word.strip().upper()
            wordlist.add(word)

            for i in range(len(word)):
                stems.add(word[:i + 1])
    return wordlist, stems


def path_to_word(path):
    return ''.join([grid[p] for p in path])


def search(path):
    word = path_to_word(path)

    if word not in stems:
        return

    if word in wordlist:
        paths.append(path)

    for next_pos in neighbours[path[-1]]:
        if next_pos not in path:
            search(path + [next_pos])


def get_words():
    for position in grid:
        logging.info('searching %s' % str(position))
        search([position])
    return {path_to_word(p) for p in paths}

if __name__ == '__main__':

    logging.basicConfig(level=logging.WARNING)
    size = 4, 4
    grid = get_grid(size=size)
    print_grid(grid)
    neighbours = get_neighbours()
    wordlist, stems = get_wordlist()
    paths = []

    #words = get_words()

    pool = multiprocessing.Pool(processes=4)
    results = pool.map(search, grid)
    words = [path_to_word(p) for p in paths]

    print(sorted(words, key=len, reverse=True))
It works, however, when I comment out the pool lines and uncomment:

words = get_words()

I'm guessing that multiprocessing somehow changes the variable scope?

Since you haven't provided words.txt we can't test the code, but your path_to_word references grid, which isn't defined locally.

words.txt is just an English dictionary. grid is defined globally, and the program works when multiprocessing isn't used.

Oh, I see. In general you want to avoid global variables when using multiprocessing, because each worker process gets its own separate copy of every global when the process is created. Any changes a worker makes to them (such as appending to paths) are not visible in the main process.