
Python: Google App Engine full-text search offset greater than 1000 causes an error


I am using the GAE Search API to run text searches. The problem is that I can get at most 2000 results (offset=1000, limit=2000), but my data set is much larger. What should I do to get more than 2000 results?

Use the Search API and build a full-text search index for your query. For example, my code looks like this and searches the index:

from datetime import datetime
import logging

from google.appengine.api import search

_INDEX_NAME = 'ads'   # placeholder value; the original snippet assumes this module-level constant
ACCURACY = 10000      # placeholder value; sort limit assumed by the original snippet


def find_documents(query_string, limit, cursor):
    try:
        # Sort by date, then hour, then minute, all in descending order.
        date_desc = search.SortExpression(
            expression='date',
            direction=search.SortExpression.DESCENDING,
            default_value=datetime(1999, 1, 1))

        hr_desc = search.SortExpression(
            expression='hour',
            direction=search.SortExpression.DESCENDING,
            default_value=1)

        min_desc = search.SortExpression(
            expression='minute',
            direction=search.SortExpression.DESCENDING,
            default_value=1)

        # Sort up to ACCURACY matching results using the expressions above.
        sort = search.SortOptions(
            expressions=[date_desc, hr_desc, min_desc],
            limit=ACCURACY)

        # Set query options: page size, cursor for pagination, and sorting.
        options = search.QueryOptions(
            limit=limit,
            cursor=cursor,
            sort_options=sort,
            number_found_accuracy=10000,
            # returned_fields=['title', 'city', 'region', 'category', 'adID',
            #                  'date', 'price', 'type', 'company_ad', 'cityID',
            #                  'regionID', 'hour', 'minute'],
            # snippeted_fields=['text'],
        )
        query = search.Query(query_string=query_string, options=options)
        index = search.Index(name=_INDEX_NAME)
        logging.debug('query_string in find_documents: %s', query.query_string)
        logging.debug('query_options in find_documents: %s', query.options)
        # Execute the query and return the SearchResults.
        return index.search(query)

    except search.PutError as e:
        logging.exception('caught PutError %s', e)

    except search.InternalError as e:
        logging.exception('caught InternalError %s', e)

    except search.DeleteError as e:
        logging.exception('caught DeleteError %s', e)

    except search.TransientError as e:
        logging.exception('caught TransientError %s', e)

    except search.InvalidRequest as e:
        logging.exception('caught InvalidRequest %s', e)

    except search.Error as e:
        logging.exception('caught unknown error %s', e)

    return None
For pagination, use cursors, which can page through large data sets; see the sketch below. The Search API documentation covers cursors in more detail.
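
The function above returns one page of results together with a cursor. Here is a minimal sketch, not part of the original answer, of how a caller could follow that cursor to walk the whole result set; fetch_all_documents() is a hypothetical wrapper, and it assumes the find_documents() function above with a page size of 50.

from google.appengine.api import search

def fetch_all_documents(query_string):
    cursor = search.Cursor()      # an empty cursor starts at the first page
    documents = []
    while cursor is not None:
        results = find_documents(query_string, 50, cursor)
        if results is None:       # find_documents() returns None on error
            break
        documents.extend(results.results)
        # results.cursor becomes None once there are no more pages to fetch
        cursor = results.cursor
    return documents

Because each request passes the cursor from the previous page instead of an offset, this loop is not limited by the 1000-result offset cap that causes the error in the question.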