
Python: Travis log colors disappear when adding cythonization to setup.py


From there, we can see a number of builds; all the other builds have logs with passed/failed/skip colors when running tox, for example.

The py-travis environment uses this tox.ini configuration:

[tox]
envlist =
    py{27,35,36,37}
    pypy
    py{27,35,36}-nodeps
    py{27,35,36}-jenkins
    py-cythonized
    py-travis

[testenv]
; simplify numpy installation
setenv =
    LAPACK=
    ATLAS=None
    PYTHONWARNINGS=ignore

; Copy all environment variables to the tox test environment
passenv = *

deps =
    numpy
    nose >= 1.2.1
    coverage
    text-unidecode
    twython
    pyparsing
    python-crfsuite
    rednose

changedir = nltk/test
commands =
    ; scipy and scikit-learn require numpy even to run setup.py, so
    ; they can't be installed in one command
    pip install scipy scikit-learn

    ; python runtests.py --with-coverage --cover-inclusive --cover-package=nltk --cover-html --cover-html-dir={envdir}/docs []
    python runtests.py []

[testenv:py-travis]
extras = all
setenv =
    NLTK_DATA = {homedir}/nltk_data/
commands = {toxinidir}/tools/travis/coverage-pylint.sh
But when the cythonization in setup.py kicks in, the tox and Travis configuration seems to be just the same:

[testenv:py-cythonized]
deps =
    Cython >= 0.28.5
setenv =
    CYTHONIZE_NLTK = true
    NLTK_DATA = {homedir}/nltk_data/
extras = all
commands = {toxinidir}/tools/travis/coverage-pylint.sh
When the build runs, however, the colors are gone:

The setup.py installation is exactly the same for both the py-travis and py-cythonized builds:

# Work around mbcs bug in distutils.
# http://bugs.python.org/issue10945
import codecs
try:
    codecs.lookup('mbcs')
except LookupError:
    ascii = codecs.lookup('ascii')
    func = lambda name, enc=ascii: {True: enc}.get(name == 'mbcs')
    codecs.register(func)

import os

# Use the VERSION file to get NLTK version
version_file = os.path.join(os.path.dirname(__file__), 'nltk', 'VERSION')
with open(version_file) as fh:
    nltk_version = fh.read().strip()

# setuptools
from setuptools import setup, find_packages

# Specify groups of optional dependencies
extras_require = {
    'machine_learning': ['gensim', 'numpy', 'python-crfsuite', 'scikit-learn', 'scipy'],
    'plot': ['matplotlib'],
    'tgrep': ['pyparsing'],
    'twitter': ['twython'],
    'corenlp': ['requests'],
}

# Add a group made up of all optional dependencies
extras_require['all'] = set(
    package for group in extras_require.values() for package in group
)

MODULES_TO_COMPILE = [
    'nltk.grammar',
    'nltk.parse.chart',
    'nltk.tokenize.*',
    'nltk.probability',
    'nltk.util',
    'nltk.stem.*',
    'nltk.lm.*',
    'nltk.translate.*',
    'nltk.tbl.*',
    'nltk.sentiment.*',
    'nltk.cluster.*',
    'nltk.classify.*',
    'nltk.metrics.*',
    'nltk.chunk.*',
    'nltk.sem.*',

]


def compile_modules(modules):
    """
    Compile the named modules using Cython, using the clearer Python 3 semantics.
    """
    import Cython
    from Cython.Build import cythonize
    files = [name.replace('.', os.path.sep) + '.py' for name in modules]
    print("Compiling %d modules using Cython %s" % (len(modules), Cython.__version__))
    return cythonize(files, language_level=3)


if os.getenv('CYTHONIZE_NLTK') == 'true':
    ext_modules = compile_modules(MODULES_TO_COMPILE)
else:
    ext_modules = None

setup(
    name="nltk",
    description="Natural Language Toolkit",
    version=nltk_version,
    url="http://nltk.org/",
    long_description="""\
The Natural Language Toolkit (NLTK) is a Python package for
natural language processing.  NLTK requires Python 2.7, 3.5, 3.6, or 3.7.""",
    license="Apache License, Version 2.0",
    keywords=[
        'NLP',
        'CL',
        'natural language processing',
        'computational linguistics',
        'parsing',
        'tagging',
        'tokenizing',
        'syntax',
        'linguistics',
        'language',
        'natural language',
        'text analytics',
    ],
    maintainer="Steven Bird",
    maintainer_email="stevenbird1@gmail.com",
    author="Steven Bird",
    author_email="stevenbird1@gmail.com",
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Intended Audience :: Developers',
        'Intended Audience :: Education',
        'Intended Audience :: Information Technology',
        'Intended Audience :: Science/Research',
        'License :: OSI Approved :: Apache Software License',
        'Operating System :: OS Independent',
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
        'Topic :: Scientific/Engineering',
        'Topic :: Scientific/Engineering :: Artificial Intelligence',
        'Topic :: Scientific/Engineering :: Human Machine Interfaces',
        'Topic :: Scientific/Engineering :: Information Analysis',
        'Topic :: Text Processing',
        'Topic :: Text Processing :: Filters',
        'Topic :: Text Processing :: General',
        'Topic :: Text Processing :: Indexing',
        'Topic :: Text Processing :: Linguistic',
    ],
    package_data={'nltk': ['test/*.doctest', 'VERSION']},
    install_requires=[
        'six',
        'singledispatch; python_version < "3.4"'
    ],
    extras_require=extras_require,
    packages=find_packages(),
    ext_modules=ext_modules,
    zip_safe=False,  # since normal files will be present too?
)
Why do the colors disappear for the cythonized build?

How can colors be enabled for the cythonized build?


For some background, the code comes from the nltk library, and there is a full branch with the cythonization tests/builds.

Why do the colors disappear for the cythonized build?

Because the py-cythonized environment does not install nose (which is why the tests are run with the stdlib's unittest in the first place) nor rednose (which colorizes the nose output).
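
To illustrate the effect, here is a minimal, purely hypothetical sketch of a test runner that prefers nose and falls back to the stdlib's unittest when nose is missing. This is not nltk's actual runtests.py; the run_tests helper and its logic are assumptions used only to show why the output loses its colors:

# Hypothetical sketch; not nltk's actual runtests.py.
# Without nose (and its rednose plugin) the tests still run,
# but via the stdlib's unittest, which prints no colored output.
import sys

def run_tests(argv=None):
    argv = list(argv if argv is not None else sys.argv)
    try:
        import nose  # rednose, a nose plugin, provides the colored passed/failed output
    except ImportError:
        # nose is unavailable, as in the py-cythonized env: plain, uncolored unittest
        import unittest
        unittest.main(module=None, argv=argv)
    else:
        nose.main(argv=argv)

if __name__ == "__main__":
    run_tests()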

This happens because the env has overridden deps: py-travis does not declare any deps of its own, so it inherits the deps setting from the global testenv configuration, while py-cythonized needs Cython and therefore redefines the deps list, losing all the packages required to run the tests.
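
A minimal tox.ini sketch of that behaviour, with made-up env names used only for illustration: a deps list defined inside a specific [testenv:...] section completely replaces the inherited [testenv] list instead of extending it, so anything not repeated there is silently dropped:

[testenv]
deps =
    nose >= 1.2.1
    rednose

[testenv:inherits]
; no deps here, so this env gets nose and rednose from [testenv]

[testenv:overrides]
; defining deps replaces the inherited list entirely:
; this env gets only Cython, with no nose and no rednose
deps =
    Cython >= 0.28.5

[testenv:reuses]
; a common tox idiom (not part of the patch below) to keep the
; shared list while adding to it
deps =
    {[testenv]deps}
    Cython >= 0.28.5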

How to enable colors for the cythonized build?

Copy the dependencies from the global testenv into py-cythonized. A suggested patch:

diff --git a/tox.ini b/tox.ini
index a267d9a5a..41740e19b 100644
--- a/tox.ini
+++ b/tox.ini
@@ -133,6 +133,14 @@ commands =
 [testenv:py-cythonized]
 deps =
     Cython >= 0.28.5
+    numpy
+    nose >= 1.2.1
+    coverage
+    text-unidecode
+    twython
+    pyparsing
+    python-crfsuite
+    rednose
 setenv =

Alternatively, declare Cython as a factor-conditional dependency in the global testenv, so that py-cythonized no longer needs a deps override of its own:

diff --git a/tox.ini b/tox.ini
index a267d9a5a..fa0839b96 100644
--- a/tox.ini
+++ b/tox.ini
@@ -26,6 +26,7 @@ deps =
     pyparsing
     python-crfsuite
     rednose
+    py-cythonized: Cython >= 0.28.5

 changedir = nltk/test
 commands =
@@ -131,8 +132,6 @@ commands =

 # Test Cython compiled installation.
 [testenv:py-cythonized]
-deps =
-    Cython >= 0.28.5
 setenv =
     CYTHONIZE_NLTK = true
     NLTK_DATA = {homedir}/nltk_data/
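
With this second patch applied, the shared deps section would look roughly like the abridged sketch below. In tox, a line prefixed with a factor condition such as py-cythonized: is applied only to environments matching those factors, so every other env keeps exactly the dependency list it had before, while py-cythonized now inherits nose and rednose again and additionally gets Cython, which brings the colored test output back:

[testenv]
deps =
    numpy
    nose >= 1.2.1
    coverage
    text-unidecode
    twython
    pyparsing
    python-crfsuite
    rednose
    py-cythonized: Cython >= 0.28.5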