
PyCharm overrides the PYTHONPATH of a Docker container used as the interpreter


I have a Docker image containing various bits and pieces, including Spark. Here is my Dockerfile:

FROM docker-dev.artifactory.company.com/centos:7.3.1611

# set proxy
ENV http_proxy http://proxyaddr.co.uk:8080
ENV HTTPS_PROXY http://proxyaddr.co.uk:8080
ENV https_proxy http://proxyaddr.co.uk:8080

RUN yum install -y epel-release
RUN yum install -y gcc
RUN yum install -y krb5-devel
RUN yum install -y python-devel
RUN yum install -y krb5-workstation
RUN yum install -y python-setuptools
RUN yum install -y python-pip
RUN yum install -y xmlstarlet
RUN yum install -y wget java-1.8.0-openjdk
RUN pip install kerberos
RUN pip install numpy
RUN pip install pandas
RUN pip install coverage
RUN pip install tensorflow
RUN wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
RUN tar -xvzf spark-1.6.0-bin-hadoop2.6.tgz -C /opt
RUN ln -s spark-1.6.0-bin-hadoop2.6 /opt/spark


ENV VERSION_NUMBER $(cat VERSION)
ENV JAVA_HOME /etc/alternatives/jre/
ENV SPARK_HOME /opt/spark
# make pyspark and its bundled py4j importable
ENV PYTHONPATH $SPARK_HOME/python/:$PYTHONPATH
ENV PYTHONPATH $SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
I can build and run the Docker image, connect to it, and successfully import the pyspark library:

$ docker run -d -it sse_spark_build:1.0
09e8aac622d7500e147a6e6db69f806fe093b0399b98605c5da2ff5e0feca07c
$ docker exec -it 09e8aac622d7 python
Python 2.7.5 (default, Nov  6 2016, 00:28:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from pyspark import SparkContext
>>> import os
>>> os.environ['PYTHONPATH']
'/opt/spark/python/lib/py4j-0.9-src.zip:/opt/spark/python/:'
>>>
Note the value of PYTHONPATH.
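
A quick check inside the same container confirms that the PYTHONPATH entries actually land on sys.path at interpreter startup (a minimal sketch, run against the image built above):

import sys
# entries supplied via PYTHONPATH are placed near the front of sys.path
print([p for p in sys.path if 'spark' in p])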

The problem is that the behaviour is different in PyCharm when I use that same Docker image as the interpreter. Here is how I have the interpreter set up (the original question showed a screenshot of the interpreter settings here).

If I run a Python console in PyCharm, this is what happens:

bec0b9189066:python /opt/.pycharm_helpers/pydev/pydevconsole.py 0 0
PyDev console: starting.
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark', '/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark'])
Python 2.7.5 (default, Nov  6 2016, 00:28:07) 
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
import os
os.environ['PYTHONPATH']
'/opt/.pycharm_helpers/pydev'
As you can see, PyCharm has changed PYTHONPATH, which means I can no longer import the pyspark library I want to use:

from pyspark import SparkContext
Traceback (most recent call last):
  File "<input>", line 1, in <module>
ImportError: No module named pyspark

I can work around this by appending the Spark paths to sys.path by hand (the snippet is reproduced at the end of this page), but having to do that every time I open a console is tedious. I can't believe there is no way to tell PyCharm to append to PYTHONPATH rather than overwrite it, but if there is, I can't find it. Can anyone advise how I can use a Docker image as a remote interpreter in PyCharm and keep the value of PYTHONPATH?

You can set this in Preferences (the answer's screenshot of the relevant settings page is not reproduced here).

You can either set an environment variable there or update the starting-script section; whichever suits you better, both will do the job.
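
For instance, a starting script along these lines (a minimal sketch, assuming the Spark paths from the Dockerfile above; in recent PyCharm versions the field lives under Settings > Build, Execution, Deployment > Console > Python Console) would restore the entries the console drops:

import sys
# re-add the Spark paths on every console start
for p in ('/opt/spark/python/', '/opt/spark/python/lib/py4j-0.9-src.zip'):
    if p not in sys.path:  # avoid duplicates if the console reconnects
        sys.path.append(p)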

If you need further help, read the linked article.

How did you do this for a console run by pytest? I have tried adding PYTHONPATH in the environment-variables slot; it works for the console but not for pytest.

Have you tried running

PYTHONPATH=... pytest ...

i.e. setting the environment variable as part of the command?

I got it working by adding the same configuration to the default pytest run configuration. Having to copy-paste the configuration everywhere when everything is already defined at the interpreter level seems backwards to me (see also the conftest.py sketch at the end of this page).
The sys.path workaround referred to in the question, pasted into each new console session:

import sys
# manually re-add the Spark paths that PyCharm's console dropped
sys.path.append('/opt/spark/python/')
sys.path.append('/opt/spark/python/lib/py4j-0.9-src.zip')
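
For the pytest case raised in the comments, one alternative to per-run-configuration environment variables is a conftest.py at the project root: pytest imports it before collecting tests, so the paths are in place without touching any run configuration. A minimal sketch, assuming the same Spark layout as above:

# conftest.py (project root); pytest imports this before test collection
import sys

for p in ('/opt/spark/python/', '/opt/spark/python/lib/py4j-0.9-src.zip'):
    if p not in sys.path:
        sys.path.append(p)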