ModuleNotFoundError after installing a module with Pipenv in a PySpark job

Tags: pyspark, pipenv, pipenv-install

I want to run a PySpark job in my local environment.

After successfully setting up pipenv and installing a module (numpy), my code still cannot see the module.

Installing the library with pip instead of pipenv works. What am I missing?

The terminal output is shown below:

PS C:\Users\user\Desktop\spark\test> pipenv shell
Shell for C:\Users\user\.virtualenvs\test-sCQB0P3C already activated.
No action taken to avoid nested environments.

PS C:\Users\user\Desktop\spark\test> pipenv graph
numpy==1.20.3
pipenv==2020.11.15
  - certifi [required: Any, installed: 2020.12.5]
  - pip [required: >=18.0, installed: 21.1.1]
  - setuptools [required: >=36.2.1, installed: 56.0.0]
  - virtualenv [required: Any, installed: 20.4.6]
    - appdirs [required: >=1.4.3,<2, installed: 1.4.4]
    - distlib [required: >=0.3.1,<1, installed: 0.3.1]
    - filelock [required: >=3.0.0,<4, installed: 3.0.12]
    - six [required: >=1.9.0,<2, installed: 1.16.0]
  - virtualenv-clone [required: >=0.2.5, installed: 0.5.4]
pyspark==2.4.0
  - py4j [required: ==0.10.7, installed: 0.10.7]

PS C:\Users\user\Desktop\spark\test> spark-submit --master local[*] --files configs\etl_config.json jobs\etl_job.py

Traceback (most recent call last):
  File "C:/Users/user/Desktop/spark/test/jobs/etl_job.py", line 40, in <module>
    from dependencies.class import XLoader
  File "C:\Users\user\Desktop\spark\test\dependencies\X.py", line 2, in <module>
    import numpy as np
ModuleNotFoundError: No module named 'numpy'
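A useful first diagnostic (not part of the original post, just a sketch) is to print which interpreter `spark-submit` actually launched. If packages installed with the global `pip` are found but pipenv-installed ones are not, the job is likely running under the system Python rather than the interpreter in `C:\Users\user\.virtualenvs\test-sCQB0P3C`. A couple of lines at the top of `etl_job.py` make this visible:

```python
import sys
import importlib.util

# Which python.exe is running this job? If this path is not inside the
# pipenv virtualenv, pipenv-installed packages will not be on sys.path.
print("interpreter:", sys.executable)

# Check importability without raising ModuleNotFoundError.
print("numpy found:", importlib.util.find_spec("numpy") is not None)
```

If the printed interpreter path is outside the virtualenv, the problem is environment selection, not the numpy install itself.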
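A common explanation for this symptom (my assumption, not confirmed in the post) is that `spark-submit` starts whichever `python.exe` is first on `PATH`, ignoring the active pipenv shell. Spark honors the `PYSPARK_PYTHON` environment variable, so pointing it at the virtualenv interpreter before submitting may fix it. A PowerShell sketch, where the `Scripts\python.exe` suffix under the venv path shown above is assumed:

```shell
# PowerShell: get the venv path with `pipenv --venv`, then point Spark at it
$env:PYSPARK_PYTHON = "C:\Users\user\.virtualenvs\test-sCQB0P3C\Scripts\python.exe"
spark-submit --master local[*] --files configs\etl_config.json jobs\etl_job.py
```

Alternatively, running the whole command inside the environment via `pipenv run spark-submit ...` achieves the same effect if `spark-submit` resolves Python from the current environment.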