How can I run a Python script in .ebextensions?
My .ebextensions/00.commands.config looks like this:
container_commands:
  00_download_models:
    command: "./download.py"
My download.py contains:
#!/usr/bin/env python3
print('now')
But in /var/log/cfn-init.log I see:
2020-06-25 17:19:34,933 [ERROR] -----------------------BUILD FAILED!------------------------
2020-06-25 17:19:34,933 [ERROR] Unhandled exception during build: Command 00_download_models failed
Traceback (most recent call last):
  File "/opt/aws/bin/cfn-init", line 171, in <module>
    worklog.build(metadata, configSets)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 129, in build
    Contractor(metadata).build(configSets, self)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 530, in build
    self.run_config(config, worklog)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 542, in run_config
    CloudFormationCarpenter(config, self._auth_config).build(worklog)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 260, in build
    changes['commands'] = CommandTool().apply(self._config.commands)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/command_tool.py", line 117, in apply
    raise ToolError(u"Command %s failed" % name)
ToolError: Command 00_download_models failed
It looks straightforward, so I don't know what I'm doing wrong.

Based on what you described, there is nothing wrong with your script or your approach. I verified it in my own EB environment (64bit Amazon Linux 2 v3.0.3 running Python 3.7; single instance). To confirm, I used the following .ebextensions/50_commands.config:
container_commands:
  00_download_models:
    command: "./download.py"
./download.py is located in the root of my zip bundle (not inside the .ebextensions folder). Before creating the EB environment, I also made sure ./download.py had execute permission by running the following on my local workstation:
chmod +x ./download.py
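If you are unsure whether that execute bit actually survived into the deployment bundle, you can inspect the zip directly. A minimal sketch (is_executable_in_zip is my own helper name, not part of any EB tooling), relying on the fact that zip entries store the Unix mode in the upper 16 bits of external_attr:

```python
import stat
import zipfile

def is_executable_in_zip(bundle, member):
    """Return True if `member` inside the zip `bundle` has its owner execute bit set."""
    with zipfile.ZipFile(bundle) as zf:
        info = zf.getinfo(member)
        mode = info.external_attr >> 16  # upper 16 bits hold the Unix st_mode
        return bool(mode & stat.S_IXUSR)
```

For example, is_executable_in_zip('bundle.zip', 'download.py') should be True after the chmod above, and False if you zipped the file before running it.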
My download.py is:
#!/usr/bin/env python3
import datetime

import torch
import torch.nn.functional as F
from transformers import (
    CTRLLMHeadModel,
    CTRLTokenizer,
    GPT2LMHeadModel,
    GPT2Tokenizer,
    TransfoXLLMHeadModel,
    TransfoXLTokenizer,
    XLMTokenizer,
    XLMWithLMHeadModel,
    XLNetLMHeadModel,
    XLNetTokenizer,
)

if __name__ == "__main__":
    with open('/tmp/test.txt', 'w') as f:
        # f.write() takes a single string, so format the timestamp into it
        f.write('Starting download %s\n' % datetime.datetime.now().time())
        GPT2LMHeadModel.from_pretrained('distilgpt2')
        f.write('DistilModel %s\n' % datetime.datetime.now().time())
        GPT2LMHeadModel.from_pretrained('gpt2-xl')
        f.write('GPT2-XL %s\n' % datetime.datetime.now().time())
        GPT2LMHeadModel.from_pretrained('gpt2-medium')
        f.write('GPT2-medium %s\n' % datetime.datetime.now().time())
        CTRLLMHeadModel.from_pretrained('ctrl')
        f.write('CTRL %s\n' % datetime.datetime.now().time())
        GPT2Tokenizer.from_pretrained('distilgpt2')
        GPT2Tokenizer.from_pretrained('gpt2-xl')
        GPT2Tokenizer.from_pretrained('gpt2-medium')
        CTRLTokenizer.from_pretrained('ctrl')
        f.write('Finished download\n')
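One caveat with a long-running script like this: the timestamps only reach /tmp/test.txt when Python flushes its buffer, so if cfn-init kills the command partway through a download, the file may look empty. A small sketch of a workaround (log_step is a hypothetical helper name, not from the original post) that flushes after every line:

```python
import datetime

def log_step(f, label):
    # Write one timestamped line and flush immediately, so the log
    # survives even if the process is killed before the file closes.
    f.write('%s %s\n' % (label, datetime.datetime.now().time()))
    f.flush()
```

In the script above you would call log_step(f, 'Starting download'), log_step(f, 'GPT2-XL'), and so on before and after each from_pretrained call.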
Could you try changing it to command: "./download.py" or command: "python3 download.py"?
Same problem with command: "./download.py".
Try specifying the full path to download.py. How do you install download.py? Are you using a files resource in cfn-init?
Looks like I missed the chmod step. Thanks!
Actually, it also seems to fail if download.py does a lot of work.
@Shamoon do you have an example of a "large" download.py? I could try to reproduce the problem in my EB environment with its contents.
@Shamoon is torch the only prerequisite I would need to run it?
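To test the "fails when download.py does a lot of work" theory without pulling multi-gigabyte models, a stand-in script that just burns time should reproduce any timeout. This is my own experiment sketch (pretend_download is a hypothetical helper, not from the thread):

```python
import time

def pretend_download(steps, seconds_per_step):
    # Stand-in for downloading `steps` models at `seconds_per_step` each,
    # printing progress so partial output shows where a kill happened.
    for step in range(steps):
        time.sleep(seconds_per_step)
        print('finished step', step, flush=True)
```

Calling pretend_download(10, 60) from a container_command would show whether ten minutes of "work" alone is enough to make the command fail, independent of torch and transformers.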