Python: how do I fix beam.smp memory usage?

I am using Celery with RabbitMQ to run background jobs. While the program runs, memory consumption keeps growing, and after a while the following error is shown:

[2014-08-06 05:17:21,036: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
Trying again in 6.00 seconds...
So I tried resetting RabbitMQ with the following commands:

sudo rabbitmqctl stop_app
sudo rabbitmqctl reset
sudo rabbitmqctl start_app
The output of

rabbitmqctl list_queues

shows a large number of queues (I have listed only some of them).
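If results are kept on the broker (the amqp result backend), every task can leave behind its own result queue until it expires, which would explain such a long list. As a rough check, one of those queues can be inspected from Python through the connection Celery already manages; this is only a sketch, assuming a local broker, and the queue name below is a placeholder rather than one taken from the real output:

from celery import Celery

app = Celery('proj', broker='amqp://guest:guest@127.0.0.1:5672//')

with app.connection_or_acquire() as conn:
    # passive=True only checks the queue; it is not created if it does not exist.
    info = conn.default_channel.queue_declare(queue='leftover-queue-name',
                                              passive=True)
    print(info.queue, info.message_count, info.consumer_count)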

I tried the following settings in my Celery configuration:

app.conf.update(
    task_ignore_result=True,
    task_store_errors_even_if_ignored=False,
    worker_enable_remote_control=False,
    accept_content=['json'],
    task_serializer='json',
    result_serializer='json',
    result_persistent=False,
    result_expires=60,
)
However, I am still facing the same problem.

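For reference, a minimal sketch of how settings like these are typically attached to the app, assuming a local RabbitMQ broker; the module name and the example task are illustrative, not taken from the real project:

from celery import Celery

app = Celery('proj', broker='amqp://guest:guest@127.0.0.1:5672//')
app.conf.update(task_ignore_result=True, result_expires=60)  # plus the remaining settings shown above

@app.task
def add(x, y):
    # With task_ignore_result=True the return value is never sent to the
    # result backend, so no per-result queue should be left behind.
    return x + y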