Django Celery task TwythonStreamer SIGSEGV

I have a Django project in which a TwythonStreamer connection is started on a Celery task worker. When the search terms change, the connection is stopped and reloaded. However, in its current state, as well as before the project was updated to Celery 3.1.1, this particular task throws a SIGSEGV when it attempts to run. I can execute the same commands as the task in the Django shell and they work fine:

tu = TwitterUserAccount.objects.first()
stream = NetworkStreamer(settings.TWITTER_CONSUMER_KEY, settings.TWITTER_CONSUMER_SECRET, tu.twitter_access_token, tu.twitter_access_token_secret)
stream.statuses.filter(track='foo,bar')
However, with RabbitMQ/Celery running in another window (inside the project's virtualenv):

celery worker --app=project.app -B -E -l INFO 
and then trying to run:

@task()
def test_network():
  tu = TwitterUserAccount.objects.first()
  stream = NetworkStreamer(settings.TWITTER_CONSUMER_KEY, settings.TWITTER_CONSUMER_SECRET, tu.twitter_access_token, tu.twitter_access_token_secret)
in the Django shell, via:

test_network.apply_async()
the following SIGSEGV error appears in the Celery worker window (when the NetworkStreamer is initialized):

NetworkStreamer is simply a subclass of TwythonStreamer.
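
For context, here is a minimal sketch of what such a subclass usually looks like; the handler bodies below are illustrative assumptions, not the project's actual code:

from twython import TwythonStreamer

class NetworkStreamer(TwythonStreamer):
    # Minimal TwythonStreamer subclass; the handler bodies are assumptions,
    # not the project's actual implementation.
    def on_success(self, data):
        # Called for each streamed status matching the track terms.
        if 'text' in data:
            print(data['text'])

    def on_error(self, status_code, data):
        # Called when Twitter returns an error; disconnect() stops the stream.
        print(status_code)
        self.disconnect()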

I have other Celery tasks that run just fine, in addition to various Celery Beat tasks. djcelery.setup_loader() and the rest of the usual wiring is in place (a sketch of that wiring follows the settings below). I have tried adjusting various settings (thinking it might be a pickling issue), but I am not even passing any arguments. Here is how Celery is set up, named, etc.:

BROKER_URL = 'amqp://'
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERY_RESULT_ENGINE_OPTIONS = {"echo": True}
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_BACKEND = 'amqp'

# Short lived sessions, disabled by default
CELERY_RESULT_PERSISTENT = True
CELERY_RESULT_BACKEND = 'amqp'
CELERY_TASK_RESULT_EXPIRES = 18000  # 5 hours.
CELERY_SEND_TASK_ERROR_EMAILS = True
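
For completeness, a sketch of the standard django-celery 3.1 wiring assumed to be in place in settings.py (not copied from the project):

# settings.py (standard django-celery wiring, assumed rather than quoted)
import djcelery
djcelery.setup_loader()

INSTALLED_APPS = (
    # ...
    'djcelery',
)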
Versions:

  • Python: 2.7.5
  • RabbitMQ: 3.3.4
  • Django==1.6.5
  • amqp==1.4.5
  • billiard==3.3.0.18
  • celery==3.1.12
  • django-celery==3.1.10
  • flower==0.7.0
  • psycopg2==2.5.3
  • pytz==2014.4
  • twython==3.1.2
From the docs:
"This library should be API compatible with librabbitmq." Make sure that is the case.
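
Since a SIGSEGV usually points at a C extension, here is a hedged sanity check of which AMQP client is actually importable in the worker's virtualenv (kombu prefers the librabbitmq C extension if it is installed, otherwise it falls back to the pure-Python amqp library):

# Run inside the project's virtualenv.
try:
    import librabbitmq
    print('librabbitmq', getattr(librabbitmq, '__version__', 'unknown'))
except ImportError:
    print('librabbitmq not installed; kombu will fall back to py-amqp')

import amqp
print('amqp', getattr(amqp, '__version__', 'unknown'))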