Python memory leak using Django Channels, APScheduler and Google oauth2client
My code pulls data every X seconds and pushes it to the front end over a WebSocket. I am using APScheduler to run a tick function at the specified interval. But when I run it, memory usage grows at a fairly steady rate, even when all the function does is authorize the OAuth API. Sample code in consumers.py:
from channels import Group  # Channels 1.x API
from channels.sessions import channel_session
from django.conf import settings
from apscheduler.schedulers.background import BackgroundScheduler
from oauth2client import transport
from apiclient.discovery import build

scheduler = BackgroundScheduler()

def tick(group_id):
    user = GoogleUser.objects.all()[0]
    # gets DjangoORMStorage instance
    credentials = get_credentials(user).get()
    # THESE TWO LINES SEEM TO CAUSE THE MEMORY LEAK
    oauth_http = credentials.authorize(transport.get_http_object())
    analytics = build('analytics', 'v3', http=oauth_http)

@channel_session
def ws_connect(message):
    # accept socket connection and add channel to group
    message.reply_channel.send({"accept": True})
    # add channel to websocket channel group
    redis_group = Group(group_id, channel_layer=None)
    redis_group.add(message.reply_channel)
    # schedule job
    scheduler.add_job(tick, 'interval', id=slug, kwargs={
        'group_id': group_id,
    }, seconds=settings.INTERVAL)
    scheduler.start()
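To confirm which lines are actually responsible for the growth, a `tracemalloc` snapshot diff taken around two runs of `tick()` can rank allocation sites by how much they grew. A minimal, self-contained sketch of the technique (the workload here is simulated with a list of byte buffers; in the real app you would call `tick(group_id)` between the snapshots instead):

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Simulated workload standing in for one tick() run:
# allocate roughly 1 MB that stays referenced.
leaked = [bytes(10_000) for _ in range(100)]

after = tracemalloc.take_snapshot()

# Rank allocation sites by how much memory they gained between snapshots.
stats = after.compare_to(before, 'lineno')
for stat in stats[:3]:
    print(stat)
```

Running this around a few scheduler ticks should show whether the growth really originates in the `authorize`/`build` lines or somewhere else in the stack.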
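A common mitigation, assuming the leak comes from rebuilding the authorized HTTP object and discovery client on every tick, is to construct the service once and reuse it across scheduler runs (with `googleapiclient`, passing `cache_discovery=False` to `build()` is also frequently suggested for a related discovery-cache leak). The caching pattern itself can be sketched generically; `ExpensiveClient` below is a hypothetical stand-in for the `credentials.authorize(...)` / `build(...)` pair, not a real API:

```python
import functools

class ExpensiveClient:
    """Hypothetical stand-in for an authorized Google API client."""
    built = 0  # counts how many times a client was constructed

    def __init__(self):
        ExpensiveClient.built += 1

@functools.lru_cache(maxsize=1)
def get_client():
    # Construct the client once; later calls return the cached instance,
    # so per-tick calls no longer allocate a fresh HTTP stack each time.
    return ExpensiveClient()

def tick():
    client = get_client()  # reused across scheduler runs
    # ... pull analytics data with `client` here ...

tick()
tick()
print(ExpensiveClient.built)  # → 1: constructed once despite two ticks
```

In the Channels setup above this would mean caching the result of the `authorize`/`build` step rather than repeating it inside `tick`; the cache would need to be cleared (`get_client.cache_clear()`) whenever the OAuth credentials are refreshed.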