
Django search form: only run the query when the form is submitted


This is my view:

from django.views.generic import ListView

# Assuming app-local modules; adjust the import paths to your project.
from .forms import SearchForm
from .models import News


class SingleNewsView(ListView):
    model = News
    form_class = SearchForm
    template_name = "single_news.html"

    def get(self, request, pk, **kwargs):
        self.pk = pk
        self.pub_from = request.GET.get('pub_date_from', False)
        self.pub_to = request.GET.get('pub_date_to', False)
        self.crawlers = request.GET.get('crawler', False)

        print self.crawlers

        return super(SingleNewsView, self).get(request, pk, **kwargs)

    def get_context_data(self, **kwargs):
        context = super(SingleNewsView, self).get_context_data(**kwargs)
        context["form"] = SearchForm  # (self.request.GET)
        context["something"] = News.objects.filter(category_id=self.pk).filter(
            published_date__range=(self.pub_from, self.pub_to),
            crawler=self.crawlers,
        )
        return context

With this, when I first open the page no news is shown, because no from/to data has been supplied and the range filter matches nothing. I want all the news to be displayed initially, and the filtering to happen only when the user submits the form. How can I do that?
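For context, the question never shows SearchForm itself. A minimal sketch of what it might look like, with field names assumed to mirror the GET parameters the view reads (pub_date_from, pub_date_to, crawler):

from django import forms


class SearchForm(forms.Form):
    # Field names assumed from the GET parameters read in the view.
    pub_date_from = forms.DateField(required=False)
    pub_date_to = forms.DateField(required=False)
    crawler = forms.CharField(required=False)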

You should add a
get_queryset()
method instead of doing the filtering in
get_context_data()
. You could add a method like this:

def get_queryset(self):
    qs = News.objects.filter(category_id=self.pk)
    # You can change this to support just one of pub_from or pub_to.
    if self.pub_from and self.pub_to:
        qs = qs.filter(published_date__range=(self.pub_from, self.pub_to))
    if self.crawlers:
        qs = qs.filter(crawler=self.crawlers)
    return qs
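Putting it together, the whole view could look like the sketch below (same assumed import paths as above; binding the form to self.request.GET is an addition so that submitted values are redisplayed after filtering):

from django.views.generic import ListView

from .forms import SearchForm  # assumed import paths
from .models import News


class SingleNewsView(ListView):
    model = News
    template_name = "single_news.html"

    def get(self, request, pk, **kwargs):
        self.pk = pk
        self.pub_from = request.GET.get('pub_date_from', False)
        self.pub_to = request.GET.get('pub_date_to', False)
        self.crawlers = request.GET.get('crawler', False)
        return super(SingleNewsView, self).get(request, pk, **kwargs)

    def get_queryset(self):
        # Unfiltered category listing by default; narrow the queryset
        # only when the form actually supplied values.
        qs = News.objects.filter(category_id=self.pk)
        if self.pub_from and self.pub_to:
            qs = qs.filter(published_date__range=(self.pub_from, self.pub_to))
        if self.crawlers:
            qs = qs.filter(crawler=self.crawlers)
        return qs

    def get_context_data(self, **kwargs):
        context = super(SingleNewsView, self).get_context_data(**kwargs)
        # Bind the form to the GET data so submitted values survive a reload.
        context["form"] = SearchForm(self.request.GET or None)
        return context

With the filtering moved into get_queryset(), the template can simply iterate over object_list (or news_list), and the context["something"] workaround is no longer needed: an initial GET with no parameters leaves all three attributes False, so no filter is applied and every news item in the category is shown.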