Why does Node.js / Express request robots.txt with every request?


I'm working through the "Getting started" tutorial from expressjs.com.

I'm running my app as follows (on Windows):
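Judging from the express:* debug namespaces in the output below, the app was started with Express's debug logging enabled; on a Windows command prompt that would look something like:

  REM Enable Express's internal debug output, then start the app
  set DEBUG=express:* & npm start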

When I make a request to the server (in this case, http://localhost:3000/), I see the following in the console:

  express:router dispatching GET / +24s
  express:router query  : / +3ms
  express:router expressInit  : / +2ms
  express:router logger  : / +3ms
  express:router jsonParser  : / +3ms
  express:router urlencodedParser  : / +4ms
  express:router cookieParser  : / +3ms
  express:router serveStatic  : / +1ms
  express:router router  : / +5ms
  express:router dispatching GET / +2ms
  express:view require "jade" +3ms
  express:view lookup "index.jade" +467ms
  express:view stat "C:\mysites\app\views\index.jade" +3ms
  express:view render "C:\mysites\app\views\index.jade" +3ms
GET / 304 545.682 ms - -
  express:router dispatching GET /robots.txt +58ms
  express:router query  : /robots.txt +2ms
  express:router expressInit  : /robots.txt +3ms
  express:router logger  : /robots.txt +3ms
  express:router jsonParser  : /robots.txt +3ms
  express:router urlencodedParser  : /robots.txt +8ms
  express:router cookieParser  : /robots.txt +1ms
  express:router serveStatic  : /robots.txt +2ms
  express:router dispatching GET /stylesheets/style.css +19ms
  express:router query  : /stylesheets/style.css +2ms
  express:router expressInit  : /stylesheets/style.css +1ms
  express:router logger  : /stylesheets/style.css +4ms
  express:router jsonParser  : /stylesheets/style.css +7ms
  express:router urlencodedParser  : /stylesheets/style.css +4ms
  express:router cookieParser  : /stylesheets/style.css +2ms
  express:router serveStatic  : /stylesheets/style.css +2ms
  express:router router  : /robots.txt +1ms
  express:router dispatching GET /robots.txt +1ms
GET /robots.txt 304 55.631 ms - -
I'm trying to figure out what triggers a GET request for robots.txt every time I request any page or resource. I don't believe it's the browser, and it isn't referenced anywhere in the rendered page:

<!DOCTYPE html><html><head><title>Express</title><link rel="stylesheet" href="/stylesheets/style.css"></head><body><h1>Express</h1><p>Welcome to Express</p></body></html>

This has nothing to do with Node.js or Express. I had recently installed the Wappalyzer Chrome extension, and this was the first localhost site I'd worked on since then. When I turn Wappalyzer off, I no longer see the requests for robots.txt. It makes sense that Wappalyzer would issue this request, and it doesn't show up as a normal request in the Chrome debugger.


Others may run into the same confusion when debugging an Express app.
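As an aside, if you'd rather have these probe requests answered cleanly instead of falling through to the router, you can either drop a robots.txt file into the public/ folder that express.static already serves, or answer the route explicitly. A minimal sketch (not part of the generated app, just an illustration):

  const express = require('express');
  const app = express();

  // Answer /robots.txt explicitly so probes from crawlers or browser
  // extensions get a plain-text 200 instead of hitting the view router.
  app.get('/robots.txt', (req, res) => {
    res.type('text/plain');
    res.send('User-agent: *\nDisallow:');
  });

  app.listen(3000);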

For reference, the app's package.json:
{
  "name": "app",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "cookie-parser": "~1.4.3",
    "debug": "~2.6.9",
    "express": "~4.16.0",
    "http-errors": "~1.6.2",
    "jade": "~1.11.0",
    "morgan": "~1.9.0"
  }
}