SEO: why do websites use this type of robots.txt?
Hi everyone. I have browsed many websites, and on one of them I found a different kind of robots.txt. Can anyone explain why this type of robots.txt is used, and what its advantages and disadvantages are?
User-Agent: Googlebot-Images
Disallow:
User-agent: OmniExplorer_Bot
Disallow:
User-agent: FreeFind
Disallow:
User-agent: BecomeBot
Disallow:
User-agent: Nutch
Disallow:
User-agent: Jetbot/1.0
Disallow:
User-agent: Jetbot
Disallow:
User-agent: WebVac
Disallow:
User-agent: Stanford
Disallow:
User-agent: naver
Disallow:
User-agent: dumbot
Disallow:
User-agent: Hatena Antenna
Disallow:
User-agent: grub-client
Disallow:
User-agent: grub
Disallow:
User-agent: looksmart
Disallow:
User-agent: WebZip
Disallow:
User-agent: larbin
Disallow:
User-agent: b2w/0.1
Disallow:
User-agent: Copernic
Disallow:
User-agent: psbot
Disallow:
User-agent: Python-urllib
Disallow:
User-agent: NetMechanic
Disallow:
User-agent: URL_Spider_Pro
Disallow:
User-agent: CherryPicker
Disallow:
User-agent: EmailCollector
Disallow:
User-agent: EmailSiphon
Disallow:
User-agent: WebBandit
Disallow:
User-agent: EmailWolf
Disallow:
User-agent: ExtractorPro
Disallow:
User-agent: CopyRightCheck
Disallow:
User-agent: Crescent
Disallow:
User-agent: SiteSnagger
Disallow:
User-agent: ProWebWalker
Disallow:
User-agent: CheeseBot
Disallow:
User-agent: LNSpiderguy
Disallow:
User-agent: Mozilla
Disallow:
User-agent: mozilla
Disallow:
User-agent: mozilla/3
Disallow:
User-agent: mozilla/4
Disallow:
User-agent: mozilla/5
Disallow:
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT)
Disallow:
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95)
Disallow:
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98)
Disallow:
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP)
Disallow:
User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000)
Disallow:
User-agent: ia_archiver
Disallow:
User-agent: ia_archiver/1.6
Disallow:
User-agent: Alexibot
Disallow:
User-agent: Teleport
Disallow:
User-agent: TeleportPro
Disallow:
User-agent: Stanford Comp Sci
Disallow:
User-agent: MIIxpc
Disallow:
User-agent: Telesoft
Disallow:
User-agent: Website Quester
Disallow:
User-agent: moget/2.1
Disallow:
User-agent: WebZip/4.0
Disallow:
User-agent: WebStripper
Disallow:
User-agent: WebSauger
Disallow:
User-agent: WebCopier
Disallow:
User-agent: NetAnts
Disallow:
User-agent: Mister PiX
Disallow:
User-agent: WebAuto
Disallow:
User-agent: TheNomad
Disallow:
User-agent: WWW-Collector-E
Disallow:
User-agent: RMA
Disallow:
User-agent: libWeb/clsHTTP
Disallow:
User-agent: asterias
Disallow:
User-agent: httplib
Disallow:
User-agent: turingos
Disallow:
User-agent: spanner
Disallow:
User-agent: InfoNaviRobot
Disallow:
User-agent: Harvest/1.5
Disallow:
User-agent: Bullseye/1.0
Disallow:
User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow:
User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow:
User-agent: CherryPickerSE/1.0
Disallow:
User-agent: CherryPickerElite/1.0
Disallow:
User-agent: WebBandit/3.50
Disallow:
User-agent: NICErsPRO
Disallow:
User-agent: Microsoft URL Control - 5.01.4511
Disallow:
User-agent: DittoSpyder
Disallow:
User-agent: Foobot
Disallow:
User-agent: WebmasterWorldForumBot
Disallow:
User-agent: SpankBot
Disallow:
User-agent: BotALot
Disallow:
User-agent: lwp-trivial/1.34
Disallow:
User-agent: lwp-trivial
Disallow:
User-agent: http://www.WebmasterWorld.com bot
Disallow:
User-agent: BunnySlippers
Disallow:
User-agent: Microsoft URL Control - 6.00.8169
Disallow:
User-agent: URLy Warning
Disallow:
User-agent: Wget/1.6
Disallow:
User-agent: Wget/1.5.3
Disallow:
User-agent: Wget
Disallow:
User-agent: LinkWalker
Disallow:
User-agent: cosmos
Disallow:
User-agent: moget
Disallow:
User-agent: hloader
Disallow:
User-agent: humanlinks
Disallow:
User-agent: LinkextractorPro
Disallow:
User-agent: Offline Explorer
Disallow:
User-agent: Mata Hari
Disallow:
User-agent: LexiBot
Disallow:
User-agent: Web Image Collector
Disallow:
User-agent: The Intraformant
Disallow:
User-agent: True_Robot/1.0
Disallow:
User-agent: True_Robot
Disallow:
User-agent: BlowFish/1.0
Disallow:
User-agent: http://www.SearchEngineWorld.com bot
Disallow:
User-agent: http://www.WebmasterWorld.com bot
Disallow:
User-agent: JennyBot
Disallow:
User-agent: MIIxpc/4.2
Disallow:
User-agent: BuiltBotTough
Disallow:
User-agent: ProPowerBot/2.14
Disallow:
User-agent: BackDoorBot/1.0
Disallow:
User-agent: toCrawl/UrlDispatcher
Disallow:
User-agent: WebEnhancer
Disallow:
User-agent: suzuran
Disallow:
User-agent: VCI WebViewer VCI WebViewer Win32
Disallow:
User-agent: VCI
Disallow:
User-agent: Szukacz/1.4
Disallow:
User-agent: QueryN Metasearch
Disallow:
User-agent: Openfind data gathere
Disallow:
User-agent: Openfind
Disallow:
User-agent: Xenu's Link Sleuth 1.1c
Disallow:
User-agent: Xenu's
Disallow:
User-agent: Zeus
Disallow:
User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow:
User-agent: RepoMonkey
Disallow:
User-agent: Microsoft URL Control
Disallow:
User-agent: Openbot
Disallow:
User-agent: URL Control
Disallow:
User-agent: Zeus Link Scout
Disallow:
User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow:
User-agent: Webster Pro
Disallow:
User-agent: EroCrawler
Disallow:
User-agent: LinkScan/8.1a Unix
Disallow:
User-agent: Keyword Density/0.9
Disallow:
User-agent: Kenjin Spider
Disallow:
User-agent: Iron33/1.0.2
Disallow:
User-agent: Bookmark search tool
Disallow:
User-agent: GetRight/4.2
Disallow:
User-agent: FairAd Client
Disallow:
User-agent: Gaisbot
Disallow:
User-agent: Aqua_Products
Disallow:
User-agent: Radiation Retriever 1.1
Disallow:
User-agent: WebmasterWorld Extractor
Disallow:
User-agent: Flaming AttackBot
Disallow:
User-agent: Oracle Ultra Search
Disallow:
User-agent: MSIECrawler
Disallow:
User-agent: PerMan
Disallow:
User-agent: searchpreview
Disallow:
User-agent: sootle
Disallow:
User-agent: es
Disallow:
User-agent: Enterprise_Search/1.0
Disallow:
User-agent: Enterprise_Search
Disallow:
As it stands, the file does nothing useful. In effect, it is equivalent to:

User-agent: *
Disallow:

An empty
Disallow:
means "allow everything", so you would get the same effect with an empty robots.txt file, or with no robots.txt file at all. If you intend to let every crawler access everything, there is no need to list individual user agents.

I can only speculate about why this file exists. Perhaps it originally had:

User-agent: *
Disallow: /

as its last two lines (blocking any crawler not explicitly listed above), and someone later removed those two lines to allow everything. Or perhaps whoever wrote the file mistakenly believed that an empty
Disallow:
means "block everything".
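The claim above (an empty Disallow allows everything, while "Disallow: /" blocks everything) can be checked with Python's standard-library robots.txt parser. This is just an illustrative sketch; the BecomeBot record is taken from the file in the question, and the example.com URL is hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A record with an empty Disallow, as in the file in the question.
allow_all = RobotFileParser()
allow_all.parse([
    "User-agent: BecomeBot",
    "Disallow:",
])

# An empty Disallow imposes no restriction: everything is allowed.
print(allow_all.can_fetch("BecomeBot", "http://example.com/any/page"))  # True

# Compare with the ending the answer speculates was originally there:
block_rest = RobotFileParser()
block_rest.parse([
    "User-agent: *",
    "Disallow: /",
])

# "Disallow: /" blocks everything for crawlers matched by this record.
print(block_rest.can_fetch("SomeOtherBot", "http://example.com/any/page"))  # False
```

With that final wildcard record in place, every crawler not named earlier in the file would have been blocked, which would explain why so many individual user agents were listed with empty Disallow lines: they were the exceptions being let through.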