
Unix: mirror an entire website and save the links in a txt file


Is it possible to use wget mirror to save all the links of an entire website and store them in a txt file?

If it is possible, how? If not, is there another way to do this?

Edit:

I tried running this:

wget -r --spider example.com
and got this result:

Spider mode enabled. Check if remote file exists.
--2015-10-03 21:11:54--  http://example.com/
Resolving example.com... 93.184.216.34, 2606:2800:220:1:248:1893:25c8:1946
Connecting to example.com|93.184.216.34|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1270 (1.2K) [text/html]
Remote file exists and could contain links to other resources -- retrieving.

--2015-10-03 21:11:54--  http://example.com/
Reusing existing connection to example.com:80.
HTTP request sent, awaiting response... 200 OK
Length: 1270 (1.2K) [text/html]
Saving to: 'example.com/index.html'

100%[=====================================================================================================>] 1,270       --.-K/s   in 0s      

2015-10-03 21:11:54 (93.2 MB/s) - 'example.com/index.html' saved [1270/1270]

Removing example.com/index.html.

Found no broken links.

FINISHED --2015-10-03 21:11:54--
Total wall clock time: 0.3s
Downloaded: 1 files, 1.2K in 0s (93.2 MB/s)

(Yes, I also tried using other websites with more internal links)
Yes, using wget's --spider option. A command like:

wget -r --spider example.com

will spider all links down to a depth of 5 (the default). You can then capture the output to a file, cleaning it up as you go. Something like:

wget -r --spider example.com 2>&1 | grep "http://" | cut -f 4 -d " " >> weblinks.txt
will put only the links into the weblinks.txt file (you may have to tweak that command a little if your version of wget produces slightly different output).

Comments on this answer:

- "OK, thanks. I tried copying the script you wrote, but it didn't work out of the box. It created a weblinks.txt file, but it only saved text into the .txt file (I also tried other websites). Maybe I need to tweak it; the problem is I don't know how."
- "Can you run the first command and see what output it gives? Note that the only way it can know which other pages exist is by following the links on the page you give it. If there are no links to other pages, it won't find anything else. It's hard to add detail in these comments, so you may find it easier to update your question with details of what you have tried."
- "Yes, that is how it is supposed to work. The actual site example.com has no internal links, so it only returns itself. Try a site that has links to other pages within the site and you will get more. Do you also want links to external sites? If so, @Randomazer's python script may be a better option. In fact, there is a question similar to yours which may be useful."
- "Thanks a lot! That helped!"
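As the comments note, the cut-based pipeline is sensitive to the exact layout of wget's log lines, which differs between versions. A variant that matches the URLs themselves rather than cutting on field position is less fragile; a minimal sketch, not from the original thread:

# extract every URL wget mentions in its log, deduplicated
wget -r --spider example.com 2>&1 | grep -oE 'https?://[^ ]+' | sort -u > weblinks.txt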

Or with python, for example:

import re
from urllib.request import urlopen

def do_page(url):
    # fetch the page and decode the bytes to text
    html = urlopen(url).read().decode('utf-8', errors='replace')
    # match quoted same-site links ending in .html; the non-greedy .*?
    # keeps each match to a single link, and re.escape treats the dots
    # in the URL as literal characters
    pattern = r"'{}.*?\.html'".format(re.escape(url))
    return re.findall(pattern, html)

if __name__ == '__main__':
    url = 'http://thehackernews.com/'
    hits = do_page(url)
    with open('links.txt', 'w') as f1:
        for hit in hits:
            f1.write(hit + '\n')
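If the script is saved as, say, links.py (a hypothetical filename), it can be run and its results checked from the shell; a small usage sketch:

# run the scraper, then count the unique links it wrote
python3 links.py && sort -u links.txt | wc -l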
Output:
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/adblock-extension.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/data-breach-hacking.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/howto-Freeze-Credit-Report.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/experian-tmobile-hack.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/p/authors.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/09/digital-india-facebook.html'
'http://thehackernews.com/2015/09/digital-india-facebook.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/10/buy-google-domain.html'
'http://thehackernews.com/2015/09/winrar-vulnerability.html'
'http://thehackernews.com/2015/09/winrar-vulnerability.html'
'http://thehackernews.com/2015/09/chip-mini-computer.html'
'http://thehackernews.com/2015/09/chip-mini-computer.html'
'http://thehackernews.com/2015/09/edward-snowden-twitter.html'
'http://thehackernews.com/2015/09/edward-snowden-twitter.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/10/android-stagefright-vulnerability.html'
'http://thehackernews.com/2015/09/quantum-teleportation-data.html'
'http://thehackernews.com/2015/09/quantum-teleportation-data.html'
'http://thehackernews.com/2015/09/iOS-lockscreen-hack.html'
'http://thehackernews.com/2015/09/iOS-lockscreen-hack.html'
'http://thehackernews.com/2015/09/xor-ddos-attack.html'
'http://thehackernews.com/2015/09/xor-ddos-attack.html'
'http://thehackernews.com/2015/09/truecrypt-encryption-software.html'
'http://thehackernews.com/2015/09/truecrypt-encryption-software.html'
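On the question raised in the comments about links to external sites: wget's recursive crawl normally stays on the host it started from, so off-site URLs are not followed. The --span-hosts (-H) option lifts that restriction, and -l caps the recursion depth so the crawl stays bounded. A hedged sketch combining them with the extraction pipeline above (spanning hosts can make the crawl very large, so keep the depth small):

# spider to depth 2, following links onto other hosts as well
wget -r -l 2 --span-hosts --spider example.com 2>&1 | grep -oE 'https?://[^ ]+' | sort -u > weblinks.txt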