
Python 3.x: Python function doesn't return anything

Tags: python-3.x, function, beautifulsoup

I am writing a function that should be able to pull the links attached to the people's names contained in an HTML file. I want to warn you up front: I know it's ugly, I know it's long, and I know this is certainly not the most elegant way to do it. But it's the best I could come up with, since I'm a beginner in Python.

Here is my code:

file = 'C:/users/me/webpage.html'

def get_links(file):
    import codecs
    from bs4 import BeautifulSoup as bs
    f = codecs.open(file, 'r', 'utf-8')
    soup = bs(f.read(),'lxml')
    link_list = []
    A_list = soup.find(string = 'Rob Agerbeek').find_parent('ul').find_all(title = ['Rob Agerbeek', 'Albert Ammons', 'Andrews Sisters', 'Winifred Atwell'])
    A_links = [i.get('href') for i in A_list]
    link_list += A_links
    B_list = soup.find(string = 'Bob Baldori').find_parent('ul').find_all(title = ['Bob Baldori', 'Marcia Ball', 'Deanna Bogart', 'James Booker', 'Eden Brent', 'Dave Brubeck'])
    B_links = [i.get('href') for i in B_list]
    link_list += B_links
    C_list = soup.find(string = 'Francis Craig').find_parent('ul').find_all(title = ['Commander Cody and His Lost Planet Airmen', 'Francis Craig', 'James Crutchfield', 'Ray Charles'])
    C_links = [i.get('href') for i in C_list]
    link_list += C_links
    D_list = soup.find(string = 'Francis Craig').find_parent('ul').find_all(title = ['Caroline Dahl', 'Cow Cow Davenport', 'Blind John Davis', 'Neville Dickie', 'Fats Domino', 'Floyd Domino', 'Dorothy Donegan', 'Georgia Tom Dorsey', 'Dr. John', 'Champion Jack Dupree', 'Big Joe Duskin'])
    D_links = [i.get('href') for i in D_list]
    link_list +=D_links
    E_list = soup.find(string = 'William Ezell').find_parent('ul').find_all(title = ['William Ezell'])
    E_links = [i.get('href') for i in E_list]
    link_list += E_links
    F_list = soup.find(string = 'Wayne Federman').find_parent('ul').find_all(title = ['Wayne Federman', 'Ella Fitzgerald', 'Frankie Ford', 'Ernie Freeman', 'Keith Emerson'])
    F_links = [i.get('href') for i in F_list]
    link_list +=F_links
    G_list = soup.find(string = 'Blind Leroy Garnett').find_parent('ul').find_all(title = ['Blind Leroy Garnett', 'Harry Gibson', 'Harry Gibson'])
    G_links = [i.get('href') for i in G_list]
    link_list +=G_links
    H_list = soup.find(string = 'Willie Hall').find_parent('ul').find_all(title = ['Willie Hall', 'Jools Holland', 'Camille Howard', 'Bob Hall', 'Henri Herbert', 'John Lee Hooker', 'Nicky Hopkins'])
    H_links = [i.get('href') for i in H_list]
    link_list += H_links
    J_list = soup.find(string = 'Elton John').find_parent('ul').find_all(title = ['Elton John', 'Pete Johnson', 'Louis Jordan'])
    J_links = [i.get('href') for i in J_list]
    link_list += J_links
    K_list = soup.find(string = 'Michael Kaeshammer').find_parent('ul').find_all(title = ['Michael Kaeshammer', 'Shizuko Kasagi', 'Joe Krown'])
    K_links = [i.get('href') for i in K_list]
    link_list += K_links
    L_list = soup.find(string = 'Booker T. Laury').find_parent('ul').find_all(title = ['Booker T. Laury', 'Jerry Lee Lewis', 'Meade Lux Lewis', 'Liberace', 'Little Richard', 'Little Willie Littlefield', 'Cripple Clarence Lofton', 'Professor Longhair'])
    L_links = [i.get('href') for i in L_list]
    link_list += L_links
    M_list = soup.find(string = 'Memphis Slim').find_parent('ul').find_all(title = ['Memphis Slim', 'Big Maceo Merriweather', 'Moon Mullican'])
    M_links = [i.get('href') for i in M_list]
    link_list += M_links
    N_list = soup.find(string = 'Romeo Nelson').find_parent('ul').find_all(title = ['Romeo Nelson', 'Charlie Norman'])
    N_links = [i.get('href') for i in N_list]
    link_list += N_links
    P_list = soup.find(string = 'Bill Payne').find_parent('ul').find_all(title = ['Bill Payne', 'Oscar Peterson', 'Piano Red', 'Pinetop Perkins', 'Ross Petot', 'Sammy Price', 'Professor Longhair'])
    P_links = [i.get('href') for i in P_list]
    link_list += P_links
    R_list = soup.find(string = 'Boogie Woogie Red').find_parent('ul').find_all(title = ['Boogie Woogie Red', 'Maurice Rocco', 'Walter Roland', 'Leon Russell'])
    R_links = [i.get('href') for i in R_list]
    link_list += R_links
    S_list = soup.find(string = 'Ulf Sandström').find_parent('ul').find_all(title = ['Ulf Sandström', 'Bob Seeley', 'Luca Sestak', 'Omar Shariff', 'Robert Shaw', 'Freddie Slack', 'Huey "Piano" Smith', 'Clarence "Pine Top" Smith', 'Charlie Spand', 'Otis Spann', 'Speckled Red', 'Roosevelt Sykes'])
    S_links = [i.get('href') for i in S_list]
    link_list += S_links
    T_list = soup.find(string = 'Gene Taylor').find_parent('ul').find_all(title = ['Gene Taylor', 'Montana Taylor', 'George W. Thomas', 'Hersal Thomas', 'Allen Toussaint', 'T. Rex', 'Stephanie Trick', 'Big Joe Turner', 'Ike Turner'])
    T_links = [i.get('href') for i in T_list]
    link_list += T_links
    W_list = soup.find(string = 'Rick Wakeman').find_parent('ul').find_all(title = ['Rick Wakeman', 'Tuts Washington', 'Kenny "Blues Boss" Wayne', 'Vince Weber', 'Robert Wells', 'Clarence Williams', 'Jabo Williams', 'Mitch Woods'])
    W_links = [i.get('href') for i in W_list]
    link_list += W_links
    Y_list = soup.find(string = 'Jimmy Yancey').find_parent('ul').find_all(title = ['Jimmy Yancey'])
    Y_links = [i.get('href') for i in Y_list]
    link_list += Y_links
    Z_list = soup.find(string = 'Silvan Zingg').find_parent('ul').find_all(title = ['Silvan Zingg', 'Axel Zwingenberger', 'ZZ Top'])
    Z_links = [i.get('href') for i in Z_list]
    link_list += Z_links

    return link_list
The main problem here is that each line works on its own: running the lines individually, this code gives me the output I want. But when I try to run it as a function it returns nothing at all. No errors, no output, just a blank result. I'd be grateful if anyone could shed some light on this for me.


Thanks, everyone.

Since your function does return the right value, the problem is probably in how you pass that value on to another function. One way is to assign it to another variable first; or just pass the call itself:

otherFunction(get_links(file)) 


Maybe that's not it, but return values and local variables gave me a hard time when I was learning, so I thought it was worth mentioning.
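One quick way to check (just a sketch, reusing the file path variable from the question) is to capture the result and print it explicitly. A bare get_links(file) call on its own line in a script shows nothing, even though the list does come back:

links = get_links(file)          # the returned list has to be captured...
print(len(links), links[:5])     # ...and printed (or otherwise used) before anything shows up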

All functions return at least None or raise an error. This one returns link_list, so... tell us how you know it doesn't. Trim the example down to just after B_links is added, so we don't have to wade through the extra lines, and show us how you call it, so we can see where the problem is. Could you show us a sample of webpage.html, so we can see the format and perhaps help write more efficient code? This is just a guess; the OP hasn't provided enough information to say.
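For reference, the variable-first version mentioned in the answer would look like this: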
link = get_links(file)   # assign the returned list to its own name
otherFunction(link)      # then pass that name on
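As for the comment asking for a sample of webpage.html to help write more efficient code: without seeing the page this is only a guess at its structure, but assuming each letter's names sit in their own <ul> the way the question's lookups imply, the repeated blocks could be collapsed into one loop over a mapping. The SECTIONS dict below is a hypothetical, abbreviated stand-in for the full name lists from the question:

import codecs
from bs4 import BeautifulSoup as bs

# Hypothetical, abbreviated mapping: the string used to locate each letter's <ul>,
# mapped to the titles collected from that list (the question's full name lists go here).
SECTIONS = {
    'Rob Agerbeek': ['Rob Agerbeek', 'Albert Ammons', 'Andrews Sisters', 'Winifred Atwell'],
    'Bob Baldori': ['Bob Baldori', 'Marcia Ball', 'Deanna Bogart', 'James Booker',
                    'Eden Brent', 'Dave Brubeck'],
    # ... and so on for the remaining letters
}

def get_links(path):
    with codecs.open(path, 'r', 'utf-8') as f:
        soup = bs(f.read(), 'lxml')
    link_list = []
    for marker, titles in SECTIONS.items():
        section = soup.find(string=marker).find_parent('ul')   # the <ul> holding this letter's names
        link_list += [a.get('href') for a in section.find_all(title=titles)]
    return link_list

print(get_links('C:/users/me/webpage.html'))

The behaviour is meant to be the same as the question's function; only the repetition is gone, and the final print makes the returned list visible.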