
Linux: how to get the whole record if a match is found in a CSV file


I want to get the record from a CSV file if a match is found.

Sample data:

When I grep a keyword like A1, all the records containing A1 should be shown as output.

Output:

My command doesn't work:

Try this:

$ egrep "A1|NAME" demo.csv 
"ID","NAME","CITY"
"001","A1","B1"
"001","A1","B1"
"001","A1","B1"
"001","A1","B1"

For your question, check the answers at:

The link above might help.

Apart from that, here is what I used:

I fetched the rows by iterating over the Excel worksheet; pandas can work with CSV as well.

I used the following code:

import pandas as pd

# Read the spreadsheet; pd.read_csv works the same way for a CSV file
df1 = pd.read_excel('Path of xlsx')

# Keep only the columns of interest
df = pd.DataFrame(df1, columns=['From', 'To'])

for i in range(len(df)):
    from_postal_code = df.loc[i, "From"]
    to_postal_code = df.loc[i, "To"]
    print(from_postal_code)
    print(to_postal_code)
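If pandas is the tool of choice, the match itself can be done directly on the CSV without an explicit loop. A minimal sketch, assuming the demo.csv with ID, NAME and CITY columns shown in the other answers:

import pandas as pd

# Read the CSV as strings and keep only the rows whose NAME column is "A1"
df = pd.read_csv('demo.csv', dtype=str)
matches = df[df['NAME'] == 'A1']
print(matches)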
Hope this helps.

Use csvgrep: where possible, it is always a good idea to use a tool that understands the structured format rather than a generic text tool:

$ csvgrep -c 2 -m A1 demo.csv
ID,NAME,CITY
001,A1,B1
001,A1,B1
001,A1,B1
001,A1,B1
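csvgrep is part of the csvkit package (typically installed with pip install csvkit). Here -c 2 restricts the search to the second column (NAME) and -m A1 matches rows whose NAME value contains A1, so an "A1" appearing only in ID or CITY would not be reported.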
$ egrep "A1|NAME" demo.csv 
"ID","NAME","CITY"
"001","A1","B1"
"001","A1","B1"
"001","A1","B1"
"001","A1","B1"
df1 = pd.read_excel('Path of xlsx') #   

df = pd.DataFrame(df1, columns= [<Name of Columns>])

pd.DataFrame(df1,columns['From','To'])

for i in range(len(df)): 
from_postal_code = df.loc[i, "From"]
to_postal_code = df.loc[i, "To"]
print(from_postal_code)
print(to_postal_code)
$ csvgrep -c 2 -m A1 demo.csv
ID,NAME,CITY
001,A1,B1
001,A1,B1
001,A1,B1
001,A1,B1