Python: how to save a CSV file in rows instead of columns

Here is my code:

import requests
from bs4 import BeautifulSoup

res = requests.get("https://news.google.com/news/?ned=us&hl=en")
soup = BeautifulSoup(res.text,"html.parser")

for item in soup.select(".nuEeue"):
    news_title = (item.text)
    news_title = [news_title]
    print (news_title)
    with open('news.csv', 'a', newline='',encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(news_title)
            f.close()
When I open the CSV, the data appears in a single column.

I want it in a single row instead. I tried adding an end= argument to print(news_title), but that didn't work. What should I do to achieve this?

Example:
Before:

a
b
c
d
E

After:


abcde
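
The underlying detail: csv.writer.writerow writes exactly one row per call, so calling it once per title produces one title per row (a single column), while calling it once on a list of titles produces a single row. A minimal sketch of the difference, using hypothetical stand-in data:

import csv

items = ['a', 'b', 'c', 'd', 'e']  # stand-in for the scraped titles

with open('news.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    # writer.writerow([item]) inside a loop -> five rows, one column
    writer.writerow(items)  # one call with the whole list -> one row: a,b,c,d,e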

You have to import csv. I'm not sure how you got it to write anything out without that.

Also, the two lines where you assign news_title are a bit confusing given what you're trying to achieve. Do you want the title and the text? Maybe all the titles?

import requests
import csv
from bs4 import BeautifulSoup

res = requests.get("https://news.google.com/news/?ned=us&hl=en")
soup = BeautifulSoup(res.text, "html.parser")

# Collect all titles into one list instead of writing a row per title
news_titles = []
for item in soup.select(".nuEeue"):
    news_titles.append(item.text)

print(news_titles)

# A single writerow call writes the whole list as one row
with open('news.csv', 'a', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(news_titles)
How about this?

import requests
from bs4 import BeautifulSoup

res = requests.get("https://news.google.com/news/?ned=us&hl=en")
soup = BeautifulSoup(res.text,"html.parser")

titles = [item.text for item in soup.select(".nuEeue")] #list comprehension

with open('news.csv', 'a', encoding="utf-8") as f:
    for item in titles:
        f.write(item)
        f.write(",")

However, I'd suggest storing the data in something else, maybe JSON or a database. Here is a JSON alternative:

import datetime
import os
import requests
import json
from bs4 import BeautifulSoup

res = requests.get("https://news.google.com/news/?ned=us&hl=en")
soup = BeautifulSoup(res.text,"html.parser")

titles = [item.text for item in soup.select(".nuEeue") if item.text != ""] # removes blanks too
now = datetime.datetime.now().isoformat()

data = {now:titles} #creates a dictionary with key=time,value=list with titles

# Update data with old inputs if titles.json exist
if os.path.exists('titles.json'):
    with open('titles.json') as f:
        data.update(json.load(f))

# Write to titles.json
with open('titles.json',"w") as f:
    json.dump(data,f)
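
To get the stored titles back later, the reverse is just a json.load (a minimal sketch, assuming titles.json was produced by the code above):

import json

with open('titles.json', encoding='utf-8') as f:
    data = json.load(f)

# keys are ISO timestamps, values are lists of titles
for timestamp, titles in sorted(data.items()):
    print(timestamp, '-', len(titles), 'titles')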
After running it a few times, titles.json looks something like this (but with more data):

{
  "2017-09-28T04:06:55.411876": [
    "GOP proposes deep tax cuts, provides few details on how to pay for them",
    "Fact-checking Trump's claims from his speech on taxes",
    "College Student Says Car Engine Had No Oil, Hours After Getting An Oil Change"
  ],
  "2017-09-28T04:03:34.077658": [
    "GOP proposes deep tax cuts, provides few details on how to pay for them",
    "Fact-checking Trump's claims from his speech on taxes",
    "College Student Says Car Engine Had No Oil, Hours After Getting An Oil Change",
    "Benny Hinn Is My Uncle, but Prosperity Preaching Isn't for Me"
  ],
  "2017-09-28T04:01:59.304124": [
    "GOP proposes deep tax cuts, provides few details on how to pay for them",
    "Fact-checking Trump's claims from his speech on taxes",
    "Review: Apple Watch Series 3 with cellular further establishes an emerging computing platform"
  ]
}

writerow takes an iterable of the row as its argument, so one row should look like [col1, col2, col3, ...]. Instead of writing a row inside the for loop, append to a list inside the loop and then write that list with a single writerow call.

Are you sure you want to output this as CSV? You certainly can't use a comma as the field separator, since commas are embedded in the text, and good luck finding a separator that is guaranteed never to appear in the text. CSV works best for data whose structure is well known, because then the separator can be chosen intelligently; that isn't the case for this data. Use ... to transpose the rows. Welcome to SO; please take the time to read the links it contains.

I'm a coding beginner. I don't even know how to do that. I hope someday I can become a master like you.
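
On the transposing suggestion in the comments: if news.csv has already been written with one title per row, one way to turn that column into a single row is zip, sketched below (news_row.csv is just a hypothetical output name):

import csv

with open('news.csv', newline='', encoding='utf-8') as f:
    rows = list(csv.reader(f))   # e.g. [['a'], ['b'], ['c']]

transposed = list(zip(*rows))    # zip(*rows) swaps rows and columns -> [('a', 'b', 'c')]

with open('news_row.csv', 'w', newline='', encoding='utf-8') as f:
    csv.writer(f).writerows(transposed)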