UnicodeEncodeError: 'charmap' codec can't encode character '\u20b9' in position 0 (python, selenium, selenium-webdriver)

I am scraping Flipkart data. The prices look like ₹40,999. How can I extract this data?

This code is where the issue is:

It produces the following error:

Does this help???
for post2 in driver.find_elements_by_css_selector("._1vC4OE._2rQ-NK"):
    offer_price += [ post2.text ]
    #print(offer_price)
    print(offer_price)
UnicodeEncodeError: 'charmap' codec can't encode character '\u20b9' in position 0: character maps to <undefined>
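For context: the traceback comes from `print()` trying to encode the rupee sign `\u20b9` with the Windows console's `charmap` (cp1252) codec, which has no mapping for it. A minimal sketch of two common workarounds, assuming the scraped text looks like `₹40,999` (the sample value from the question):

```python
# The rupee sign has no mapping in cp1252, so encoding it raises
# UnicodeEncodeError -- the same error print() hits on a cp1252 console.
price_text = "\u20b940,999"  # e.g. the value returned by post2.text

# Workaround 1: encode with a replacement character before printing,
# so unmappable characters become '?' instead of raising an error.
printable = price_text.encode("cp1252", errors="replace").decode("cp1252")
print(printable)  # ?40,999

# Workaround 2: strip the currency sign and thousands separator,
# which also leaves a clean integer for further processing.
price = int(price_text.replace("\u20b9", "").replace(",", ""))
print(price)  # 40999
```

On Python 3.7+, calling `sys.stdout.reconfigure(encoding="utf-8")` once at startup sidesteps the error entirely by printing UTF-8 regardless of the console codec.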
Full code:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.common.by import By
import time
import csv
chrome_path = r"C:\Users\Venkatesh\AppData\Local\Programs\Python\Python35\chromedriver.exe"
driver = webdriver.Chrome(chrome_path)
RegionIDArray = ["https://www.flipkart.com/mobiles/pr?otracker=categorytree&page=1&sid=tyy%2C4io", "https://www.flipkart.com/mobiles/pr?otracker=categorytree&page=2&sid=tyy%2C4io"]
mobile_link = []
mobile_name = []
offer_price = []
data_list=[]
delay = 10 # seconds
for reg in RegionIDArray:
    try:
        driver.get(reg)
        WebDriverWait(driver, delay).until(EC.presence_of_element_located((By.XPATH, "//*[@id='container']/div/div[2]/div[2]/div/div[2]/div/div[3]/div[1]/div/div[1]")))
        driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
        print("Page is ready")

        for post in driver.find_elements_by_class_name("_1UoZlX"):
            mobile_link += [ post.get_attribute("href") ]

        for post1 in driver.find_elements_by_class_name("_3wU53n"):
            mobile_name += [ post1.text ]

        for post2 in driver.find_elements_by_css_selector("._1vC4OE._2rQ-NK"):
            offer_price += [ post2.text ]
            #print(offer_price)
            print(offer_price)