Python: How can I get the latest content from Chrome WebDriver after AJAX-based data has loaded?


I'm facing a problem with Chrome WebDriver: Selenium is not giving me the updated content, it keeps showing me the previous content. After clicking the next-page link the new data is actually appended in the browser, but when I read the page through the driver it still gives me the previous content.

The website link is:
www.abc.com

My goal is to extract all the job links, but I can't get it to work. Please help me with this.

import math
import time

from selenium import webdriver

# driver is assumed to be an already started Chrome WebDriver on the job-search page
job_links = []
per_page = 9

# Total job count is displayed as "... (N)" next to the search results
total_jobs = int(driver.find_element_by_css_selector(".search-results-count.total-jobs").text.split("(")[1].split(")")[0])
total_pages = math.ceil(total_jobs / per_page)

for x in range(1, total_pages):
    print("Page number: ", x)
    time.sleep(5)
    jobs_on_page = driver.find_elements_by_xpath("//div[@class='module job-card-wrapper col-md-4 col-xs-12 col-sm-6 corporate-regular background-white']")
    for job in jobs_on_page:
        print("job is:", job)
        job_link = job.find_element_by_xpath("./a").get_attribute('href').split("%")[0]
        job_links.append(job_link)
    # if x != (total_pages - 1):
    print("Hello Page: ", x)
    element = driver.find_element_by_xpath(
        "//div[@class='reinvent-pagination-next']//span[@class='arrow cta-arrow']")
    webdriver.ActionChains(driver).move_to_element(element).click(element).perform()
    # self.wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='reinvent-pagination-next']//span[@class='arrow cta-arrow']"))).click()
    time.sleep(10)
It keeps giving me the job links from the first page over and over, even though the page visibly changes inside the webdriver.
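One likely cause is that the job cards are re-read before the AJAX refresh has actually swapped in the new page, so the driver still returns the previous elements. A minimal sketch of one way to guard against this, assuming the site replaces the cards on pagination (the selectors are illustrative, and driver is the Chrome WebDriver from the code above):

from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By

# Keep a reference to one card from the page we are about to leave
old_card = driver.find_element(By.CSS_SELECTOR, "div.module.job-card-wrapper")

# Click the next-page control (same XPath as in the question)
driver.find_element(By.XPATH, "//div[@class='reinvent-pagination-next']//span[@class='arrow cta-arrow']").click()

# Wait until that old card is detached from the DOM, i.e. the AJAX
# refresh has actually replaced the page content
WebDriverWait(driver, 20).until(EC.staleness_of(old_card))

# Only now re-query the freshly rendered job cards
fresh_cards = driver.find_elements(By.CSS_SELECTOR, "div.module.job-card-wrapper")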

Induce WebDriverWait() with visibility_of_all_elements_located() and the following CSS selector to get all the links.

Use an infinite while loop and check with try..except whether the next-page button is still available.

Code

import time

from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("start-maximized")
options.add_experimental_option("excludeSwitches", ["enable-automation"])
options.add_experimental_option('useAutomationExtension', False)
driver = webdriver.Chrome(options=options)
driver.get("https://www.boom.com")

Alllinks = []
while True:
    # Wait until all job links on the current page are visible, then collect their hrefs
    elements = WebDriverWait(driver, 20).until(EC.visibility_of_all_elements_located(
        (By.CSS_SELECTOR, "div.module > a[data-linkcomponentname='jobsearchblock']")))
    for link in elements:
        Alllinks.append(link.get_attribute('href'))
    try:
        # Click the next-page button via JavaScript; if it is no longer clickable, we are on the last page
        next_btn = WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.XPATH, '//a[@class="next-page-btn"]')))
        driver.execute_script("arguments[0].click();", next_btn)
    except:
        break
    time.sleep(1)

print('Total links :' + str(len(Alllinks)))
print(Alllinks)
Output

Total links :90
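As a small follow-up not in the original answer: if a pagination click ever fails silently and the same page is read twice, duplicate hrefs can slip into Alllinks. A quick, order-preserving way to de-duplicate afterwards:

# Drop duplicate hrefs while keeping their original order
unique_links = list(dict.fromkeys(Alllinks))
print('Unique links :' + str(len(unique_links)))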