Selenium Load Time Errors - Looking For Possible Workaround
I am trying to scrape data from a certain website. I am using Selenium so that I can log myself in and then start parsing through the data. I have 3 main errors: last page # not loa…
Solution 1:
You should use explicit waits in your code to handle the dynamic loading of the pages. Sorting the page by "Newest First" causes it to refresh the results and introduces a spinner to indicate the sorting.
<i class="fa fa-spinner fa-spin" aria-hidden="true" style="font-size: 48px;"></i>
Waiting for the spinner to disappear should give you the correct page count. Something along the following lines:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
.....
# your login code
.....
# Trigger the re-sort, then wait for the loading spinner to become invisible
driver.find_element_by_link_text("Newest First").click()
element = WebDriverWait(driver, 10).until(
    EC.invisibility_of_element_located((By.XPATH, "//i[@class='fa fa-spinner fa-spin']"))
)
# Only read the page count once the spinner is gone
last_page = driver.find_element_by_class_name("right-center").text
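If you are on Selenium 4, the find_element_by_* helpers have been removed, so the same wait has to be written with By-based locators. A minimal sketch, assuming the same spinner markup and "right-center" class, with Chrome as the driver:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
# ... your login code ...

# Trigger the re-sort, then wait for the loading spinner to become invisible
driver.find_element(By.LINK_TEXT, "Newest First").click()
WebDriverWait(driver, 10).until(
    EC.invisibility_of_element_located((By.CSS_SELECTOR, "i.fa.fa-spinner.fa-spin"))
)

# Only read the page count once the spinner is gone
last_page = driver.find_element(By.CLASS_NAME, "right-center").text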
To find all the brand names listed on the page, you need to find all the span tags with class='brand-name' by calling find_elements_by_xpath (note the plural, elements):
brand_names_list = driver.find_elements_by_xpath("//span[@class='brand-name']")
for brand_name in brand_names_list:
    print(brand_name.text)
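On Selenium 4 the same collection step looks like the sketch below; the span with class 'brand-name' comes from the answer above, while the explicit wait for the elements to be present is an added assumption to make sure the results have rendered before they are read:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait until at least one brand-name span is present in the DOM
WebDriverWait(driver, 10).until(
    EC.presence_of_all_elements_located((By.CSS_SELECTOR, "span.brand-name"))
)
# Collect the text of every brand-name span on the page
brand_names = [el.text for el in driver.find_elements(By.CSS_SELECTOR, "span.brand-name")]
print(brand_names)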