
Loading More Content In A Webpage And Issues Writing To A File

I am working on a web scraping project which involves scraping URLs from a website based on a search term, storing them in a CSV file (under a single column), and finally scraping the text content from each of those URLs.

Solution 1:

Regarding your second problem, the file mode is wrong: you'll need to change w+ to a+. Your indentation is also off.

import csv
import urllib2
from bs4 import BeautifulSoup

with open('ctp_output.csv', 'rb') as f1:
    reader = csv.reader(f1)

    for line in reader:
        url = line[0]
        soup = BeautifulSoup(urllib2.urlopen(url))

        with open('ctp_output.txt', 'a+') as f2:
            for tag in soup.find_all('p'):
                f2.write(tag.text.encode('utf-8') + '\n')

The + suffix creates the file if it doesn't exist. However, w+ truncates the file each time it is opened, erasing all previous contents, while a+ appends to the file if it exists and creates it if it does not.
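The difference matters here because the file is reopened on every iteration of the loop. A minimal demonstration (the file path is a temporary one created just for this sketch):

```python
import os
import tempfile

# A throwaway path standing in for ctp_output.txt.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")

# 'w+' truncates on every open: only the last write survives.
for word in ("first", "second"):
    with open(path, "w+") as f:
        f.write(word + "\n")
with open(path) as f:
    print(repr(f.read()))  # 'second\n'

# 'a+' appends on every open: both writes survive.
os.remove(path)
for word in ("first", "second"):
    with open(path, "a+") as f:
        f.write(word + "\n")
with open(path) as f:
    print(repr(f.read()))  # 'first\nsecond\n'
```

With w+ inside the loop, each URL's paragraphs would overwrite the previous URL's; a+ accumulates them all.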

For your first problem, there's no way around using something that can automate clicking browser buttons, so look at Selenium. The alternative is to manually locate that button in the HTML, extract the URL from its href or text, and then make a second request yourself. I leave that to you.
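For the second route, the href can be pulled out with the standard library's html.parser alone. This is only a sketch: the markup snippet and the "next" class name are made up for illustration, so you'd adapt the matching rule to whatever the real button looks like.

```python
from html.parser import HTMLParser

class NextLinkFinder(HTMLParser):
    """Collect href values of <a> tags whose class contains 'next'."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "next" in attrs.get("class", ""):
            self.hrefs.append(attrs.get("href"))

# Hypothetical markup standing in for the real results page.
html = '<div><a class="btn next" href="/search?page=2">Next</a></div>'

finder = NextLinkFinder()
finder.feed(html)
print(finder.hrefs[0])  # /search?page=2
```

Once you have that relative URL, join it against the site's base URL (e.g. with urllib's urljoin) and request it the same way as the first page.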

Solution 2:

If there are more pages of results, observe what changes in the URL when you manually click through to the next page. Almost always, a small piece of the URL encodes either a subpage number or some other variable that relates strictly to the subpage. Once you have figured out the pattern, put it into a for loop, build each URL with .format(), and navigate through all the subpages of the results that way.
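The loop itself is just string formatting; the base URL and page count below are hypothetical placeholders for whatever the real site uses:

```python
# Template with a {page} placeholder; the base URL is made up for illustration.
base = "https://example.com/search?q=web+scraping&page={page}"

urls = []
for page in range(1, 6):  # suppose there are 5 result pages
    urls.append(base.format(page=page))

print(urls[0])   # first subpage URL
print(urls[-1])  # last subpage URL
```

Each generated URL is then fetched and parsed exactly like the first page of results.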

As for finding the last subpage number: inspect the HTML of the site you are scraping, find the element responsible for the pagination, and extract its value. Look for something like "class": "Page" or equivalent in their code; it may contain the number you need for your for loop.
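As a rough sketch of that extraction, assuming the pagination really does use a "Page" class (the markup below is invented, and a regex is only reasonable here because the snippet is this rigid; for messier HTML use a proper parser):

```python
import re

# Hypothetical pagination markup copied from the results page.
html = '''
<ul class="pagination">
  <li class="Page"><a href="?page=1">1</a></li>
  <li class="Page"><a href="?page=2">2</a></li>
  <li class="Page"><a href="?page=7">7</a></li>
</ul>
'''

# Pull every page number that appears inside a "Page" list item.
pages = [int(n) for n in re.findall(r'class="Page"><a href="\?page=(\d+)"', html)]
last_page = max(pages)
print(last_page)  # 7
```

last_page then becomes the upper bound of the for loop over subpages.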

Unfortunately there is no magic "navigate through subresults" option, but this gets pretty close.

Good luck.
