
Download Files From A Website With Python

I have about 300 small files that I need to download from a website. All of them are in one directory; the files have different sizes and different extensions. I don't want …

Solution 1:

This is all detailed here. I would favor Requests, as it's generally excellent, but urllib2 is in the standard library, so it doesn't require installing a new package.
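The pattern the answer has in mind with Requests looks something like the sketch below (assuming Requests is installed; the URL and filenames are placeholders, not from the original question):

```python
import requests

def fetch(url, dest, timeout=30):
    # Download one file with Requests; raise_for_status() fails
    # loudly on HTTP errors instead of silently saving an error page.
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    with open(dest, "wb") as file_out:
        file_out.write(response.content)

# Example call (hypothetical URL):
# fetch("https://example.com/files/report.pdf", "report.pdf")
```

For ~300 files you would call `fetch` in a loop over the filenames; `response.content` holds the whole body in memory, which is fine for small files.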


Solution 2:

If you're on Python 3.3, you're looking for urllib.request:

import urllib.request

url = "https://www.google.com/images/srpr/logo4w.png"
with urllib.request.urlopen(url) as response:
    with open("my_image.png", "wb") as file_out:
        file_out.write(response.read())

You should now have a file called "my_image.png" in your working directory.
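Since the question is about ~300 files in one directory, the same standard-library approach extends naturally to a loop. A minimal sketch, assuming you already have the list of filenames (the base URL and names below are placeholders):

```python
import shutil
import urllib.request

def download(url, dest):
    # Stream one remote file to a local path without loading it
    # all into memory; copyfileobj copies in chunks.
    with urllib.request.urlopen(url) as response, open(dest, "wb") as file_out:
        shutil.copyfileobj(response, file_out)

def download_all(base_url, filenames):
    # Fetch every file from the same directory on the server.
    for name in filenames:
        download(base_url + name, name)

# Example call (hypothetical directory URL and file list):
# download_all("https://example.com/files/", ["report.pdf", "data.csv", "logo.png"])
```

Streaming with `shutil.copyfileobj` also works unchanged for the occasional larger file, since nothing is buffered whole in memory.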

