How To Get Python To Successfully Download Large Images From The Internet
So I've been using urllib.request.urlretrieve(URL, FILENAME) to download images off the internet. It works great, but fails on some images. The ones it fails on seem to be the larger ones.
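For reference, a minimal sketch of the kind of call in question, with a size check against the Content-Length header to detect a truncated download; the URL and filename here are placeholders, not the asker's actual values:

import os
import urllib.request

# Placeholder URL and filename; substitute the real image URL.
url = 'http://example.com/big_image.jpg'
filename = 'big_image.jpg'

path, headers = urllib.request.urlretrieve(url, filename)
expected = headers.get('Content-Length')
if expected is not None and os.path.getsize(path) != int(expected):
    print('Incomplete download: got %d of %s bytes'
          % (os.path.getsize(path), expected))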
Solution 1:
In the past, I have used this code for copying from the internet. I have had no trouble with large files.
import urllib2

def download(url):
    file_name = raw_input("Name: ")
    u = urllib2.urlopen(url)
    f = open(file_name, 'wb')
    meta = u.info()
    file_size = int(meta.getheaders("Content-Length")[0])
    print "Downloading: %s Bytes: %s" % (file_name, file_size)

    file_size_dl = 0
    block_size = 8192
    while True:
        # read the response in 8 KB blocks; an empty read means EOF
        buffer = u.read(block_size)
        if not buffer:
            break
        file_size_dl += len(buffer)
        f.write(buffer)

    f.close()
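For anyone on Python 3, where urllib2 and raw_input no longer exist, a rough sketch of the same chunked-streaming idea using only the standard library (shutil.copyfileobj copies in fixed-size chunks rather than reading the whole image into memory):

import shutil
import urllib.request

def download(url, file_name):
    # Stream the response straight to disk in fixed-size chunks
    # rather than loading the whole image into memory.
    with urllib.request.urlopen(url) as response, open(file_name, 'wb') as f:
        shutil.copyfileobj(response, f)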
Solution 2:
Here's the sample code for Python 3 (tested on Windows 7):
import urllib.request

def download_very_big_image():
    url = 'http://i.imgur.com/DEKdmba.jpg'
    filename = 'C://big_image.jpg'
    conn = urllib.request.urlopen(url)
    output = open(filename, 'wb')  # binary flag needed for Windows
    output.write(conn.read())
    output.close()
For completeness' sake, here's the equivalent code in Python 2:
import urllib2

def download_very_big_image():
    url = 'http://i.imgur.com/DEKdmba.jpg'
    filename = 'C://big_image.jpg'
    conn = urllib2.urlopen(url)
    output = open(filename, 'wb')  # binary flag needed for Windows
    output.write(conn.read())
    output.close()
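A small variation worth noting: wrapping both handles in with-blocks guards against a half-written, never-closed file if the write raises partway through. A sketch of the Python 3 version rewritten this way:

import urllib.request

def download_very_big_image():
    url = 'http://i.imgur.com/DEKdmba.jpg'
    filename = 'C://big_image.jpg'
    # with-blocks close the connection and file even if the
    # write fails partway through
    with urllib.request.urlopen(url) as conn, open(filename, 'wb') as output:
        output.write(conn.read())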
Solution 3:
This should work: use the requests module:
import requests

img_url = 'http://i.imgur.com/DEKdmba.jpg'
img_name = img_url.split('/')[-1]
img_data = requests.get(img_url).content
with open(img_name, 'wb') as handler:
    handler.write(img_data)
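One caveat: .content reads the whole image into memory at once. For very large images, a streaming variant of the same requests approach avoids that, and raise_for_status() fails loudly instead of silently saving an HTML error page to disk:

import requests

img_url = 'http://i.imgur.com/DEKdmba.jpg'
img_name = img_url.split('/')[-1]

with requests.get(img_url, stream=True) as resp:
    resp.raise_for_status()  # fail loudly on HTTP errors
    with open(img_name, 'wb') as handler:
        # write the image in 8 KB chunks instead of one big read
        for chunk in resp.iter_content(chunk_size=8192):
            handler.write(chunk)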