How to Download Images from URLs Using Python


Quick answer: Use Python’s Requests library to download images by making a GET request and saving the binary content to a file. For multiple images, parse HTML with BeautifulSoup, extract image URLs, and download them in a loop.


Downloading a Single Image with Requests

Requests is the most popular and beginner-friendly library for HTTP requests.

python

import requests

# Image URL
url = 'https://books.toscrape.com/media/cache/2c/da/2cdad67c44b002e7ead0cc35693c0e8b.jpg'

# Download the image
response = requests.get(url)

# Save to file (binary mode)
with open('image.jpg', 'wb') as file:
    file.write(response.content)

Get filename from URL:

python

def extract_filename(url):
    return url.split("/")[-1]

with open(extract_filename(url), 'wb') as file:
    file.write(response.content)

Error Handling

Always handle potential errors:

python

import requests
from requests.exceptions import HTTPError, Timeout

url = 'https://example.com/image.jpg'

try:
    response = requests.get(url, timeout=10)
    response.raise_for_status()  # Raise exception for 4xx/5xx status codes
    
    with open('image.jpg', 'wb') as file:
        file.write(response.content)
        
except HTTPError as e:
    print(f"HTTP error: {e}")
except Timeout as e:
    print(f"Request timed out: {e}")
except IOError as e:
    print(f"File error: {e}")
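Beyond catching exceptions, transient failures (timeouts, 5xx responses) can also be retried automatically. One way to do this, sketched below, is to mount an `HTTPAdapter` configured with urllib3's `Retry` on a `requests.Session`; the specific status codes and backoff values here are illustrative, not prescriptive:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry up to 3 times on common transient status codes,
# waiting a little longer between each attempt (backoff_factor)
retry = Retry(
    total=3,
    backoff_factor=0.5,
    status_forcelist=[429, 500, 502, 503, 504],
)

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retry))
session.mount("http://", HTTPAdapter(max_retries=retry))

# Use session.get(...) exactly as you would requests.get(...)
```

Requests made through this session transparently retry on the listed status codes, so a single slow or flaky response doesn't abort a long download run.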

Using Proxies with Requests

Avoid IP bans when downloading many images:

python

import requests

# Proxy configuration (from your provider)
proxies = {
    'http': 'http://username:password@proxy-host:port',
    'https': 'http://username:password@proxy-host:port'
}

url = 'https://books.toscrape.com/media/cache/2c/da/2cdad67c44b002e7ead0cc35693c0e8b.jpg'

response = requests.get(url, proxies=proxies)

with open('image.jpg', 'wb') as file:
    file.write(response.content)

Downloading with Urllib3

Urllib3 offers more control but requires more code:

python

import urllib3

# Basic download
url = 'https://books.toscrape.com/media/cache/2c/da/2cdad67c44b002e7ead0cc35693c0e8b.jpg'
response = urllib3.request('GET', url)  # top-level request() requires urllib3 v2.0+

def extract_filename(url):
    return url.split("/")[-1]

with open(extract_filename(url), 'wb') as file:
    file.write(response.data)

With proxies (authenticated):

python

import urllib3

# Setup proxy authentication
headers = urllib3.make_headers(proxy_basic_auth='username:password')
http = urllib3.ProxyManager('http://proxy-host:port', proxy_headers=headers)

url = 'https://books.toscrape.com/media/cache/2c/da/2cdad67c44b002e7ead0cc35693c0e8b.jpg'
response = http.request('GET', url)

with open(extract_filename(url), 'wb') as file:
    file.write(response.data)

Downloading with Wget

Simplest option for quick downloads:

python

import wget

url = 'https://books.toscrape.com/media/cache/2c/da/2cdad67c44b002e7ead0cc35693c0e8b.jpg'
wget.download(url)  # Saves with filename from URL

Note: The wget Python library doesn’t support proxies or HTML parsing. For complex tasks, use Requests.


Downloading Multiple Images from a Website

Combine Requests with BeautifulSoup to scrape and download all images:

python

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# Fetch the page
page_url = "https://books.toscrape.com/"
response = requests.get(page_url)
soup = BeautifulSoup(response.text, "html.parser")

# Find all image tags
img_tags = soup.find_all("img")

# Download each image
for img in img_tags:
    img_url = img.get("src")
    if not img_url:
        continue  # skip <img> tags without a src attribute
    full_url = urljoin(page_url, img_url)  # Handle relative URLs
    
    # Download image
    img_response = requests.get(full_url)
    
    # Save with filename from URL
    filename = full_url.split("/")[-1]
    with open(filename, "wb") as file:
        file.write(img_response.content)
    
    print(f"Downloaded: {filename}")

Library Comparison

Feature         | Requests        | Urllib3          | Wget
Ease of use     | Very easy       | Moderate         | Extremely easy
Proxy support   | Built-in        | Advanced options | None (Python lib)
Error handling  | Excellent       | Good             | Basic
Multiple images | Easy with loops | Moderate         | Needs other libs
Performance     | Good            | High             | Basic
Best for        | Most projects   | Advanced users   | Quick one-offs

Complete Example: Scraping All Images with Proxies

python

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin
from requests.exceptions import HTTPError, Timeout

# Proxy setup
proxies = {
    'http': 'http://username:password@proxy-host:port',
    'https': 'http://username:password@proxy-host:port'
}

import os

def download_image(img_url, folder="images"):
    """Download a single image with error handling"""
    try:
        response = requests.get(img_url, proxies=proxies, timeout=10)
        response.raise_for_status()
        
        os.makedirs(folder, exist_ok=True)  # create the folder if it doesn't exist
        filename = img_url.split("/")[-1]
        with open(os.path.join(folder, filename), "wb") as file:
            file.write(response.content)
        return True
    except (HTTPError, Timeout, IOError) as e:
        print(f"Failed to download {img_url}: {e}")
        return False

# Main scraping
page_url = "https://books.toscrape.com/"
response = requests.get(page_url, proxies=proxies)
soup = BeautifulSoup(response.text, "html.parser")

# Download all images
for img in soup.find_all("img"):
    src = img.get("src")
    if not src:
        continue  # skip <img> tags without a src attribute
    img_url = urljoin(page_url, src)
    download_image(img_url)

Key Takeaways

Task                              | Recommended Library
Single image download             | Requests or Wget
Multiple images with HTML parsing | Requests + BeautifulSoup
High-performance scraping         | Urllib3
Quick scripts                     | Wget
Avoiding blocks                   | Requests + rotating proxies
Production systems                | Requests with error handling

Pro tip: Always add delays between requests and use proxies when downloading many images to avoid being blocked.
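One way to add those delays is a small randomized pause between requests, so downloads don't hit the server at a fixed rhythm. The sketch below is illustrative: `polite_delay` and `download_all` are names chosen for this example, not part of any library, and the base/jitter values are just reasonable starting points:

```python
import random
import time

import requests

def polite_delay(base=1.0, jitter=2.0):
    """Return a randomized wait (base to base + jitter seconds)."""
    return base + random.uniform(0, jitter)

def download_all(urls, proxies=None):
    """Download each URL in turn, pausing politely between requests."""
    for url in urls:
        response = requests.get(url, proxies=proxies, timeout=10)
        response.raise_for_status()
        
        filename = url.split("/")[-1]
        with open(filename, "wb") as file:
            file.write(response.content)
        
        time.sleep(polite_delay())  # pause before the next request
```

Randomizing the wait (rather than sleeping a fixed interval) makes the traffic pattern look less like an automated script, which helps when combined with rotating proxies.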
