Image Compression

Mrs ARCHANA R

ASSISTANT PROFESSOR, DEPARTMENT OF COMPUTER SCIENCE,

SRI ADI CHUNCHANAGIRI WOMEN'S COLLEGE, CUMBUM.


Why Image Compression Matters
Images account for an astonishing 80% of daily internet traffic, making their efficient handling paramount. An uncompressed high-definition
photograph can easily consume around 6 megabytes of data, a significant footprint in terms of storage and transmission. This highlights why
image compression is not just a technical detail, but a vital necessity for the digital world, influencing everything from storage costs to website
loading speeds and overall user experience.

Storage Efficiency
Image compression dramatically reduces file sizes, with typical JPEGs seeing reductions of 75-90%. This frees up valuable storage space on devices and servers.

Faster Transmission
Optimized images lead to web pages loading up to 50% faster, improving user experience and SEO rankings. This is crucial for today's fast-paced digital environment.

Bandwidth Savings
Reduced file sizes translate directly to lower bandwidth consumption, which is critical for mobile users, streaming services, and data-intensive applications, saving costs for both users and providers.

Scalability
Efficient compression enables the handling and distribution of vast amounts of visual data globally. Without it, the sheer volume of images would overwhelm current digital infrastructures.
Fundamental Concepts: Redundancy & Compression Types
Redundancy
Image compression primarily works by exploiting redundancy within
the data. This includes spatial redundancy (similar adjacent pixels),
spectral redundancy (correlation between color channels), and
temporal redundancy (similarities between frames in a video
sequence).

Error-Free (Lossless) Compression


This method allows for perfect reconstruction of the original image
data without any loss of information. It typically achieves compression
ratios of 2:1 to 5:1. It's essential for applications where data integrity is
paramount, such as medical imaging, legal documents, and long-
term archiving.

Lossy Compression
In contrast, lossy compression involves discarding perceptually
insignificant information. While it results in irreversible data loss, the
aim is for this loss to be imperceptible to the human eye. This method
achieves much higher compression ratios, often 10:1 to 100:1 or more,
making it the dominant choice for web photos, social media, and
everyday digital images.
Error-Free Compression I: Variable Length Coding
Concept
Variable Length Coding (VLC) assigns shorter binary codes to data values that appear more frequently and longer codes to those that appear less often. This minimizes the total number of bits needed to represent the data.

Huffman Coding
A widely adopted algorithm, Huffman Coding is an optimal prefix coding technique used as an entropy encoder. For instance, a common character like 'E' might be encoded as '01', while a rare one like 'Q' could be '1101'. This can improve compression by 20-30% over fixed-length coding.

Run-Length Encoding (RLE)
RLE is particularly effective for compressing sequences of identical data values, common in images with large uniform areas. For example, a sequence of five 'A's followed by two 'B's and one 'C' ("AAAAABBC") is compressed to "5A2B1C". This method is often used for fax images and older bitmap formats.
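Both schemes above can be sketched in a few lines of Python. This is an illustrative implementation, not the exact coder of any standard: rle_encode reproduces the "AAAAABBC" to "5A2B1C" example, and huffman_codes builds a prefix-code table with a binary heap.

```python
import heapq
from collections import Counter

def rle_encode(s):
    """Run-length encode a string: "AAAAABBC" -> "5A2B1C"."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:   # scan the current run
            j += 1
        out.append(f"{j - i}{s[i]}")         # emit count + symbol
        i = j
    return "".join(out)

def huffman_codes(text):
    """Build a Huffman code table; frequent symbols receive shorter codes."""
    # Heap entries carry an integer tiebreaker so tuples never compare trees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # merge the two rarest subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (left, right)))
        tie += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):          # internal node: recurse both ways
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                                # leaf: record the symbol's code
            codes[tree] = prefix or "0"
    walk(heap[0][2], "")
    return codes
```

With the frequencies in "EEEEEEEEAABQ", for example, 'E' receives a 1-bit code while the rare 'Q' receives 3 bits, matching the intuition that common symbols should be cheap.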
Error-Free Compression II: Bit Plane & Predictive Coding
Bit Plane Coding
This technique involves decomposing an image into multiple binary
bit planes. For an 8-bit grayscale image, there are eight bit planes,
each representing a specific bit position (from least significant to
most significant). Each plane, being binary, can then be compressed
independently using highly efficient binary compression methods.
This approach facilitates progressive transmission, where a rough
image appears quickly and refines as more planes arrive, and Region-
of-Interest (ROI) coding, where specific areas can be transmitted with
higher fidelity. It finds applications in specialized fields like medical
imaging, particularly with the DICOM format.
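The decomposition can be sketched with plain Python lists rather than any imaging library (bit_planes and from_planes are illustrative names, not a standard API). Zeroing the low-order planes mimics the coarse preview stage of progressive transmission.

```python
def bit_planes(pixels, depth=8):
    """Split 8-bit pixel values into `depth` binary planes (LSB first)."""
    return [[(p >> k) & 1 for p in pixels] for k in range(depth)]

def from_planes(planes):
    """Recombine planes into pixel values; plane k carries bit k."""
    n = len(planes[0])
    return [sum(plane[i] << k for k, plane in enumerate(planes))
            for i in range(n)]

# Progressive preview: drop the four least significant planes and a
# coarse (but recognizable) version of the data remains.
def coarse_preview(pixels):
    planes = bit_planes(pixels)
    zeros = [[0] * len(pixels)] * 4          # lower planes not yet received
    return from_planes(zeros + planes[4:])
```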

Lossless Predictive Coding

This method predicts the value of a pixel based on its neighboring pixels and then encodes only the difference (prediction error) between the
actual and predicted values. Since these errors are typically much smaller and have a narrower range than the original pixel values, they can be
compressed more efficiently. Differential Pulse Code Modulation (DPCM) is a common implementation, utilizing simple predictors like the
previous pixel's value or the average of adjacent pixels. This technique can improve compression efficiency by 10-20% compared to non-
predictive lossless methods by leveraging spatial correlations within the image.
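As a minimal sketch of the idea, the following uses the previous-pixel predictor over one scanline (illustrative code, not any standard's exact coder). The round trip reconstructs the row exactly, and the errors after the first sample stay small for smooth data.

```python
def dpcm_encode(pixels):
    """Encode each pixel as its difference from the previous pixel."""
    errors, prev = [], 0
    for p in pixels:
        errors.append(p - prev)   # prediction error: actual minus predicted
        prev = p
    return errors

def dpcm_decode(errors):
    """Undo the prediction: running sum of the errors restores the pixels."""
    pixels, prev = [], 0
    for e in errors:
        prev += e
        pixels.append(prev)
    return pixels
```

The error sequence for a smooth row such as [100, 102, 101, 105, 105] is [100, 2, -1, 4, 0]; after the first sample the values cluster near zero, which is exactly what an entropy coder compresses well.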
Lossy Compression: The Core - Quantization
Concept
The fundamental principle of lossy compression lies in strategically discarding
information that is perceptually insignificant to the human visual system. This
means removing data that the eye is unlikely to notice, allowing for much
greater compression ratios.

Quantization
Quantization is the primary mechanism for data reduction in lossy
compression. It involves mapping a continuous range of input values to a much
smaller set of discrete output values. For example, reducing 256 distinct shades
of blue to just 16 perceptually similar shades. This irreversible process is where
the "lossy" aspect originates.
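A uniform scalar quantizer makes this concrete. The sketch below maps 256 intensity levels onto 16 bins (function names are illustrative): the reconstruction error is bounded by half a bin width, and the within-bin detail is discarded for good.

```python
def quantize(value, levels=16, maximum=256):
    """Map a 0..255 intensity onto one of `levels` evenly spaced bins."""
    step = maximum // levels               # 16 intensities per bin
    return min(value // step, levels - 1)  # bin index, clamped at the top

def dequantize(index, levels=16, maximum=256):
    """Reconstruct the bin centre; the exact original value is lost."""
    step = maximum // levels
    return index * step + step // 2
```

With 16 bins the worst-case reconstruction error is 8 intensity levels, a difference that is often imperceptible in smooth image regions; the irreversibility of this mapping is precisely where the "lossy" in lossy compression comes from.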

Psycho-visual Redundancy
Quantization exploits the psycho-visual redundancy of the human visual
system. Our eyes are more sensitive to changes in brightness than in color, and
less sensitive to high-frequency details. Lossy compression algorithms
strategically remove these less perceptible details.

Trade-off
A critical trade-off exists between the compression ratio and potential visual
artifacts. Higher compression (more data discarded) leads to smaller file sizes
but can introduce noticeable visual degradations like blockiness, blurring, or
color banding. The goal is to find the optimal balance for the desired
application.
Lossy Predictive Coding
Concept
Lossy predictive coding combines the principles of prediction, similar
to those used in lossless predictive methods like DPCM, with the
crucial step of quantizing the prediction error. Instead of encoding
the raw pixel values, the system predicts what a pixel's value should
be based on its surrounding pixels.

Process
1. A pixel value is predicted from its already encoded neighbors.
2. The actual pixel value is compared to the predicted value, generating a prediction error.
3. This prediction error is then quantized, meaning some of its less significant information is discarded.
4. Finally, the quantized error is encoded and stored or transmitted.

This technique significantly improves compression efficiency over simple quantization by leveraging the strong spatial correlation present in
images. By encoding only the small, residual errors after prediction, more bits can be saved. Lossy predictive coding is widely applied in video
compression, such as inter-frame prediction in MPEG standards, where it drastically reduces the data needed to represent sequential frames by
predicting motion and only encoding the differences. It can also offer better quality at lower bitrates for certain image types compared to pure
transform coding methods, especially for content with strong spatial predictability.
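The four steps above can be sketched as a closed-loop DPCM coder (illustrative code, not any standard's exact scheme). The key detail is that the encoder predicts from the reconstructed previous value, not the original, so encoder and decoder stay in sync and quantization error does not accumulate along the scanline.

```python
def lossy_dpcm_encode(pixels, step=8):
    """Quantize the prediction error; predict from reconstructed values."""
    indices, recon_prev = [], 0
    for p in pixels:
        err = p - recon_prev          # prediction error (previous-pixel predictor)
        q = round(err / step)         # coarse quantization of the residual
        indices.append(q)
        recon_prev += q * step        # track what the decoder will reconstruct
    return indices

def lossy_dpcm_decode(indices, step=8):
    """Mirror of the encoder's reconstruction loop."""
    pixels, prev = [], 0
    for q in indices:
        prev += q * step
        pixels.append(prev)
    return pixels
```

Because the loop is closed, every reconstructed sample is within step/2 (here, 4 levels) of the original, regardless of how long the row is; an open-loop design that predicted from the original pixels would let errors drift.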
Compression Standards I: JPEG
Joint Photographic Experts Group (JPEG)
Introduced in 1992, JPEG remains the most prevalent image
compression standard globally. Its widespread adoption is due to its
excellent balance of compression ratio and visual quality for
photographic images.

Algorithm
JPEG's core algorithm involves breaking an image into 8x8 pixel
blocks and applying the Discrete Cosine Transform (DCT) to convert
spatial information into frequency components. These frequency
components are then subjected to a crucial lossy step: quantization,
where less important frequency data is discarded. Finally, entropy
coding (like Huffman coding and Run-Length Encoding) is used to
compress the quantized data.
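The DCT and quantization stages can be illustrated with a naive O(N^4) implementation of the 2-D DCT-II (a sketch only: real encoders level-shift by 128, use fast factorized DCTs, and apply per-frequency quantization tables rather than the single step size assumed here). For a flat 8x8 block, all of the energy lands in the single DC coefficient.

```python
import math

def dct2_8x8(block):
    """Naive 2-D DCT-II of an 8x8 block, the transform JPEG applies per block."""
    N = 8
    def c(u):  # normalization factor
        return math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

def quantize_block(coeffs, q=16):
    """Divide each coefficient by a step and round: the lossy step."""
    return [[round(c / q) for c in row] for row in coeffs]
```

On a uniform block of intensity 100, the DC coefficient is 800 and every AC coefficient is zero, so after quantization the entire block compresses to a single nonzero value followed by a long run of zeros, which run-length and Huffman coding then encode very cheaply.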

Ubiquity & Performance


JPEG images account for approximately 70% of all images found on
the internet. It typically achieves impressive compression ratios
ranging from 10:1 to 50:1, effectively reducing an average 5MB photo
to a compact 200KB-500KB file, making it ideal for web and general-
purpose use.
Compression Standards II: JPEG 2000 & Modern Alternatives

JPEG 2000 (JP2)
Standardized in 2000, JP2 was designed to overcome JPEG's limitations. It uses the Wavelet Transform, offering superior quality at low bitrates and supporting both lossless and lossy compression within a single file. Its scalability features allow for progressive decoding by resolution or quality. It's used in digital cinema (DCP), medical imaging (DICOM), and for archival purposes where higher quality and flexibility are needed.

WebP (Google)
Introduced by Google in 2010, WebP offers significantly smaller file sizes (25-34% smaller than JPEG for equivalent quality) and supports both lossless and lossy compression, as well as animation and alpha channel transparency. It's rapidly gaining traction, particularly for web content, due to its efficiency in improving page load times.

HEIF (Apple)
High Efficiency Image File Format
(HEIF), launched in 2017, is gaining
prominence through its adoption by
Apple for photos and Live Photos. HEIF
boasts up to double the compression
efficiency of JPEG at the same quality,
supporting multiple images in one file,
image sequences, and richer metadata.
It's a versatile container format designed
for modern photography and media.
Conclusion: The Evolving Landscape of Image Compression
Image compression is an indispensable cornerstone of the digital age, enabling the seamless balance between visual quality and data efficiency.
The journey from early, basic algorithms to sophisticated modern standards has been driven by continuous innovation in mathematical models
and computational techniques, constantly pushing the boundaries of what's possible in terms of performance and visual fidelity.

Looking ahead, the field of image compression is poised for even more transformative changes. Future trends include the integration of AI-
driven compression algorithms that can intelligently optimize image data based on content and perceptual metrics, specialized codecs tailored
for emerging media formats like augmented reality (AR) and virtual reality (VR), and increasingly efficient, versatile formats that adapt to diverse
viewing environments. These ongoing advancements are not merely technical improvements; they underpin fluid, high-quality digital
experiences across every platform, ensuring that our visually-driven world remains accessible and efficient.
