OPERATING SYSTEM
What is an Operating System?
An Operating System is system software that manages all the
resources of a computing device.
It acts as an interface between the software and the computer
hardware.
It manages the overall resources and operations of the computer.
It controls and monitors the execution of all other programs that reside on
the computer, including application programs and other
system software.
Examples of Operating Systems are Windows, Linux, and macOS.
What is an Operating System Used for?
As a platform for Application programs: It provides a platform on top
of which other programs, called application programs, can run.
Managing Input-Output units: It allows the computer to manage its
own resources such as memory, monitor, keyboard, and printer.
Management of these resources is required for their effective and fair
utilization.
Multitasking: It manages memory and allows multiple programs to run
in their own address spaces and even communicate with each other through
shared memory (a small sketch of this appears below).
Manages memory and Files: It manages the computer’s main memory
and secondary storage, and it allocates and deallocates memory for
all tasks and applications.
Provides Security: It keeps the system and applications safe through
the authorization process. Thus, the OS provides security
to the system.
For more, refer to Need of Operating Systems.
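As a concrete illustration of multitasking with shared memory, here is a minimal sketch using the POSIX API on a Unix-like system (Linux, macOS, BSD). A parent process asks the OS for a shared page with mmap, forks a child that writes into it, and then reads the value back; the value 42 is purely illustrative.

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/mman.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        /* Ask the OS for one shared, anonymous region of memory. */
        int *shared = mmap(NULL, sizeof *shared,
                           PROT_READ | PROT_WRITE,
                           MAP_SHARED | MAP_ANONYMOUS, -1, 0);
        if (shared == MAP_FAILED) {
            perror("mmap");
            return 1;
        }
        *shared = 0;

        if (fork() == 0) {   /* child: a separate process ...        */
            *shared = 42;    /* ... that writes into the shared page */
            _exit(0);
        }
        wait(NULL);          /* parent waits for the child to finish */
        printf("child wrote: %d\n", *shared);  /* prints 42 */
        munmap(shared, sizeof *shared);
        return 0;
    }

Both processes run in their own address spaces, yet the OS lets them communicate through the one region it mapped as shared.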
Functions of the Operating System
Resource Management: The operating system manages and allocates
memory, CPU time, and other hardware resources among the various
programs and processes running on the computer.
Process Management: The operating system is responsible for
starting, stopping, and managing processes and programs. It also
controls the scheduling of processes and allocates resources to them
(a small sketch of process creation appears after this list).
Memory Management: The operating system manages the computer's
primary memory and provides mechanisms for optimizing memory
usage.
Security: The operating system provides a secure environment for the
user, applications, and data by implementing security policies and
mechanisms such as access controls and encryption.
Job Accounting: It keeps track of time and resources used by various
jobs or users.
File Management: The operating system is responsible for organizing
and managing the file system, including the creation, deletion, and
manipulation of files and directories.
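To make process management concrete, here is a minimal sketch using the POSIX process API (fork, execlp, waitpid) on a Unix-like system; running "ls -l" in the child is just an illustrative choice of program.

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();          /* ask the OS to create a new process */
        if (pid < 0) {
            perror("fork");
            exit(1);
        }
        if (pid == 0) {              /* child: replace itself with a program */
            execlp("ls", "ls", "-l", (char *)NULL);
            perror("execlp");        /* reached only if exec fails */
            exit(1);
        }
        int status;
        waitpid(pid, &status, 0);    /* parent: the OS reports the child's exit */
        printf("child exited with status %d\n", WEXITSTATUS(status));
        return 0;
    }

The OS creates the process, schedules it, and delivers its exit status back to the parent, which is exactly the start-stop-manage cycle described above.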
History of Operating System
An operating system is a type of software that acts as an interface between
the user and the hardware. It is responsible for handling various critical
functions of the computer and utilizing resources very efficiently so the
operating system is also known as a resource manager. The operating
system also acts like a government because just as the government has
authority over everything, similarly the operating system has authority over
all resources. Various tasks that are handled by OS are file management,
task management, garbage management, memory management, process
management, disk management, I/O management, peripherals
management, etc.
Generations of Operating Systems
1940s-1950s: Early Beginnings
o Computers operated without operating systems (OS).
o Programs were manually loaded and run, one at a time.
o The first operating system, GM-NAA I/O (1956), was a batch
processing system that automated job handling.
1960s: Multiprogramming and Timesharing
o Introduction of multiprogramming to utilize CPU efficiently.
o Timesharing systems, like CTSS (1961) and Multics (1969),
allowed multiple users to interact with a single system.
1970s: Unix and Personal Computers
o Unix (1971) revolutionized OS design with simplicity, portability,
and multitasking.
o Personal computers emerged, leading to simpler OSs like
CP/M (1974) and PC-DOS (1981).
1980s: GUI and Networking
o Graphical User Interfaces (GUIs) gained popularity with
systems like Apple Macintosh (1984) and Microsoft Windows
(1985).
o Networking features, like TCP/IP in Unix, became essential.
1990s: Linux and Advanced GUIs
o Linux (1991) introduced open-source development.
o Windows and Mac OS refined GUIs and gained widespread
adoption.
2000s-Present: Mobility and Cloud
o Mobile OSs like iOS (2007) and Android (2008) dominate.
o Cloud-based and virtualization technologies reshape
computing, with OSs like Windows Server and Linux driving
innovation.
AI Integration - (Ongoing)
Over time, artificial intelligence has come into the picture.
Operating systems now integrate AI features such as Siri, Google
Assistant, and Alexa, making them more powerful and efficient in many
ways. Combined with the operating system, these AI features enable
entirely new capabilities such as voice commands, predictive text, and
personalized recommendations.
Advantages of Operating System
The Operating System manages external and internal devices, for example,
printers, scanners, and others.
The Operating System provides interfaces and drivers for proper
communication between the system and hardware devices.
Allows multiple applications to run simultaneously.
Manages the execution of processes, ensuring that the system remains
responsive.
Organizes and manages files on storage devices.
Operating system allocates resources to various applications and
ensures their efficient utilization.
Disadvantages of Operating System
If an error occurs in your operating system, there is a chance that
your data may not be recoverable, so always keep a
backup of your data.
Threats and viruses can attack the operating system at any time,
making it challenging for the OS to keep the system protected from
these dangers.
Learning a new operating system can be time-consuming and
challenging, especially for those used to a particular operating system;
for example, switching from Windows to Linux is difficult.
Keeping an operating system up-to-date requires regular maintenance,
which can be time-consuming.
Operating systems consume system resources, including CPU,
memory, and storage, which can affect the performance of other
applications.
Introduction to Parallel Computing
Before diving into Parallel Computing, let's first take a look at the
background of how computer software performed its computations and why
serial computing fell short for the modern era.
Computer software was conventionally written for serial computing. This
meant that, to solve a problem, an algorithm divides the problem into
smaller instructions. These discrete instructions are then executed on the
Central Processing Unit of a computer one by one. Only after one
instruction finishes does the next one start.
A real-life example of this would be people standing in a queue waiting for
a movie ticket when there is only one cashier. The cashier issues tickets
to the people one by one. The complexity of this situation increases when
there are 2 queues and only one cashier.
So, in short, Serial Computing is the following:
1. In this, a problem statement is broken into discrete instructions.
2. Then the instructions are executed one by one.
3. Only one instruction is executed at any moment of time.
Look at point 3. This was causing a huge problem in the computing
industry, as only one instruction was executed at any moment of
time. This was a huge waste of hardware resources, since only one part of
the hardware would be running for a particular instruction at a time. As
problem statements grew heavier and bulkier, so did the amount
of time needed to execute them. Examples of such processors are the
Pentium 3 and Pentium 4.
Now let's come back to our real-life problem. We could definitely say that
complexity will decrease when there are 2 queues and 2 cashiers giving
tickets to 2 persons simultaneously. This is an example of Parallel
Computing.
Parallel Computing:
It is the use of multiple processing elements simultaneously to solve
a problem. The problem is broken down into instructions that are solved
concurrently, as each processing resource applied to the work operates at
the same time.
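As a minimal sketch of this idea (assuming a POSIX system with pthreads; the array contents and thread count are illustrative), the sum of a large array is split between two threads, each acting like a second cashier working on its own part of the queue. Compile with -lpthread.

    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    #define NTHREADS 2

    static long data[N];

    struct range { int lo, hi; long sum; };

    /* Each "cashier" sums its own part of the "queue" at the same time. */
    void *partial_sum(void *arg) {
        struct range *r = arg;
        r->sum = 0;
        for (int i = r->lo; i < r->hi; i++)
            r->sum += data[i];
        return NULL;
    }

    int main(void) {
        for (int i = 0; i < N; i++) data[i] = 1;

        pthread_t tid[NTHREADS];
        struct range parts[NTHREADS];
        for (int t = 0; t < NTHREADS; t++) {
            parts[t].lo = t * (N / NTHREADS);
            parts[t].hi = (t + 1) * (N / NTHREADS);
            pthread_create(&tid[t], NULL, partial_sum, &parts[t]);
        }

        long total = 0;
        for (int t = 0; t < NTHREADS; t++) {
            pthread_join(tid[t], NULL);
            total += parts[t].sum;  /* combine the concurrent partial results */
        }
        printf("total = %ld\n", total);  /* prints total = 1000000 */
        return 0;
    }

Joining the threads and adding the partial sums gives exactly the total a serial loop would produce, but the two halves are computed at the same time.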
Advantages of Parallel Computing over Serial Computing are as follows:
1. It saves time and money as many resources working together will
reduce the time and cut potential costs.
2. It can be impractical to solve larger problems with Serial Computing.
3. It can take advantage of non-local resources when the local resources
are finite.
4. Serial Computing 'wastes' the potential computing power, thus Parallel
Computing makes better use of the hardware.
Types of Parallelism:
1. Bit-level parallelism -
It is the form of parallel computing based on increasing the
processor's word size. It reduces the number of instructions that the system
must execute in order to perform a task on large-sized data.
Example: Consider a scenario where an 8-bit processor must compute
the sum of two 16-bit integers. It must first sum the 8 lower-order
bits and then add the 8 higher-order bits, thus requiring two instructions to
perform the operation. A 16-bit processor can perform the operation
with just one instruction (a sketch of this appears after the list).
2. Instruction-level parallelism -
Without instruction-level parallelism, a processor can issue at most one
instruction in each clock cycle. Instructions can be re-ordered and grouped
so that they are later executed concurrently without affecting the result of
the program. This is called instruction-level parallelism.
3. Task Parallelism -
Task parallelism employs the decomposition of a task into subtasks
and then allocates each of the subtasks for execution. The processors
perform the execution of the subtasks concurrently (see the
task-parallelism sketch after this list).
4. Data-level parallelism (DLP) -
Instructions from a single stream operate concurrently on several
data items. It is limited by non-regular data manipulation patterns and by
memory bandwidth (see the data-parallelism sketch after this list).
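The bit-level example above can be sketched in C. The function below (its name and the test values are illustrative) emulates how an 8-bit processor adds two 16-bit integers with two 8-bit additions plus a carry, while a 16-bit processor gets the same result in one add.

    #include <stdint.h>
    #include <stdio.h>

    /* Emulate a 16-bit addition on an 8-bit processor:
       two 8-bit adds plus a carry instead of one 16-bit add. */
    uint16_t add16_on_8bit(uint16_t a, uint16_t b) {
        uint8_t lo_a = a & 0xFF, lo_b = b & 0xFF;
        uint8_t hi_a = a >> 8,   hi_b = b >> 8;

        uint8_t lo_sum = (uint8_t)(lo_a + lo_b);         /* first instruction  */
        uint8_t carry  = (lo_sum < lo_a) ? 1 : 0;        /* carry out of low byte */
        uint8_t hi_sum = (uint8_t)(hi_a + hi_b + carry); /* second instruction */

        return (uint16_t)((hi_sum << 8) | lo_sum);
    }

    int main(void) {
        uint16_t a = 0x12FF, b = 0x0001;
        printf("%#06x\n", add16_on_8bit(a, b)); /* 0x1300: two 8-bit instructions */
        printf("%#06x\n", (uint16_t)(a + b));   /* 0x1300: one 16-bit instruction  */
        return 0;
    }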
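For task parallelism, a minimal sketch with POSIX threads: two unrelated subtasks (the function names are hypothetical stand-ins for real work) run concurrently in separate threads. Compile with -lpthread.

    #include <pthread.h>
    #include <stdio.h>

    /* Two different (illustrative) subtasks of a larger task. */
    void *count_words(void *arg) { puts("subtask A: counting words..."); return NULL; }
    void *count_lines(void *arg) { puts("subtask B: counting lines..."); return NULL; }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, count_words, NULL);  /* subtasks start ...   */
        pthread_create(&t2, NULL, count_lines, NULL);  /* ... at the same time */
        pthread_join(t1, NULL);   /* wait for both subtasks to finish */
        pthread_join(t2, NULL);
        return 0;
    }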
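And for data-level parallelism, a minimal sketch using OpenMP (assuming a compiler with OpenMP support; compile with -fopenmp): the same add instruction is applied to many array elements concurrently.

    #include <stdio.h>

    #define N 8

    int main(void) {
        int a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = i; b[i] = 10 * i; }

        /* One instruction stream, many data elements: iterations of
           this loop are divided among the available cores. */
        #pragma omp parallel for
        for (int i = 0; i < N; i++)
            c[i] = a[i] + b[i];

        for (int i = 0; i < N; i++)
            printf("%d ", c[i]);   /* 0 11 22 33 44 55 66 77 */
        putchar('\n');
        return 0;
    }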
Applications of Parallel Computing:
Databases and Data mining.
Real-time simulation of systems.
Science and Engineering.
Advanced graphics, augmented reality, and virtual reality.