Unit I - Introduction
Cloud Computing
1. Cloud - Definition Cloud computing is a model that enables ubiquitous, convenient, on-
demand network access to a shared pool of configurable computing resources (e.g., networks,
servers, storage, applications, and services). These resources can be rapidly provisioned and
released with minimal management effort or service provider interaction. It represents a shift
from traditional on-premises computing to a flexible, service-based model in which resources
are delivered over the internet, offering cost savings, scalability, and improved performance.
Virtual machines also make more efficient use of the hardware hosting them. By running many
virtual machines at once, one server can host many virtual "servers," and a data center
becomes, in effect, a host of data centers able to serve many organizations.
2. Benefits of Cloud Computing
1. Cost Efficiency
2. Scalability
o Ideal for businesses with fluctuating workloads (e.g., e-commerce sites during sales).
3. Reliability
4. Performance
5. Security
6. Environmental Sustainability
7. Improved Collaboration
8. Disaster Recovery
o Cloud ensures that your business data is always accessible, even during local disasters (fire, flood, theft).
3. Usage Scenarios Cloud computing is used in various domains. The following are a few
real-world applications of cloud computing.
1. Online Data Storage: Cloud computing allows data such as files, images, audio, and
video to be stored in cloud storage. Organizations need not set up physical storage
systems to hold huge volumes of business data, which is costly nowadays. As
organizations grow technologically, the data they generate grows over time, and storing
it becomes a problem. Cloud storage addresses this by letting users store and access
data at any time, as required.
2. Backup and Recovery: Cloud vendors keep stored data safe and provide backup
facilities, along with recovery applications for retrieving lost data. In the traditional
approach, backing up data is a complex problem, and recovering lost data is difficult
and sometimes impossible. Cloud computing has made backup and recovery
applications much easier: there is no fear of running out of backup media or of losing
data.
3. Big Data Analysis: The volume of big data is so high that storing it in a traditional
data management system is impossible for most organizations. Cloud computing
resolves this by allowing organizations to store large volumes of data in cloud storage
without worrying about physical storage. Analyzing that raw data and extracting
insights or useful information from it is the next big challenge, as it requires high-quality
data analytics tools. Cloud computing gives organizations a major advantage in both
storing and analyzing big data.
4. Testing and Development: Setting up a platform for development, and then
performing the different types of testing needed to check a product's readiness before
delivery, requires many kinds of IT resources and infrastructure. Cloud computing
provides the easiest approach to development, testing, and even deployment, using the
provider's IT resources at minimal expense. Organizations find this helpful because
they get scalable and flexible cloud services for product development, testing, and
deployment.
5. Cloud Computing in the Medical Field: In the medical field, cloud computing is used
for storing and accessing data, since it allows data to be stored and retrieved over the
internet without any physical setup. It facilitates easier access to, and distribution of,
information among medical professionals and individual patients. Similarly,
information from offsite buildings and treatment facilities, such as labs, doctors making
emergency house calls, and ambulances, can be accessed and updated remotely instead
of waiting for access to a hospital computer.
6. Entertainment Applications: Many people turn to the internet for entertainment, and
cloud computing is the ideal way to reach a varied consumer base. Entertainment
companies therefore reach their target audiences by adopting a multi-cloud strategy.
Cloud-based entertainment provides applications such as online music and video,
online games, video conferencing, and streaming services, and it can reach any device,
whether a TV, mobile phone, set-top box, or any other form. This new form of
entertainment is called On-Demand Entertainment (ODE).
As the cloud market grows rapidly, new services appear every day. Other applications of
cloud computing include social applications, management applications, business
applications, art applications, and many more. In the future, cloud computing will touch
many more sectors by providing more applications and services.
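To make the online data storage scenario concrete, here is an illustrative Python sketch. It is not tied to any real account: the bucket and file names are made up, and a stub stands in for a real cloud SDK client (in practice, boto3's S3 client exposes a put_object call of this shape).

```python
def store_object(client, bucket, key, data):
    """Store raw bytes in a cloud object store via an S3-style client.

    `client` is any object exposing put_object(Bucket=..., Key=..., Body=...);
    in a real deployment it could be boto3.client("s3").
    """
    client.put_object(Bucket=bucket, Key=key, Body=data)
    return f"s3://{bucket}/{key}"

class StubS3Client:
    """Stand-in for a real SDK client, so the sketch runs anywhere."""
    def __init__(self):
        self.stored = {}

    def put_object(self, Bucket, Key, Body):
        self.stored[(Bucket, Key)] = Body

client = StubS3Client()
uri = store_object(client, "company-archive", "reports/q1.pdf", b"...")
print(uri)  # s3://company-archive/reports/q1.pdf
```

The point of the sketch is the usage model: the application hands data to the provider and gets back a location, with no physical storage system to set up.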
4. History of Cloud Computing Cloud computing evolved from the concepts of grid
computing and utility computing. In the 1960s, John McCarthy suggested computing could be
organized as a public utility. Salesforce introduced the first SaaS product in 1999. Amazon
launched AWS in 2006, marking a major milestone by offering IT infrastructure services.
Google, Microsoft, and IBM soon followed, shaping the modern cloud era. The development
of virtualization technologies further accelerated cloud adoption.
In 1963, DARPA (the Defense Advanced Research Projects Agency) presented MIT with $2
million for Project MAC. The funding included a requirement for MIT to develop technology
allowing for a “computer to be used by two or more people, simultaneously.” In this case, one
of those gigantic, archaic computers using reels of magnetic tape for memory became the
precursor to what has now become collectively known as cloud computing. It acted as a
primitive cloud with two or three people accessing it. The word “virtualization” was used to
describe this situation, though the word’s meaning later expanded.
In 1969, J. C. R. Licklider helped develop the ARPANET (Advanced Research Projects Agency
Network), a “very” primitive version of the Internet. JCR, or “Lick,” was both a psychologist
and a computer scientist, and promoted a vision called the “Intergalactic Computer Network,”
in which everyone on the planet would be interconnected by way of computers and able to
access information from anywhere. (What could such an unrealistic, impossible-to-pay-for
fantasy of the future look like?) The Intergalactic Computer Network, otherwise known as the
internet, is necessary for access to the cloud.
The meaning of virtualization began shifting in the 1970s and now describes the creation of a
virtual machine, which acts like a real computer with a fully functional operating system. The
concept of virtualization has evolved with the internet, as businesses began offering “virtual”
private networks as a rentable service. The use of virtual computers became popular in the
1990s, leading to the development of the modern cloud computing infrastructure.
The cloud gained popularity as companies gained a better understanding of its services and
usefulness. In 1999, Salesforce became a popular example of using cloud computing
successfully. They used it to pioneer the idea of using the Internet to deliver software programs
to the end users. The program (or application) could be accessed and downloaded by anyone
with Internet access. Businesses could purchase the software in an on-demand, cost-effective
manner without leaving the office.
In 2002, Amazon introduced its web-based retail services. It was the first major business to
treat using only 10% of its computing capacity (which was commonplace at the time) as a
problem to be solved. The cloud computing infrastructure model allowed Amazon to use its
computing capacity more efficiently. Soon after, other large organizations followed its
example.
In 2006, Amazon launched Amazon Web Services, which offers online services to other
websites or clients. One of Amazon Web Services’ sites, called Amazon Mechanical Turk,
provides a variety of cloud-based services, including storage, computation, and “human
intelligence.” Another of Amazon Web Services’ sites is the Elastic Compute Cloud (EC2),
allowing individuals to rent virtual computers and use their own programs and applications.
In the same year, Google launched Google Docs services. Google Docs was originally based
on two separate products, Google Spreadsheets and Writely. Google purchased Writely, which
allows renters to save documents, edit documents, and transfer them into blogging systems.
(These documents are compatible with Microsoft Word.) Google Spreadsheets (acquired from
2Web Technologies in 2005) is an Internet-based program allowing users to develop, update,
and edit spreadsheets and to share data online. An Ajax-based program is used, which is
compatible with Microsoft Excel. The spreadsheets can be saved in an HTML format.
In 2007, IBM, Google, and several universities joined forces to develop a server farm for
research projects needing both fast processors and huge data sets. The University of
Washington was the first to sign up and use resources provided by IBM and Google. Carnegie
Mellon University, MIT, Stanford University, the University of Maryland, and the University
of California at Berkeley quickly followed suit. The universities immediately realized that
computer experiments could be done faster and for less money with IBM and Google
supporting their research. Since much of the research focused on problems IBM and Google
had interests in, the two companies also benefitted from the arrangement. 2007 was also the
year when Netflix launched its streaming video service using the cloud, providing support for
the practice of "binge-watching."
In 2008, Eucalyptus offered the first AWS API-compatible platform for deploying private
clouds. In the same year, the OpenNebula project released the first open-source software for
deploying private and hybrid clouds. Many of its most innovative features focused on the
needs of major businesses.
Although private clouds were introduced in 2008, they were still undeveloped and not very
popular. Concerns about poor security in public clouds were a strong driving force promoting
the use of private clouds. By 2010, companies like AWS, Microsoft, and OpenStack had
developed private clouds that were fairly functional. (2010 was also when OpenStack made an
open-source, free, do-it-yourself cloud, which became very popular, available to the general
public.)
The concept of hybrid clouds was introduced in 2011. A hybrid cloud requires a fair amount
of interoperability between a private and a public cloud, along with the ability to shift
workloads back and forth between the two. At the time, very few businesses had systems
capable of doing this, though many wanted to because of the tools and storage public clouds
could offer.
In 2011, IBM introduced the IBM SmartCloud framework in support of Smarter Planet (a
cultural thinking project). Then Apple launched iCloud, which focuses on storing personal
information (photos, music, videos, etc.). Also during this year, Microsoft began advertising
the cloud on television, making the general public aware of its ability to store photos or
videos with easy access.
Oracle introduced the Oracle Cloud in 2012, offering the three basics for business: IaaS
(Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), and SaaS (Software-as-a-
Service). These "basics" quickly became the norm, with some public clouds offering all of
these services while others focused on only one. Software-as-a-Service became quite popular.
CloudBolt was founded in 2012. This company gets credit for developing a hybrid cloud
management platform that helped organizations build, deploy and manage both private and
public clouds. They resolved the interoperability problems between public and private clouds.
Multi-clouds began when organizations started using SaaS providers for certain services, such
as human resources, customer relations management, and supply chain management. This
started becoming popular in roughly 2013-2014. While this use of SaaS providers is still quite
popular, a philosophy of using multiple clouds for their specific services and advantages has
developed. This philosophy includes not becoming trapped into using a specific cloud because
of “interoperability problems.”
By 2014, cloud computing had developed its basic features, and security had become a major
concern. Cloud security has become a fast-growing service, because of its importance to
customers. Cloud security has advanced significantly in the last few years, and now provides
protection comparable to traditional IT security systems. This includes the protection of critical
information from accidental deletion, theft, and data leakage. Having said that, security is, and
may always be, the primary concern of most cloud users.
Currently, application developers are among the primary users of cloud services. In 2016 the
cloud began to shift from developer-friendly to developer-driven, as application developers
began taking full advantage of the tools the cloud had available. Many services strive to be
developer-friendly to draw more customers, and, realizing the need and the potential for
profit, cloud vendors developed (and continue to develop) the tools application developers
want and need.
5. Cloud Architecture
Cloud architecture consists of three main components:
1. Frontend
2. Backend
3. Cloud-Based Delivery and Network
1. Frontend
The frontend of the cloud architecture refers to the client side of the cloud computing system.
It contains all the user interfaces and applications that the client uses to access cloud
computing services and resources, for example, a web browser used to access the cloud
platform.
2. Backend
The backend refers to the cloud itself, which is used by the service provider. It contains the
resources, manages them, and provides security mechanisms. It also includes huge storage,
virtual applications, virtual machines, traffic control mechanisms, deployment models, and
more.
• Applications: Back-end apps are the software or platforms that deliver the client
service requests on the front-end.
• Cloud computing service: The back-end service provides the utility of the cloud
architecture and manages the accessibility of cloud-based resources (such as cloud-based
storage services, application development services, web services, security services, and more).
• Cloud storage: Cloud storage in the back-end refers to the flexible and scalable storage
service and management of data stored to carry out applications.
• Security tools: Security tools provide back-end security (also referred to as server-side
security) against potential cyberattacks or system failures. Virtual firewalls protect web
applications, prevent data loss, and support backup and disaster recovery. Back-end
components include encryption, access restriction, and authentication protocols to protect
data from breaches.
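A minimal sketch of the access-restriction and authentication idea described above: the backend checks a token before serving a request. The token values and resource names are made up for illustration; a real system would use an identity service, not a hard-coded table.

```python
# Hypothetical token table; in practice tokens are issued at login by an
# identity service and verified cryptographically.
VALID_TOKENS = {"token-abc123": "alice"}

def handle_request(token, resource):
    """Back-end request handler: authenticate first, then serve the resource."""
    user = VALID_TOKENS.get(token)
    if user is None:
        return {"status": 401, "body": "authentication required"}
    return {"status": 200, "body": f"{resource} contents for {user}"}

print(handle_request("token-abc123", "invoices"))  # served (status 200)
print(handle_request("bad-token", "invoices"))     # rejected (status 401)
```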
• Public Cloud: Services offered over the public internet (e.g., AWS, Azure).
• Private Cloud: Infrastructure reserved for a single user, department, or organization.
• Hybrid Cloud: Combines public and private clouds, allowing data portability.
• Community Cloud: Infrastructure shared by a select group of organizations with common interests.
They vary in their levels of accessibility, security risks, costs, and other factors, and each one
carries its own advantages and disadvantages. Selecting the type that aligns with the business
interests is crucial to successful cloud operations, so make sure to understand the differences
before jumping in.
The most common type of cloud infrastructure, public cloud computing consists of a large
number of users accessing a host of remote servers connected to form a single all-
encompassing network.
In the public cloud model, you’re allotted a portion of your CSP’s cloud infrastructure, with
your data and other digital assets being stored within the environment. Think of it as renting
someone else’s digital space and allowing them to take care of the maintenance.
Public cloud computing offers several advantages, primarily due to its versatility and scale.
Large CSPs such as Microsoft and Amazon have the capacity to host a large number of users
simultaneously, regardless of their clients' bandwidth or demands.
The result is that public cloud computing models can likely provide the flexibility and scope
of services that individual users will need. The mass tenancy coupled with threat actors’
intensified interest in targeting public clouds does make security a concern, but top-level CSPs
should have the cybersecurity resources needed to ward off threats. And since threat actors are
likely to attack both public and private clouds, choosing an established CSP may be the best
security strategy available.
While public cloud infrastructure allows a large number of users to access the same remote IT
environment, private clouds are reserved for individual users, departments, or organizations.
A public cloud becomes a private cloud once it's secured behind a firewall or some other
restrictive measure. Some CSPs have even begun installing bare-metal infrastructure
components on-site, further blurring the line between the public and private cloud models.
The main benefit of the private cloud model is that organizations own their entire IT
environment, so all components are reserved just for them. This can potentially mean greater
security since there are fewer users accessing your cloud, but that holds true only if you
implement the cloud security practices needed to minimize your attack surface.
The other benefit of having all of your resources devoted solely to your operations is that your
cloud will have the stability it needs to meet your fluctuating usage and demands. You won’t
run the risk of bumping into latency or downtime issues, as you could when a large number of
users all flock to a public cloud at once, so private cloud users experience greater business
continuity since their stack is always devoted to them.
In an effort to deliver the best of both worlds, hybrid clouds combine the devoted resources of
a private cloud with the managed infrastructure that a public cloud has to offer. CSPs offering
a hybrid cloud structure implement a “pay as you go” business model, through which
companies can keep their important data within their own on-prem infrastructure and send any
excess to their public cloud.
The advantage of hybrid cloud environments is that they allow companies to maintain their
own baseline security and resources while giving them the chance to scale up whenever they
need to. If their bandwidth or data storage needs grow seasonally or scale over time, companies
may pay for additional services while preserving their own private environment — and they
can scale back down if their needs ever drop in the future.
Another advantage is that companies using the hybrid cloud model are charged only for the
services they use. Instead of being tied down to an arbitrary service level, organizations can
purchase all of the services they need and nothing they don’t — all while preserving their
privacy.
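The "scale up when needed, scale back down later" behaviour of a hybrid cloud can be sketched as a simple workload-splitting rule, often called cloud bursting. The capacity and demand numbers below are illustrative only.

```python
def split_workload(demand, private_capacity):
    """Return (private_load, public_burst) for a hybrid 'pay as you go' setup:
    the private cloud absorbs demand up to its capacity, and any excess
    bursts to the public cloud, where it is billed per use."""
    private_load = min(demand, private_capacity)
    public_burst = max(0, demand - private_capacity)
    return private_load, public_burst

# Seasonal spike: private capacity of 100 units, demand of 130 units.
print(split_workload(130, 100))  # (100, 30): 30 units burst to the public cloud
print(split_workload(80, 100))   # (80, 0): demand fits on-prem, no public charges
```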
A subset of the hybrid cloud model, community clouds consist of a smaller number of users
who have agreed to share the responsibilities and resources corresponding to their cloud. These
users often have similar attributes, such as a shared industry or regulations, so they band
together to manage their own cloud environment instead of sharing their digital space with just
anyone.
Maintaining your own cloud can be a resource-intensive task. Community clouds lighten this
load by distributing the responsibility among users with common interests, simplifying
everyone’s cloud maintenance efforts in the process.
Another benefit is that community clouds let you share your cloud with a select number of
trusted users, so you enjoy better security than what public clouds can offer. These
organizations often share common industry standards, making them likely to implement similar
cloud security policies, which can make your own compliance efforts that much easier.
• SaaS (Software as a Service): Software delivered over the internet (e.g., Gmail).
• PaaS (Platform as a Service): Platform for developing, testing, and deploying apps
(e.g., Google App Engine).
• IaaS (Infrastructure as a Service): Virtualized computing resources such as servers,
storage, and networking (e.g., Amazon EC2).
• FaaS (Function as a Service): Event-driven code execution without server management
(e.g., AWS Lambda).
Software-as-a-Service (SaaS)
Users or organizations can access the software over the internet using any browser. There
is no need to purchase costly software or install bulky software on your own system
in order to use it. SaaS is also known as "On-Demand Software."
Advantages of SaaS
Here are some of the top advantages of using the SaaS cloud service model:
• Simple deployment: Using the SaaS cloud service model, users or organizations
can use bulky and costly software without purchasing or downloading it on their
systems.
• Saves money: Users and organizations don’t have to purchase or maintain the
software, saving them a ton of money.
Disadvantages of SaaS
Here are some of the top disadvantages of using the SaaS cloud service model:
• Less control: The software is hosted and managed entirely by the provider, limiting
the customer's customization and configuration options.
In short, customers should choose this cloud service model when they want to access
software without purchasing or managing it on their own system. It is best suited for
customers who only want to use different software without any hassle.
Infrastructure-as-a-Service (IaaS)
IaaS provides virtualized computing resources, such as servers, storage, and networking,
over the internet. Customers rent the infrastructure and run their own operating systems
and applications on top of it.
Advantages of IaaS
Here are some of the top advantages of using the IaaS cloud service model:
• Highly flexible: Users or organizations using IaaS can quickly scale and provision
computing resources as per their requirements.
Disadvantages of IaaS
Here are some of the top disadvantages of using the IaaS cloud service model:
• More management overhead: Customers are responsible for managing, patching,
and securing the operating systems and applications running on the rented infrastructure.
In short, customers should choose this cloud service model when they want more
flexibility and control over their computing infrastructure. It is best suited for customers
who don't want to invest in or maintain their own data centers.
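To illustrate the on-demand provision-and-release cycle that IaaS exposes, here is a toy, self-contained model. It is not a real provider API; the instance sizes, IDs, and capacity limit are invented for the sketch.

```python
class IaasPool:
    """Toy model of an IaaS offering: a fixed pool of virtual machines that
    tenants can provision and release on demand."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.allocated = {}   # vm_id -> instance size
        self._next_id = 0

    def provision(self, size):
        if len(self.allocated) >= self.capacity:
            raise RuntimeError("pool exhausted")
        self._next_id += 1
        vm_id = f"vm-{self._next_id}"
        self.allocated[vm_id] = size
        return vm_id

    def release(self, vm_id):
        # Freed capacity becomes reusable immediately, giving the
        # "rapidly provisioned and released" behaviour from the definition.
        del self.allocated[vm_id]

pool = IaasPool(capacity=2)
web = pool.provision("small")
db = pool.provision("large")
pool.release(web)                 # scale down when demand drops
print(db, len(pool.allocated))    # vm-2 1
```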
Platform-as-a-Service (PaaS)
PaaS provides a ready-made platform, including the runtime, tools, and infrastructure,
for developing, testing, and deploying applications. As in the SaaS cloud service model,
the cloud provider manages and maintains the infrastructure so that the customer does
not have to worry about it, leaving more time for productive tasks.
Advantages of PaaS
Here are some of the top advantages of using the PaaS cloud service model:
• Simple deployment: Using the PaaS cloud service model, users or organizations
can focus on developing their applications without worrying about the underlying
infrastructure.
• Saves money: Users and organizations don’t have to purchase or maintain the tools
or platforms, saving them a ton of money.
Disadvantages of PaaS
Here are some of the top disadvantages of using the PaaS cloud service model:
• Less control: Platform is managed and maintained by a cloud provider, limiting the
customer’s control over it compared to using their own data centers or platform.
In short, customers should choose this cloud service model when they want to focus on
developing and deploying their own applications without having to worry about
managing the underlying platform. It is best suited for customers who want a simplified
and streamlined development environment while having better control and
customization options compared to SaaS.
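To make the PaaS division of responsibility concrete: the customer supplies only the application code, such as the minimal WSGI app below, while the platform supplies the web server, runtime, and scaling. This is an illustrative sketch, not tied to any particular PaaS.

```python
# A minimal WSGI application of the kind a Python PaaS could host.
# The platform, not the developer, runs the server that calls it.
def application(environ, start_response):
    body = b"Hello from a PaaS-hosted app"
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

# For local testing only, the standard library provides a reference server:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, application).serve_forever()
```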
Function-as-a-Service (FaaS)
FaaS allows users to execute code in response to events without managing servers. It is
also called serverless computing.
Advantages:
• No server management
• Reduces time-to-market
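A FaaS function is typically just an event handler. The sketch below follows the AWS Lambda handler(event, context) convention for Python; the event fields are invented for illustration, and locally we simply call the function directly instead of letting a platform invoke it.

```python
def handler(event, context=None):
    """Event-driven function: the platform invokes this on each event,
    so the developer never provisions or manages a server."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# The platform would pass the event; locally we can invoke it directly:
print(handler({"name": "cloud"}))  # {'statusCode': 200, 'body': 'Hello, cloud!'}
```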
The major cloud service providers include:
• AWS (Amazon Web Services): The pioneer and global leader in cloud computing.
• Microsoft Azure: Cloud platform tightly integrated with Microsoft's enterprise software.
• Google Cloud Platform (GCP): Known for strong data analytics, machine learning, and AI capabilities.
• IBM Cloud: Hybrid cloud and enterprise services, including AI via Watson.
• Oracle Cloud: Enterprise-grade cloud services for databases and apps.
These companies dominate the global cloud market with continuous innovations.
1. Amazon Web Services (AWS)
Amazon Web Services, launched in 2006 by Amazon, is the pioneer and global leader in cloud
computing. AWS started with basic infrastructure services like EC2 and S3 and has since grown
into a full-fledged cloud ecosystem offering over 200 fully-featured services. It supports
compute, storage, networking, database, machine learning, analytics, and security services.
AWS's massive global infrastructure, with availability in multiple regions and zones, ensures
low latency and high fault tolerance. It has become the preferred platform for startups,
enterprises, and governments due to its reliability, scalability, and innovation. Advantages
include pay-as-you-go pricing, strong support for hybrid architecture, and an extensive
developer community. Enterprises choose AWS for its performance, compliance standards, and
continuous rollout of new features.
2. Microsoft Azure
Microsoft Azure was officially launched in 2010 as a cloud computing platform and
infrastructure created by Microsoft for building, deploying, and managing applications. Azure
integrates seamlessly with Microsoft software like Windows Server, Active Directory, and
Office 365, making it popular among enterprise users. Azure offers services across IaaS, PaaS,
and SaaS, supporting virtual machines, AI, analytics, IoT, and DevOps tools. Its hybrid cloud
model with tools like Azure Stack allows businesses to deploy services across public and
private environments. Azure’s global network of data centers ensures low latency and reliable
uptime. Its advantages include enterprise-grade security, support for open-source platforms,
built-in disaster recovery, and easy migration for Windows-based systems.
3. Google Cloud Platform (GCP)
Google Cloud Platform (GCP), introduced in 2008, is known for its strong data analytics,
machine learning, and AI capabilities. Google uses the same infrastructure for GCP that powers
its popular services like YouTube and Gmail. GCP provides tools like BigQuery for big data
analytics, TensorFlow for ML, and App Engine for PaaS-based application development. GCP
has gained popularity in academia, startups, and digital-first companies that require strong AI
and data capabilities. Its strengths lie in pricing models (sustained use discounts), high-speed
fiber networks, security layers, and container management with Kubernetes, which Google
invented. Advantages include innovation in AI/ML, sustainability with carbon-neutral data
centers, and ease of integration with open-source tools.
4. IBM Cloud
IBM Cloud combines platform as a service (PaaS) with infrastructure as a service (IaaS) and
offers a wide range of services including AI (via Watson), blockchain, IoT, and DevOps. IBM
has a strong presence in hybrid cloud solutions and enterprise-level services. Its acquisition of
Red Hat has further strengthened its open-source and Kubernetes offerings. IBM Cloud
supports multicloud environments and helps large organizations in regulated industries manage
complex workloads. The company also focuses on compliance, privacy, and trust. Advantages
include deep integration with legacy systems, AI-driven analytics, flexible deployment models,
and strong support for enterprise security and governance.
5. Oracle Cloud
Oracle Cloud is known for its robust database solutions and enterprise software integration.
Launched in the early 2010s, it offers services in IaaS, PaaS, SaaS, and DaaS (Data as a
Service). It is tailored for enterprise applications such as Oracle ERP, HCM, and CRM. Oracle
Cloud Infrastructure (OCI) provides high-performance computing, autonomous databases, and
AI-powered analytics. It supports hybrid and multicloud deployments with strong security and
compliance. Advantages include deep integration with Oracle software, advanced database
tuning features, high availability, and industry-specific cloud applications. Oracle Cloud is
often favored by financial services, healthcare, and government organizations for its
performance and enterprise readiness.
Cloud computing has revolutionized IT infrastructure and software delivery, but it also brings
several challenges. Understanding these issues is essential for organizations to effectively plan,
manage, and secure their cloud environments.
• Data Security and Privacy - Storing data in the cloud raises concerns about
unauthorized access, data breaches, and loss of sensitive information. Cloud providers
store data on shared infrastructure, which increases the risk. For example, in 2019,
Capital One experienced a massive data breach when over 100 million customer records
stored in AWS were accessed due to a misconfigured firewall.
• Downtime and Availability - Cloud services can suffer from outages or service
interruptions, impacting business operations. This can result in data inaccessibility or loss
of productivity. For example, in 2020 Google Cloud experienced an outage that affected
services like Gmail, YouTube, and Google Docs globally.
• Vendor Lock-in - Migrating from one cloud provider to another can be complex due to
differences in platforms, APIs, and tools. This leads to dependency on a single provider.
For example, a company using Azure-specific functions may find it difficult and costly
to move to AWS or GCP due to code and configuration incompatibility.
• Compliance and Legal Issues - Different countries have different data protection laws
(e.g., GDPR in the EU). Ensuring cloud services comply with legal requirements is
essential but challenging. For instance, companies storing EU citizens' data must comply
with GDPR; a US-based company using a non-compliant cloud provider may face legal
action and fines.
• Limited Control - In cloud environments, users do not have direct control over the
infrastructure. This lack of visibility and control can affect performance tuning and
customization. For example, an organization relying on AWS Lambda (serverless) has
limited ability to control the execution environment and performance parameters.
• Data Loss and Recovery - Although cloud providers often offer backup and recovery
services, accidental deletions, corrupted backups, or misconfigurations can still lead to
data loss. For example, Dropbox once accidentally deleted files due to a bug in its sync
mechanism. Though most files were recoverable, some were lost permanently.
• Cost Management - Cloud services follow a pay-as-you-go model, but without proper
monitoring, costs can spiral due to unnecessary resource usage or unexpected traffic
spikes. For example, a startup left unused virtual machines running in AWS, leading to a
large monthly bill despite low actual usage.
• Latency and Bandwidth Issues - Cloud applications rely on internet connectivity.
Network latency or low bandwidth can degrade the performance of time-sensitive
applications. For example, gaming companies using cloud platforms to stream high-
performance games (like Google Stadia) struggle with latency issues in areas with poor
internet.
• Shared Technology Vulnerabilities - Cloud environments are multi-tenant. A
vulnerability in the shared hardware, hypervisor, or network could affect multiple
customers. For example, the Spectre and Meltdown CPU vulnerabilities affected cloud
platforms globally in 2018, requiring major architectural changes.
• Insider Threats - Employees of the cloud provider or the client organization may misuse
access rights, leading to data theft or sabotage. For example, a rogue system
administrator at a cloud provider might access sensitive customer data without detection
if security protocols are weak.
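One common mitigation for the vendor lock-in challenge listed above is to code against a small provider-neutral interface rather than provider-specific APIs. The two backends below are stand-ins for real SDKs, invented purely for illustration.

```python
class S3Backend:
    """Stand-in for an AWS-style storage SDK (not real boto3 calls)."""
    def save(self, key, data):
        return f"saved {key} to AWS-style storage"

class AzureBlobBackend:
    """Stand-in for an Azure-style storage SDK (not the real Azure SDK)."""
    def save(self, key, data):
        return f"saved {key} to Azure-style storage"

def archive(storage, key, data):
    # Application code depends only on the .save() interface, so switching
    # providers means swapping one object, not rewriting the application.
    return storage.save(key, data)

print(archive(S3Backend(), "log.txt", b"..."))
print(archive(AzureBlobBackend(), "log.txt", b"..."))
```

The design choice here is indirection: the narrower the interface the application depends on, the cheaper a future migration between providers becomes.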
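The pay-as-you-go arithmetic behind the cost-management challenge above (such as forgotten idle VMs) is easy to sketch; the hourly rates and hours below are made-up illustrative values, not real provider prices.

```python
def monthly_cost(instances):
    """Estimate a pay-as-you-go bill as the sum of hourly_rate * hours_running
    over all instances, regardless of whether they did useful work."""
    return sum(rate * hours for rate, hours in instances)

# Two forgotten VMs left running 24/7 for a 30-day month (720 hours each),
# at hypothetical rates of $0.10/h and $0.20/h.
idle = [(0.10, 720), (0.20, 720)]
print(f"${monthly_cost(idle):.2f}")  # $216.00 despite near-zero actual usage
```

This is why cost monitoring and shutting down unused resources matter: billing follows hours provisioned, not hours used.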
• CloudSim is not a deployment tool, but a simulation toolkit for research and testing.
10. Eucalyptus
Eucalyptus, short for Elastic Utility Computing Architecture for Linking Your Programs to
Useful Systems, is an open-source software platform that enables the creation of private and
hybrid cloud environments. Developed at the University of California, Santa Barbara
(UCSB) in 2008, it was among the earliest platforms to offer cloud infrastructure services that
mimic the features of Amazon Web Services (AWS), making it a cost-effective and secure
solution for organizations seeking cloud-like benefits within their own data centers.
At the hardware level, Node Controllers (NCs) are installed on physical servers and are
responsible for running the virtual machines. They interact with the underlying hypervisor
(such as KVM, Xen, or VMware) to manage VM lifecycle operations like start, stop, or
terminate. Walrus, another important component of Eucalyptus, provides object storage,
acting like AWS S3 by storing files, snapshots, and images. Additionally, the Storage
Controller (SC) offers block storage similar to AWS Elastic Block Store (EBS), used for
attaching persistent storage volumes to virtual machines.
One of the major benefits of Eucalyptus is that it provides fine-grained control over cloud
infrastructure, making it ideal for organizations that need to comply with strict security or data
residency regulations. Since the infrastructure resides on-premises, users can ensure
compliance while still enjoying the elasticity of cloud computing. It also supports multi-
tenancy, enabling multiple users or departments to use the same infrastructure independently.
Eucalyptus integrates with various identity management and security tools, and it offers both
command-line tools and web-based dashboards for cloud administration. Its open-source
nature allows for customization and integration with third-party systems, including monitoring
and orchestration tools.
However, with the rapid evolution of cloud technologies, Eucalyptus faced stiff competition
from platforms like OpenStack, CloudStack, and fully managed public clouds. Eventually,
Eucalyptus was acquired by Hewlett-Packard (HP) in 2014 to enhance its Helion cloud
product line. Although not as widely used today, Eucalyptus remains significant in cloud
computing history for its role in pioneering private cloud solutions.
In summary, Eucalyptus offers a reliable and AWS-compatible private cloud platform, allowing
organizations to manage their infrastructure efficiently while maintaining full data sovereignty.
Its modular architecture, API compatibility, and control mechanisms make it a strong choice
for hybrid cloud strategies and academic or research-oriented deployments.
11. Nimbus
Nimbus is an open-source toolkit, developed at the University of Chicago, that turns
physical clusters into Infrastructure-as-a-Service (IaaS) clouds aimed at scientific
computing, while also enabling integration with existing grid infrastructures. It acts
as a bridge between cloud computing and grid computing, which is vital for researchers
who rely on shared scientific infrastructure.
One of the unique aspects of Nimbus is its focus on user control and customization. Unlike
commercial cloud platforms, Nimbus allows users to upload their own virtual machine images
(VMIs), which can be pre-configured with specific software environments. This feature is
highly useful in scientific computing, where reproducibility and environment control are
crucial. Researchers can deploy their own VMIs with specific libraries, tools, and
configurations, and run experiments with minimal dependency on the host system.
Nimbus supports standard interfaces such as the Amazon EC2 API, WSRF (Web Services
Resource Framework), and OGSA (Open Grid Services Architecture). This ensures
compatibility and interoperability with other cloud and grid systems. Additionally, it includes
a component called the Context Broker, which is used for dynamic contextualization—
configuring a VM after it is launched. This allows users to automate setup tasks like setting
passwords, configuring IP addresses, and initializing applications.
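The idea behind contextualization can be sketched in a few lines: a freshly booted VM fetches a context document and applies its settings. The sketch below is purely illustrative (the document format and keys are hypothetical, not the Context Broker's actual protocol):

```python
# Toy sketch of post-launch contextualization, the idea behind Nimbus's
# Context Broker. The document format and keys here are hypothetical.
import json

# A context document the broker might deliver to a newly launched VM:
context_doc = json.dumps({
    "hostname": "worker-01",
    "ip": "10.0.0.12",
    "services": ["start-scheduler"],
})

def contextualize(doc: str) -> dict:
    """Parse a context document and return the settings a boot-time
    agent would apply (hostname, network config, services to start)."""
    ctx = json.loads(doc)
    return {
        "hostname": ctx["hostname"],
        "ip": ctx["ip"],
        "started": list(ctx.get("services", [])),
    }

print(contextualize(context_doc))
```

In a real deployment the agent would write these values into the guest OS rather than return them, but the flow is the same: launch first, configure afterwards.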
Key components of Nimbus include:
• Workspace Service: The main IaaS component responsible for managing the
life cycle of virtual machines (start, stop, terminate).
• Cloud Client: A command-line tool that users utilize to interact with the Nimbus
system.
Nimbus also places strong emphasis on security and access control, leveraging tools like
X.509 certificates, grid security infrastructure (GSI), and other authentication mechanisms
common in scientific computing environments. This ensures that only authorized users can
deploy and manage virtual machines, making it a secure option for multi-user systems.
While Nimbus is not as widely adopted in enterprise environments as platforms like OpenStack
or Eucalyptus, its value lies in its academic and research orientation. It helped pave the way
for academic institutions to experiment with cloud infrastructure, long before commercial
cloud offerings became dominant. Many of its concepts—like customizable VMs, context-
aware deployment, and integration with scientific grids—have influenced the design of more
modern research-oriented cloud solutions.
In recent years, the development and usage of Nimbus have declined due to the emergence of
more comprehensive platforms like OpenStack. However, Nimbus remains an important
milestone in the evolution of cloud computing for scientific research. It provided a lightweight,
flexible, and open alternative to proprietary cloud services, empowering researchers to create
their own virtual laboratories in the cloud.
12. OpenNebula
OpenNebula is an open-source cloud computing platform that enables the creation and
management of private, public, and hybrid clouds. It is widely recognized for its simplicity,
flexibility, and powerful features that allow organizations to manage virtualized data centers
and IaaS infrastructures efficiently. Developed initially by the OpenNebula.org community
and first released in 2008, it has been widely adopted in enterprise and research environments.
Key components of OpenNebula include:
• Frontend (OpenNebula Core): Acts as the central management point, handling all
operations like VM deployment, monitoring, and storage.
• Sunstone: A web-based GUI that allows administrators and users to manage their cloud
resources visually.
One of OpenNebula’s key strengths is its simplicity. It has a low learning curve and can be
deployed quickly even in smaller organizations. It is also known for being lightweight, which
makes it ideal for edge computing environments where resources are constrained.
Additionally, OpenNebula supports multi-tenancy, resource quotas, user roles, and virtual
networks, making it highly suitable for academic institutions and enterprise IT departments. It
also emphasizes security and customization, allowing administrators to tightly control user
access and configure the system to meet specific organizational policies.
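To give a flavour of how resources are described, OpenNebula defines virtual machines through plain-text templates. The fragment below is a minimal, illustrative sketch (the image and network names are placeholders, and available attributes vary by deployment):

```
NAME   = "test-vm"
CPU    = 1
MEMORY = 1024                        # in MB
DISK   = [ IMAGE = "ubuntu-image" ]  # disk backed by a registered image
NIC    = [ NETWORK = "private-net" ] # attach to a virtual network
```

Such a template is registered and instantiated with OpenNebula's command-line tools (e.g., onetemplate create and onetemplate instantiate) or through the Sunstone GUI.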
13. CloudSim
CloudSim is an open-source, Java-based simulation framework developed at the CLOUDS
Laboratory, University of Melbourne. It enables researchers to simulate cloud data
centers, hosts, virtual machines (VMs), network behavior, and cloud service brokers.
It provides a layered architecture that separates
simulation from actual implementation, which allows experimentation without the need to use
costly cloud infrastructure or hardware.
Key components of CloudSim include:
• Datacenter Broker: Manages the submission of user requests (cloudlets) and
schedules them to VMs.
The major benefit of CloudSim is that it saves time, cost, and resources, as users do not need
actual infrastructure for testing. However, its main limitation is that it is a simulation
environment—meaning that it may not fully capture all the complexities and failures of real-
world systems.
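The core idea of such a simulation can be sketched briefly: a broker assigns cloudlets (tasks) to VMs under some policy, and finish times follow from each VM's processing capacity. The sketch below is NOT the CloudSim API (which is Java); the names and the round-robin policy are illustrative only:

```python
# Toy sketch of the idea behind CloudSim-style simulation: a broker assigns
# cloudlets (tasks) to VMs and we compute each VM's finish time. The names
# and round-robin policy here are illustrative, not CloudSim's actual API.
from dataclasses import dataclass, field

@dataclass
class Vm:
    vm_id: int
    mips: int                       # capacity in million instructions per second
    queue: list = field(default_factory=list)

@dataclass
class Cloudlet:
    cloudlet_id: int
    length: int                     # workload in million instructions (MI)

def broker_schedule(vms, cloudlets):
    """Assign cloudlets round-robin, then compute each VM's finish time
    assuming it runs its queue sequentially (total MI / MIPS)."""
    for i, c in enumerate(cloudlets):
        vms[i % len(vms)].queue.append(c)
    return {vm.vm_id: sum(c.length for c in vm.queue) / vm.mips for vm in vms}

vms = [Vm(0, mips=1000), Vm(1, mips=500)]
cloudlets = [Cloudlet(i, length=4000) for i in range(4)]
result = broker_schedule(vms, cloudlets)
print(result)   # → {0: 8.0, 1: 16.0}
```

Even this toy version shows why simulation is useful: changing the broker policy or VM capacities and re-running costs nothing, whereas the same experiment on real infrastructure would consume time and money.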
The following compares the four platforms discussed above:
• Primary Purpose: Eucalyptus - build private and hybrid clouds; Nimbus - enable IaaS
for scientific applications; OpenNebula - manage private, public, and hybrid clouds;
CloudSim - simulate and analyze cloud computing environments.
• Programming Language: Eucalyptus - Java; Nimbus - Java; OpenNebula - C++, Ruby,
JavaScript; CloudSim - Java.
• Architecture: Eucalyptus - modular (CLC, CC, NC, Walrus, SC); Nimbus - modular
(Workspace Service, Context Broker, etc.); OpenNebula - modular (Core, Sunstone, CLI,
APIs); CloudSim - layered simulation architecture.
• VM Image Support: Eucalyptus - yes (supports custom machine images); Nimbus - yes
(user-defined VM images for experiments); OpenNebula - yes (customizable templates,
Docker support); CloudSim - modeled as cloudlets/tasks in simulations.
• Cloud Model: Eucalyptus - private and hybrid cloud; Nimbus - mainly private cloud
with grid integration; OpenNebula - private, public, and hybrid cloud; CloudSim - not
a real cloud (simulation only).