Cloud Computing

A computing paradigm describes an approach to computation, organizing data, designing systems of higher levels, and solving problems using computers.

Characteristics of Computing Paradigms
It encompasses the principles, techniques, methodologies, and architectures that guide the design, development, and deployment of computational systems.
Computing paradigms can vary widely based on factors such as the underlying hardware, programming models, and problem-solving strategies.
Each computing paradigm offers different advantages, trade-offs, and suitability for specific types of problems and applications.
The choice of paradigm depends on factors such as the nature of the problem, performance requirements, scalability, and ease of development.
Many modern computing systems and applications combine multiple paradigms to use their respective strengths and address complex challenges.

Types of High-Performance Computing Paradigms
High-performance computing (HPC) refers to the use of advanced computing techniques and technologies to solve complex problems and perform data-intensive tasks at speeds beyond what a conventional computer could achieve. Different high-performance computing paradigms arise from the various methodologies, principles, and technologies used to solve complex computational problems. Each computing paradigm has its strengths, weaknesses, and specific use cases; the choice of paradigm depends on the nature of the problem to be solved, the performance requirements, and the available technology. Some key computing paradigms are as follows:

Sequential Computing: This is the traditional paradigm in which instructions are executed in a linear sequence. It is the foundation of classical computing architectures, where a processor executes one instruction at a time.

Parallel Computing: Parallel computing is a paradigm in which multiple computations or processes are executed simultaneously to solve a single problem, typically to improve performance, efficiency, and scalability. Multiple processors or cores work simultaneously on different parts of a problem: tasks are divided into smaller subtasks that can be executed concurrently on multiple processing units or cores, allowing faster execution and higher throughput than sequential processing. Parallel computing is used in many domains, including scientific simulations, data analytics, image and signal processing, artificial intelligence, and computer graphics. Examples of parallel computing applications include weather forecasting, molecular dynamics simulations, genome sequencing, deep learning training, and rendering complex 3D graphics.

Distributed/Network Computing: Distributed computing involves the use of multiple computers connected to a network. Tasks are distributed across these computers, and they work collaboratively to achieve a common goal. Network computing, also known as distributed computing, refers to the use of interconnected computers and resources to perform tasks collaboratively over a network. This infrastructure can include local area networks (LANs), wide area networks (WANs), and the Internet. The client-server model is a common architecture in network computing. Network computing allows users to access resources and applications remotely: resources such as processing power, storage, and applications are distributed across multiple computers within the network, so users can access and utilize resources located on different machines. Network computing also facilitates collaboration by enabling users to share files, work on documents simultaneously, and communicate in real time; collaboration tools such as email, video conferencing, and collaborative document editing are common in networked environments. Network computing systems can be scaled easily by adding more computers or resources to the network, which lets organizations adapt to changing demands and accommodate growing workloads. Cloud computing is an example of distributed computing.

Client-Server Computing: This paradigm divides computing tasks between client devices (user interfaces) and server systems that store data and manage resources. Client-server computing is commonly used in networked applications, where clients request services from servers. A client is a software application or device that requests services or resources from a server, whereas a server is a software application or hardware device that provides services or resources to clients. Clients are typically end-user devices such as computers, smartphones, tablets, or IoT devices, whereas servers are usually high-performance computers or specialized hardware optimized for handling many client connections and processing tasks efficiently. Servers are responsible for processing client requests, performing computations, managing data, and returning results to clients; clients initiate communication with servers by sending requests for data, processing, or other services. Common communication protocols used in client-server computing include HTTP (Hypertext Transfer Protocol) for web applications, SMTP (Simple Mail Transfer Protocol) for email, FTP (File Transfer Protocol) for file transfer, and TCP/IP (Transmission Control Protocol/Internet Protocol) for general network communication. The client-server interaction follows a request-response model: clients send requests to servers, and servers respond with the requested data or perform the requested actions. Clients may send various types of requests, such as HTTP GET requests for retrieving web pages, HTTP POST requests for submitting form data, or SQL queries to retrieve data from a database.

Architecture of Cloud Computing
Cloud computing architecture refers to the components and sub-components required for cloud computing. These components typically include:
1. Front End (User Interaction Enhancement): The user interface of cloud computing consists of two kinds of clients. Thin clients use web browsers, which makes access portable and lightweight, while fat clients provide many functionalities to offer a rich user experience.
2. Back-End Platforms (Cloud Computing Engine): The core of cloud computing sits in the back-end platforms, with several servers for storage and processing. Application logic is managed through the servers, and effective data handling is provided by the storage. Together, these back-end platforms supply the processing power and the capacity to manage and store data behind the cloud.
3. Cloud-Based Delivery and Network: On-demand access to compute and other resources is provided over the Internet, an intranet, or an intercloud. The Internet provides global accessibility, an intranet supports internal communication of services within an organization, and the intercloud enables interoperability across different cloud services. This dynamic network connectivity is an essential component of cloud computing architecture, guaranteeing easy access and data transfer.

Some of the fundamental building blocks of cloud computing are Compute, Storage, Database, Networking, and Security.
Compute: Instead of provisioning your server in a local data center, you can outsource the computing power your server needs to a cluster of virtual machines in the cloud. Compute is the processing power required by applications and systems to process data and carry out computational tasks.
Storage: The main benefit of storing data in the cloud is the convenience of increasing your storage capacity without buying and maintaining more local hard drives. You cannot prevent data corruption in the event of a hard disk failure; in the cloud, your data is stored persistently across logical pools on physical storage hosted by your cloud service provider. You can store different types of data such as objects, files, and backups.
Database: A database is a system that stores and manages structured and unstructured information. Databases in the cloud are typically managed and offered as a service by a cloud service provider, which means that maintaining and updating the underlying components of your database instance, such as OS updates and software patches, is no longer your responsibility. Databases in the cloud are also scalable and highly available by nature.
Networking: The cloud is a large ecosystem of computers that communicate and integrate with each other to deliver a specific service to customers. Cloud service providers make sure that they always maintain a high-speed network connection within their infrastructure to support the needs of their end users. You can use the cloud to provide a global link that distributes your application all over the world.
Security: In the cloud, data is stored in secured remote data center facilities, so threats such as theft and data breaches are unlikely. As a cloud user, your responsibility is mainly data management. The cloud provides sets of tools to help you enforce high levels of security; for example, you have control over the encryption and decryption of your data, and you can choose to authenticate and authorize selected users and services to access your applications.

Principles of Cloud Computing
The term cloud is usually used to represent the Internet, but it is not restricted to the Internet: it is virtual storage in which data is kept in third-party data centers. Storing, managing, and accessing data present in the cloud is typically referred to as cloud computing.
Federation: A cloud computing environment must be capable of providing federated service providers, meaning that these providers must be able to collaborate and share resources at any point, irrespective of their type. This is usually needed when an organization extends its computing paradigm from a private to a public cloud.
Independence: The user of cloud computing services must be independent of the provider's specific tools and type of service. According to this principle, a user must be allotted the required virtual resources irrespective of the type of provider. Moreover, it is the responsibility of service providers to handle the infrastructure while hiding confidential information.
Elasticity: The user of cloud computing must be able to acquire and release resources easily as required; this is typically referred to as elasticity. The rules associated with elasticity must be included in the contract made between consumers and service providers.

Characteristics of Cloud Computing
There are many characteristics of cloud computing; here are a few of them:
On-demand self-service: Cloud computing services do not require any human administrators; users themselves are able to provision, monitor, and manage computing resources as needed.
Broad network access: Computing services are generally provided over standard networks and to heterogeneous devices.
Security: Cloud providers invest heavily in security measures to protect their users' data and ensure the privacy of sensitive information.
Resource pooling: IT resources (e.g., networks, servers, storage, applications, and services) are shared across multiple applications and tenants in an uncommitted manner; multiple clients are served from the same physical resource.
Measured service: Resource utilization is tracked for each application and tenant, providing both the user and the resource provider with an account of what has been used. This is done for purposes such as billing and effective use of resources.
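As a small, hedged illustration of the client-server request-response model described in the paradigms above, the following Python sketch starts a minimal HTTP server in a background thread and issues an HTTP GET against it. The port number and response text are arbitrary choices for the example, not part of any particular cloud service.

    # Minimal client-server request-response sketch (illustrative only).
    import threading
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HelloHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Server side: build a response for an HTTP GET request.
            body = b"hello from the server"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    # Assumed free local port 8080; any unused port would do.
    server = HTTPServer(("127.0.0.1", 8080), HelloHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Client side: send a request and read the server's response.
    with urllib.request.urlopen("http://127.0.0.1:8080/") as resp:
        print(resp.status, resp.read().decode())

    server.shutdown()

The same request-response shape underlies the cloud front end / back end split discussed above: the browser or app plays the client role, and the provider's servers play the server role.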
Types of Cloud Computing Deployment Models
The cloud deployment model identifies the specific type of cloud environment based on ownership, scale, and access, as well as the cloud's nature and purpose. The location of the servers you are using and who controls them are defined by the deployment model. It specifies what your cloud infrastructure will look like, what you can change, and whether you will be given services or will have to build everything yourself. Relationships between the infrastructure and your users are also defined by the deployment type. The different types of cloud computing deployment models are described below.
Public Cloud: The public cloud makes it possible for anybody to access systems and services. The public cloud may be less secure because it is open to everyone. In the public cloud, infrastructure services are provided over the internet to the general public or major industry groups. The infrastructure in this model is owned by the entity that delivers the cloud services, not by the consumer. It is a type of cloud hosting that allows customers and users to easily access systems and services; service providers supply services to a variety of customers. In this arrangement, storage, backup, and retrieval services are given for free, as a subscription, or on a per-user basis. An example is Google App Engine.
Private Cloud: The private cloud deployment model is the exact opposite of the public cloud deployment model. It is a one-to-one environment for a single user (customer), so there is no need to share your hardware with anyone else. The distinction between private and public clouds lies in how all of the hardware is handled. It is also called the "internal cloud" and refers to the ability to access systems and services within a given boundary or organization. The cloud platform is implemented in a secure cloud-based environment that is protected by powerful firewalls and supervised by an organization's IT department. The private cloud gives greater flexibility and control over cloud resources.
Hybrid Cloud: By bridging the public and private worlds with a layer of proprietary software, hybrid cloud computing gives the best of both worlds. With a hybrid solution, you may host an application in a safe environment while taking advantage of the public cloud's cost savings. Organizations can move data and applications between different clouds using a combination of two or more cloud deployment methods, depending on their needs.
Community Cloud: A community cloud allows systems and services to be accessed by a group of organizations. It is a distributed system created by integrating the services of different clouds to address the specific needs of a community, industry, or business. The community's infrastructure can be shared between organizations that have shared concerns or tasks. It is generally managed by a third party or by a combination of one or more organizations in the community.

Models of Cloud Computing
Cloud computing provides several service models according to roles, companies, etc.
1. Infrastructure as a Service (IaaS): IaaS delivers computing infrastructure on an external basis to support operations. Generally, IaaS provides networking equipment, devices, databases, and web servers as services. IaaS helps large organizations and enterprises manage and build their IT platforms, and this infrastructure is flexible according to the needs of the client.
Advantages of IaaS
IaaS is cost-effective, as it eliminates capital expenses.
The IaaS cloud provider typically provides better security than most in-house setups.
IaaS provides remote access.
Disadvantages of IaaS
In IaaS, users have to secure their own data and applications.
Cloud computing is not accessible in some regions of the world.
2. Platform as a Service (PaaS): PaaS is a type of cloud computing that helps developers build applications and services over the Internet by providing them with a platform. PaaS helps developers maintain control over their business applications.
Advantages of PaaS
PaaS is simple and very convenient for the user, as it can be accessed via a web browser.
PaaS has the capability to efficiently manage the application lifecycle.
Disadvantages of PaaS
PaaS offers limited control over infrastructure: users have less control over the environment and are not able to make some customizations.
PaaS creates a high dependence on the provider.
3. Software as a Service (SaaS): SaaS is a cloud computing model that delivers services and applications over the Internet. SaaS applications are called web-based software or hosted software. SaaS makes up around 60 percent of cloud solutions, and because of this it is the model most preferred by companies.
Advantages of SaaS
SaaS app data can be accessed from anywhere on the Internet.
SaaS provides easy access to features and services.
Disadvantages of SaaS
SaaS solutions have limited customization, which means there are some restrictions within the platform.
SaaS gives the user little control over their data.
SaaS applications are generally cloud-based, so they require a stable internet connection to work properly.

Cloud Ecosystem
A cloud ecosystem is defined as a complex system of cloud services, platforms, and infrastructure used for the storage, processing, and distribution of data and applications through the Internet. It consists of multiple parts: cloud providers, software developers, users, and other services, which are integrated into a prolific and adaptable architecture for computing assets. This ecosystem enhances the ability of businesses and individuals to lease computational solutions at will, in line with flexibility, innovation, and cost sensitivity in the digital frontier.
How a Cloud Ecosystem Works
Hub and Spoke Model: The cloud ecosystem is a hub-and-spoke architecture that has a cloud provider at the centre, linking to a variety of entities.
Central Cloud Provider: In a general cloud ecosystem architecture there is a central place where the main components of the public cloud sit, with a provider such as AWS at the core.
Interconnected Relationships: Many spokes interconnect with the central cloud provider, such as the companies that supply software and equipment, consultants, and third-party service providers.
Complex Interactions: AWS, being a cloud services provider, supports multiple applications and has partnerships with other organizations, making the interaction dynamics within the ecosystem rather intricate.

Requirements for Cloud Services
1. Internet Connectivity: Reliable, high-speed internet is essential for accessing cloud services.
2. Scalability: Infrastructure must automatically scale based on demand.
3. Security: Strong data protection, encryption, access control, and compliance with standards.
4. Reliability & Availability: Redundant systems, backups, and high uptime (e.g., a 99.9% SLA) are crucial.
5. Monitoring & Management: Automated tools for performance tracking, alerts, and resource management.
6. Compliance: Adherence to legal and industry regulations (e.g., GDPR, HIPAA).
7. Multi-Tenancy: Secure separation of resources for different users in shared environments.
8. Resource Automation: Efficient provisioning, auto-scaling, and usage-based billing.
9. APIs & Interfaces: User-friendly dashboards and APIs for integration and automation.
10. Data Management: Reliable storage, backup, recovery, and data lifecycle handling.

Cloud Computing Architecture
The architecture of cloud computing is the combination of both SOA (Service-Oriented Architecture) and EDA (Event-Driven Architecture). Client infrastructure, application, service, runtime cloud, storage, infrastructure, management, and security are all components of cloud computing architecture.
The cloud architecture is divided into two parts:
1. Frontend: The frontend of the cloud architecture refers to the client side of the cloud computing system. It contains all the user interfaces and applications used by the client to access cloud computing services and resources, for example a web browser used to access the cloud platform.
2. Backend: The backend refers to the cloud itself, which is used by the service provider. It contains the resources, manages those resources, and provides security mechanisms. Along with this, it includes large-scale storage, virtual applications, virtual machines, traffic control mechanisms, deployment models, and so on.
The User Layer and Client Layer refer to the parts of the architecture that interact with cloud services from the end user's side. A brief explanation of both:
1. User Layer: The User Layer represents the end users who access cloud services via applications or web interfaces.
🔹 Key Points:
•Includes individuals or organizations using cloud-hosted applications.
•Interacts with the cloud through interfaces like browsers or apps.
•Focuses on ease of use, user experience (UI/UX), and access control.
🔹 Examples: •A person accessing Google Docs.
2. Client Layer: The Client Layer consists of the devices and software that users use to interact with the cloud.
🔹 Key Points:
•Acts as the interface between users and the cloud infrastructure.
•Includes desktops, smartphones, tablets, and client-side software (like browsers or custom apps).
•Responsible for sending requests and displaying cloud data.
🔹 Examples: •A mobile app connecting to a cloud database.
🔁 Relationship Between User and Client Layer:
•User Layer: represents the end users; focuses on interaction; needs simplicity and a good UI.
•Client Layer: represents the user's devices or apps; focuses on connectivity and requests; needs compatibility and performance.

The Network Layer
The network layer is part of the communication process in computer networks. Its main job is to move data packets between different networks, routing them from the sender to the receiver across multiple paths and networks. Network-to-network connections enable the Internet to function; these connections happen at the network layer, which sends data packets between different networks. In the 7-layer OSI model, the network layer is layer 3. The Internet Protocol (IP) is a key protocol used at this layer, along with other protocols for routing, testing, and encryption.
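A hedged sketch of the layer-3 decision described above: is a destination host on the local network (direct delivery) or on a different network (hand the packet to a router)? The addresses below are example values only, not tied to any real deployment.

    # Illustrative network-layer style decision using the standard ipaddress module.
    import ipaddress

    local_iface = ipaddress.ip_interface("192.168.1.10/24")   # example host address + subnet mask
    destinations = ["192.168.1.42", "203.0.113.7"]            # example destinations

    for dst in destinations:
        dst_addr = ipaddress.ip_address(dst)
        if dst_addr in local_iface.network:
            print(dst, "-> same network, deliver directly")
        else:
            print(dst, "-> different network, forward to the default gateway")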
Advantages of Network Layer Services
Packetization service in the network layer provides ease of transportation of data packets.
Packetization also eliminates single points of failure in data communication systems.
Routers present in the network layer reduce network traffic by creating collision and broadcast domains.
With the help of forwarding, data packets are transferred from one place to another in the network.

Cloud Computing Management
Cloud computing management is maintaining and controlling cloud services and resources, be they public, private, or hybrid. Some of its aspects include load balancing, performance, storage, backups, capacity, and deployment. To do so, cloud management personnel need full access to all the functionality of the resources in the cloud. Different software products and technologies are combined to provide a cohesive cloud management strategy and process.
As we know, private cloud infrastructure is operated only for a single organization, so it can be managed by the organization or by a third party. Public cloud services are delivered over a network that is open and available for public use; in this model, the IT infrastructure is owned by a private company and members of the public can purchase or lease data storage or computing capacity as needed. Hybrid cloud environments are a combination of public and private cloud services from different providers. Most organizations store data on private cloud servers for privacy concerns, while leveraging public cloud applications at a lower price point for less sensitive information. The combination of both public and private clouds is known as a hybrid cloud.

Public Cloud Access Networking and Private Cloud Access Networking
🌐 Public Cloud Access Networking: Connects users to cloud services hosted on public platforms like AWS, Azure, or Google Cloud over the public internet.
•Access: Open to multiple organizations (multi-tenant).
•Network Type: Internet-based, using standard protocols (HTTP, HTTPS, etc.).
•Security: Relies on the provider's built-in security (firewalls, encryption, IAM).
•Cost: Pay-as-you-go; no infrastructure investment required.
•Scalability: High, with elastic resource availability.
🔒 Private Cloud Access Networking: Connects users to cloud services hosted in a private environment, either on-premises or in a dedicated data center.
•Access: Restricted to a single organization (single-tenant).
•Network Type: Internal corporate network or secure connections (VPN/MPLS).
•Security: Controlled by the organization; supports strict compliance (e.g., HIPAA, financial regulations).
•Cost: Higher, due to infrastructure and maintenance responsibilities.
•Scalability: Moderate, based on internal capacity.
•Example: Employees accessing a company's internal cloud system via a secure VPN.

Cloud Applications
A cloud application (often shortened to "cloud app") is a software program that is deployed, hosted, and managed in a cloud environment rather than being installed and run on a local server or individual device. Instead of relying on your computer's resources, the application leverages the computing power, storage, and other services provided by a third-party cloud service provider (such as AWS, Google Cloud, or Microsoft Azure).
How Cloud Applications Work
Cloud applications operate on a front-end and back-end model:
* Front-End: This is the user interface that you interact with, typically through a web browser or a dedicated mobile app. It is the part of the application that runs on your device.
* Back-End: This is the core of the cloud application, residing on remote servers in data centers owned and managed by the cloud service provider. The back-end handles:
* Data Storage: All the application's data is stored securely in the cloud.
* Processing: The heavy lifting of the application's logic and computations happens on these remote servers.
* Middleware: Software that enables communication and data management between the front-end and back-end.
Benefits of Applications in the Cloud
1.Accessibility – Access from anywhere via the internet.
2.Scalability – Easily scale resources up or down.
3.Cost-Effective – Pay only for what you use; no hardware needed.
4.Automatic Updates – No manual software updates required.
5.Collaboration – Real-time sharing and teamwork.
6.Disaster Recovery – Built-in backup and recovery options.
❌ Drawbacks of Applications in the Cloud
1.Internet Dependency – No access without a stable connection.
2.Security Risks – Data stored on third-party servers.
3.Limited Control – Provider manages most infrastructure and policies.
4.Downtime – Service interruptions depend on the provider.
5.Vendor Lock-in – Difficult to migrate to other platforms.

Managing Cloud Applications
Managing a cloud application involves ensuring that the application is secure, scalable, available, and performing well across its lifecycle, from deployment to monitoring and updating.
✅ Key Aspects of Managing a Cloud Application
1. Deployment Management
•Use tools like CI/CD pipelines for smooth deployment.
•Automate using platforms like Jenkins, GitHub Actions, or GitLab CI.
•Use container orchestration (e.g., Docker, Kubernetes) for scalable deployments.
2. Monitoring and Performance
•Monitor uptime, response times, errors, and resource usage.
•Use tools like Datadog, New Relic, AWS CloudWatch, or Azure Monitor.
3. Security Management
•Implement access control (IAM), data encryption, and firewalls.
•Regular vulnerability scanning and security patching.
•Ensure compliance with regulations (e.g., GDPR, HIPAA).
4. Scaling and Load Management
•Use auto-scaling to handle traffic spikes.
•Employ load balancers to distribute user traffic evenly across servers.
5. Backup and Disaster Recovery
•Automate regular data backups.
•Define and test disaster recovery plans.
•Use cloud-native solutions like AWS Backup, Google Cloud Snapshot, etc.
6. Cost Management
•Monitor usage and optimize resources to avoid over-provisioning.
•Use cloud provider tools like AWS Cost Explorer, Azure Cost Management.
7. Logging and Auditing
•Enable centralized logging for tracking issues and user activity.
•Tools: ELK Stack (Elasticsearch, Logstash, Kibana), Fluentd, cloud-native logs.

Why Cloud Migration is Important
Cloud migration is vital for businesses aiming to improve agility, reduce costs, and enhance their IT infrastructure. By migrating to the cloud, businesses can:
Enhance Flexibility: Cloud platforms provide on-demand resources, allowing businesses to scale quickly based on need.
Improve Cost Efficiency: Cloud services often reduce the need for large upfront investments in hardware and infrastructure, offering pay-as-you-go pricing models.
Boost Performance and Reliability: Cloud platforms provide high availability and disaster recovery solutions to ensure business continuity.
Accelerate Innovation: The cloud offers advanced services like machine learning, big data analytics, and AI, enabling companies to innovate faster.

How Apache Projects Contribute to IaaS, PaaS, and SaaS
IaaS (Infrastructure as a Service): IaaS provides virtualized computing resources over the internet. Open-source tools for IaaS allow organizations to build and manage their own private clouds or to interact with public cloud providers in a standardized way.
* Apache CloudStack: This is a prominent open-source IaaS platform. CloudStack is designed to deploy and manage large networks of virtual machines as a highly available, highly scalable IaaS cloud. It provides a complete "stack" for IaaS, including compute orchestration, network-as-a-service, user and account management, an API, resource accounting, and a user interface. Many public and private clouds are powered by Apache CloudStack.
* Relationship to other IaaS tools: While OpenStack is another very popular open-source IaaS platform, Apache CloudStack focuses on being a turnkey solution for building clouds. Many other tools, such as OpenNebula and Eucalyptus, also exist in the open-source IaaS space.
PaaS (Platform as a Service): PaaS provides a platform that allows developers to build, run, and manage applications without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app.

Unit 3
Technological Drivers for Cloud Computing
* Virtualization: Divides single physical servers into multiple isolated virtual machines (VMs), maximizing hardware utilization, providing isolation, and enabling rapid resource provisioning.
* Broadband Internet & Network Connectivity: High-speed, reliable internet is essential for accessing remote cloud resources, ensuring performance, and enabling global accessibility.
* Distributed Systems & Parallel Processing: Allow multiple interconnected computers to work together, providing massive scalability, high fault tolerance, and efficient processing of large datasets.
* Service-Oriented Architecture (SOA) & Microservices: Deconstruct applications into independent, loosely coupled services, enhancing modularity, agility (DevOps), and individual service scalability.
* Automation & Orchestration: Automate and coordinate complex tasks like resource provisioning, deployment, scaling, and monitoring, leading to efficiency, consistency, and self-service capabilities.
* Advancements in Hardware: Continuous improvements in multi-core processors, SSDs, and networking hardware provide the raw computing power and storage density needed for large-scale cloud data centers.

Service-Oriented Architecture (SOA), Cloud Computing, and SOC-as-a-Service (SOCaaS)
🧩 Service-Oriented Architecture (SOA) and Cloud Computing
SOA is a software design approach in which applications are structured as a collection of loosely coupled, reusable services that communicate through standard protocols and interfaces. Each service represents a discrete business function and operates independently, allowing teams to develop, deploy, and scale services separately while promoting reuse across different applications.
In the context of cloud computing, SOA principles are applied to design modular, scalable, and interoperable services. Cloud platforms like AWS, Azure, and Google Cloud offer services that adhere to SOA principles, enabling organizations to build flexible and maintainable systems that can adapt to changing business needs.
🛡 SOC-as-a-Service (SOCaaS): SOCaaS is a cloud-based security model in which a third-party vendor operates and maintains a fully managed Security Operations Center (SOC) on a subscription basis. It provides all the security functions performed by a traditional in-house SOC, including:
•Network Monitoring •Log Management •Threat Detection and Intelligence
•Incident Investigation and Response •Reporting
•Risk and Compliance Management
SOCaaS offers several benefits:
•24/7 Monitoring: Continuous surveillance of IT environments to detect and respond to threats promptly.
•Expertise Access: Leverage specialized security professionals without the need for in-house hiring.
•Scalability: Easily adjust security services based on organizational needs.
•Cost-Effectiveness: Reduce the financial burden of maintaining an in-house SOC.
Advantages of SOA
Easy maintenance: As services are independent of each other, they can be updated and modified easily without affecting other services.
Platform independent: SOA allows making a complex application by combining services picked from different sources, independent of the platform.
Availability: SOA facilities are easily available to anyone on request.
Reliability: SOA applications are more reliable, because it is easier to debug small services than huge amounts of code.
Scalability: Services can run on different servers within an environment; this increases scalability.
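A hedged sketch tying together the "Monitoring and Performance" aspect of cloud application management above with the SOA idea of independent services: each service would typically expose a health endpoint that a monitor polls for uptime and latency. The URL is a placeholder, not a real service; production systems would normally rely on a managed monitoring tool instead.

    # Simple availability / response-time check for a (hypothetical) service endpoint.
    import time
    import urllib.error
    import urllib.request

    ENDPOINT = "https://example.com/health"   # placeholder health-check URL

    def check_once(url: str) -> None:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                elapsed_ms = (time.monotonic() - start) * 1000
                print(f"UP   status={resp.status} latency={elapsed_ms:.1f} ms")
        except (urllib.error.URLError, TimeoutError) as exc:
            print(f"DOWN reason={exc}")

    check_once(ENDPOINT)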
Multicore Technology: A Fundamental Driver for Cloud Computing
Multicore technology, in which a single processor chip contains multiple independent processing units (cores), is a fundamental driver for cloud computing.
* Parallelism: Enables multiple tasks/VMs/containers to run simultaneously on a single CPU, boosting overall throughput.
* Efficiency: Maximizes hardware utilization, allowing cloud providers to run more workloads on fewer physical servers.
* Scalability & Density: Facilitates hosting a greater number of virtual instances per server, improving density and contributing to both vertical and horizontal scaling.
* Cost Savings: Reduces the need for physical hardware, leading to lower capital and operational expenses (power, cooling, maintenance).
* Performance: Provides the raw processing power needed for demanding cloud applications and services.
* Enabler for Virtualization/Containerization: Crucial for hypervisors and container orchestration to efficiently manage and distribute workloads across available processing units.

Cloud Storage Architecture
Cloud storage architecture is the framework that makes it possible for users and organizations to store, manage, and access their data online. It is designed to handle massive amounts of data while ensuring security, accessibility, and scalability. The key parts of this system are as follows.
Main Components of Cloud Storage Architecture
1. Frontend Layer: This is what users interact with, essentially the interface. It could be through APIs, a web dashboard, or software that connects to the storage system. It also manages who gets access to the data by handling authentication and permissions.
2. Backend Layer: This is where all the heavy lifting happens.
Storage Types:
Object Storage: Best for unstructured data like media files, backups, or logs (think Amazon S3 or Google Cloud Storage).
Block Storage: Faster, and used for applications like databases (e.g., Amazon EBS, Azure Disk Storage).
File Storage: Works like a shared drive and is great for file hierarchies (e.g., Amazon EFS, Azure Files).
Metadata: Keeps track of information about the data, such as file names, sizes, and access rules.
3. Control Layer: This part manages how everything operates.
Orchestration: Ensures resources (like storage space or processing power) are distributed efficiently.
Monitoring: Keeps track of performance, usage, and security.
Lifecycle Management: Automatically handles tasks like archiving or deleting old data based on pre-set rules.
4. Network Layer: This handles data movement between the user and the storage system. Protocols like HTTPS or APIs enable this. Sometimes a Content Delivery Network (CDN) is used to speed up access by caching data in locations closer to users.

Cloud Storage Requirements
To implement and manage cloud storage effectively, certain technical and business requirements must be met. These ensure data availability, security, scalability, and compliance.
✅ 1. Scalability
•Storage must grow seamlessly with data volume.
•Support for elastic scaling (both up and down) as needed.
✅ 2. Data Availability & Reliability
•Ensure high uptime (e.g., 99.9%+).
•Use redundancy (e.g., data replication across regions/zones).
•Implement automatic backups and disaster recovery.
✅ 3. Security
•Data Encryption: Both at rest and in transit.
•Access Controls: Role-Based Access Control (RBAC), Identity & Access Management (IAM).
•Audit Trails: Logging all access and changes.
✅ 4. Performance
•Support fast read/write speeds, especially for high-performance workloads.
•Options for different storage types (e.g., SSD, HDD, object vs block storage).
✅ 5. Interoperability
•Compatible with various platforms, tools, and APIs.
•Support for common protocols (e.g., S3, NFS, FTP, SMB).
✅ 6. Cost Management
•Transparent pricing models (pay-as-you-go or tiered).
•Lifecycle policies to move infrequently used data to cheaper storage.
✅ 7. Compliance & Legal Requirements
•Meet standards like GDPR, HIPAA, ISO 27001, etc.
•Data residency and sovereignty (control over data location).
✅ 8. Ease of Management
•Centralized dashboards or APIs for monitoring and control.
•Automation for backups, scaling, and policy enforcement.
✅ 9. Backup and Disaster Recovery
•Scheduled backups, point-in-time recovery, geo-replication.
•Versioning and soft-delete features to recover from accidental loss.
✅ 10. Data Integrity and Durability
•Checksum and validation mechanisms.
•Durability guarantees (e.g., AWS S3 promises 99.999999999% durability).

Web 1.0
Web 1.0 refers to the first stage of the World Wide Web's evolution. In Web 1.0 there were only a few content creators, with a huge majority of users being consumers of content. Personal web pages were common, consisting mainly of static pages hosted on ISP-run web servers or free web-hosting services. In Web 1.0, advertisements on websites while surfing the internet were banned. Also in the Web 1.0 era, Ofoto was an online digital photography website on which users could store, share, view, and print digital pictures. Web 1.0 was a content delivery network (CDN) that enabled the showcasing of pieces of information on websites. It could be used as a personal website. It charged the user per page viewed. It had directories that enabled users to retrieve a particular piece of information. The era of Web 1.0 was roughly from 1991 to 2004.
Four Design Essentials of a Web 1.0 Site
>Static pages.
>Content is served from the server's file system.
>Pages built using Server Side Includes or the Common Gateway Interface (CGI).
>Frames and tables are used to position and align the elements on a page.
Features of Web 1.0
-Easy to connect static pages with the system via hyperlinks
-Supports elements like frames and tables with HTML 3.2
-Also has graphics and GIF buttons
-Less interaction between the user and the server
-HTML forms can be sent via mail
-Provides only a one-way publishing medium

What is Web 2.0?
The term Web 2.0 became famous in 2004 because of the first Web 2.0 Conference (later known as the Web 2.0 Summit), held by Tim O'Reilly and Dale Dougherty; the term itself was coined by Darcy DiNucci in 1999. Web 2.0 refers to worldwide websites that highlight user-generated content, usability, and interoperability for end users. Web 2.0 is also called the participative social web. It does not refer to a modification of any technical specification, but to a change in the way web pages are designed and used. The transition is beneficial, although it is not always obvious when the changes occur. Web 2.0 allows interaction and collaboration in a social media dialogue, with users as the creators of user-generated content in a virtual community. Web 2.0 is an enhanced version of Web 1.0.
Features of Web 2.0
-Free sorting of information, which permits users to retrieve and classify information collectively.
-Dynamic content that is responsive to user input.
-Information flows between the site owner and site users through evaluation and online commenting.
-Developed APIs to allow self-usage, such as by a software application.
-Web access broadens from the traditional Internet user base to a wider variety of users.

What is Web 3.0?
Web 3.0 refers to the evolution of web utilization and interaction that includes turning the web into a database, with the integration of DLT (Distributed Ledger Technology; blockchain is an example), so that data can be used to make smart contracts based on the needs of the individual. It enables the upgrading of the back end of the web, after a long period of focus on the front end (Web 2.0 was mainly about AJAX, tagging, and other front-end user-experience innovation). Web 3.0 is a term used to describe many evolutions of web usage and interaction along several paths. In Web 3.0, data isn't owned but shared, while services show different views of the same web and the same data.
Features of Web 3.0
Artificial Intelligence: Combining this capability with natural language processing, in Web 3.0 computers can interpret information like humans do, providing faster and more relevant results. They become more intelligent and better able to fulfil the requirements of users.
3D Graphics: Three-dimensional design is used widely in Web 3.0 websites and services. Museum guides, computer games, e-commerce, geospatial contexts, etc. are all examples that use 3D graphics.
Connectivity: With Web 3.0, information is more connected thanks to semantic metadata. As a result, the user experience evolves to another level of connectivity that leverages all the available information.

What is Web 4.0?
Web 4.0 represents the next evolution of the Internet, where artificial intelligence, machine learning, and other advanced technologies work together to create a smarter, more intuitive online experience. Unlike previous versions of the web, which focused primarily on connectivity and user-generated content, Web 4.0 is all about understanding and anticipating user needs. It aims to make the Internet not just a tool but a partner in our daily lives.
Key Features of Web 4.0
Artificial Intelligence at the Core: AI will be the backbone of Web 4.0. Machine learning algorithms will process data faster than ever before, enabling websites and apps to predict what users want before they even ask. For example, if you are shopping online, the platform might suggest products tailored specifically to your tastes or recommend deals based on your past purchases.
Natural Language Processing (NLP): One of the most exciting aspects of Web 4.0 is its ability to understand human language. Advanced NLP allows chatbots and virtual assistants to communicate with users in a way that feels natural and conversational. Instead of rigid commands, you will be able to speak or type casually and the system will respond intelligently.
Internet of Things (IoT) Integration: Web 4.0 will seamlessly connect the digital world with the physical world through IoT. Smart devices like thermostats, refrigerators, cars, and even clothing will share data and work together to enhance convenience and efficiency. For instance, your smartwatch could alert your car to start warming up when it detects cold weather, while your home adjusts the heating automatically.
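Relating back to the multicore parallelism driver described earlier in this unit, the following hedged sketch spreads the same CPU-bound work across the available cores with a process pool; the workload sizes are arbitrary and only serve to illustrate the idea of higher throughput on multiple cores.

    # Illustrative multicore parallelism with Python's multiprocessing module.
    import time
    from multiprocessing import Pool, cpu_count

    def count_primes(limit: int) -> int:
        # Deliberately naive CPU-bound task.
        count = 0
        for n in range(2, limit):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        limits = [40_000] * 8   # eight independent subtasks

        start = time.perf_counter()
        serial = [count_primes(x) for x in limits]
        t_serial = time.perf_counter() - start

        start = time.perf_counter()
        with Pool(processes=cpu_count()) as pool:
            parallel = pool.map(count_primes, limits)
        t_parallel = time.perf_counter() - start

        print(f"serial: {t_serial:.2f} s, parallel: {t_parallel:.2f} s, cores: {cpu_count()}")

On a multicore machine the parallel run should finish noticeably faster, which is the same effect hypervisors exploit when they pack many VMs or containers onto one multicore server.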
The Agile Model
The Agile Model was primarily designed to help a project adapt quickly to change requests, so the main aim of the Agile model is to facilitate quick project completion. To accomplish this, agility is required. Agility is achieved by fitting the process to the project and removing activities that may not be essential for a specific project; anything that is a waste of time and effort is avoided. The Agile Model refers to a group of development processes. These processes share some basic characteristics but do have certain subtle differences among themselves.
Steps in the Agile Model
The Agile Model is a combination of iterative and incremental process models. The phases involved in the Agile (SDLC) model are:
Requirement Gathering
Design the Requirements
Construction / Iteration
Testing / Quality Assurance
Deployment
Feedback

What is the Agile Software Development Life Cycle (Agile SDLC)?
The Agile Software Development Life Cycle (SDLC) is an iterative and incremental software development methodology that prioritizes flexibility, collaboration, and customer feedback. Unlike traditional SDLC models, such as the waterfall model, which complete each step sequentially, the agile SDLC divides the development process into smaller iterations or increments.
The major factors of agile, according to the Agile Manifesto, are the following four:
Early customer involvement
Iterative development
Self-organizing teams
Adaptation to change
Steps of the Agile SDLC Model
The agile model is a combination of iterative and incremental process models. The steps involved in agile SDLC models are:
Requirement gathering
Design the requirements
Coding
Testing
Deployment
Feedback
Agile Software Development Cycle
Step 1: In the first step, concept, the business opportunities in each possible project are identified and the amount of time and work needed to complete the project is estimated. Based on their technical and financial viability, projects can then be prioritized and it can be determined which ones are worth pursuing.
Step 2: In the second phase, known as inception, the customer is consulted regarding the initial requirements, team members are selected, and funding is secured. Additionally, a schedule outlining each team's responsibilities and the precise time at which each sprint's work is expected to be finished should be developed.
Step 3: Teams begin building functional software in the third step, iteration/construction, based on requirements and ongoing feedback. Iterations, also known as single development cycles, are the foundation of the Agile software development cycle.

Cloud Application Development Platforms
Cloud computing applications are developed by leveraging platforms and frameworks. Various types of services are provided, from the bare-metal infrastructure to customizable applications serving specific purposes.
Amazon Web Services (AWS): AWS provides a wide range of cloud IaaS services, ranging from virtual compute, storage, and networking to complete computing stacks. AWS is well known for its on-demand compute and storage services, named Elastic Compute Cloud (EC2) and Simple Storage Service (S3). EC2 offers customizable virtual hardware to the end user, which can be utilized as the base infrastructure for deploying computing systems in the cloud. It is possible to choose from a large variety of virtual hardware configurations, including GPU and cluster instances. EC2 instances are deployed either through the AWS console, a comprehensive web portal for accessing AWS services, or through the web services API, which is available for several programming languages.

How does Microsoft Azure Work?
Azure is a private and public cloud platform that helps developers and IT professionals build, deploy, and manage applications. It uses the technology known as virtualization. Virtualization separates the tight coupling between the hardware and the operating system using an abstraction layer called a hypervisor. The hypervisor emulates all the functions of a computer in a virtual machine; it can run multiple virtual machines at the same time, and each virtual machine can run any operating system, such as Windows or Linux.
Azure takes this virtualization technique and repeats it on a massive scale in data centers owned by Microsoft. Each data center has many racks filled with servers, and each server includes a hypervisor to run multiple virtual machines. The network switch provides connectivity to all those servers.

Google App Engine (GAE)
Google App Engine is a scalable runtime environment mostly used to run web applications. These applications scale dynamically as demand changes over time, thanks to Google's vast computing infrastructure. Because it offers a secure execution environment in addition to a number of services, App Engine makes it easier to develop scalable and high-performance web apps. The App Engine SDK facilitates the testing and professionalization of applications by emulating the production runtime environment and allowing developers to design and test applications on their own PCs. When an application is finished being produced, developers can quickly migrate it to App Engine, put quotas in place to control the costs generated, and make the program available to everyone. Python, Java, and Go are among the languages that are currently supported.
Features of App Engine
Runtimes and Languages: To create an application for App Engine, you can use Go, Java, PHP, or Python. You can develop and test an app locally using the SDK's deployment toolkit. Each language's SDK and runtime are unique. Your program is run in a:
Java Runtime Environment version 7
Python Runtime Environment version 2.7
PHP runtime's PHP 5.4 environment
Go runtime 1.2 environment

What are IBM Cloud Computing APIs?
* Application Programming Interfaces (APIs): At their core, IBM Cloud APIs are sets of defined rules and protocols that enable different software applications to communicate and exchange data. This means you can control and manage your IBM Cloud resources (like virtual servers, databases, AI services, etc.) without needing to interact directly with the IBM Cloud console.
* Comprehensive Coverage: IBM Cloud offers APIs for almost all of its services. This includes infrastructure services (compute, storage, networking), platform services (databases, messaging, analytics), AI and machine learning services, security services, and many more.
* RESTful Design: Most IBM Cloud APIs follow a RESTful architecture, which means they use standard HTTP methods (GET, POST, PUT, DELETE) to perform operations on resources, making them relatively easy to understand and use.
* API Connect: IBM provides "API Connect" as an enterprise-grade platform for creating, securing, managing, sharing, monetizing, and analyzing custom APIs, both on-premises and in the cloud. This is particularly useful for organizations that want to expose their own internal services as APIs or manage a large portfolio of APIs.

Unit 4
What is Virtualization?
Virtualization is the process of creating a virtual representation of hardware such as a server, storage, network, or other physical machine. It supports multiple copies of virtual machines (VMs) executing on one physical machine, each with its own operating system and programs. This optimizes hardware efficiency and flexibility and enables resources to be shared between multiple customers or organizations. Virtualization is key to providing Infrastructure as a Service (IaaS) solutions for cloud computing, whereby the user has access to remote computing resources.

Full Virtualization
Full virtualization is a virtualization technique used to provide a virtual machine environment (VME) that completely simulates the underlying hardware. In this type of environment, any software capable of executing on the physical hardware can be run in the VM, and any OS supported by the underlying hardware can be run in each individual VM. Users can run multiple different guest OSes simultaneously. In full virtualization, the VM simulates enough hardware to allow an unmodified guest OS to be run in isolation. This is particularly helpful in a number of situations; for example, in OS development, experimental new code can be run at the same time as older versions, each in a separate VM. The hypervisor provides each VM with all the services of the physical system, including a virtual BIOS, virtual devices, and virtualized memory management. The guest OS is fully disengaged from the underlying hardware by the virtualization layer. Full virtualization is achieved by using a combination of binary translation and direct execution. With full virtualization hypervisors, the physical CPU executes non-sensitive instructions at native speed; OS instructions are translated on the fly and cached for future use, and user-level instructions run unmodified at native speed. Full virtualization offers the best isolation and security for VMs and simplifies migration and portability, as the same guest OS instance can run on virtualized or native hardware. Figure 1.5 shows the concept behind full virtualization.

Paravirtualization
Paravirtualization is ideal for specific use cases where performance and efficiency are critical and where there is flexibility to modify or adapt the guest operating system. Here are some scenarios where paravirtualization is well suited:
Performance Optimization: Paravirtualization often results in better performance than full virtualization. Allowing the guest OS to communicate directly with the hypervisor through hypercalls reduces the overhead associated with emulation, leading to improved performance.
I/O-Intensive Workloads: Paravirtualization is particularly beneficial for I/O-intensive workloads, such as database servers and storage-intensive applications. The direct communication between the guest and the hypervisor enhances I/O performance.
Customized Operating Systems: Paravirtualization requires modification of the guest OS so that it is aware of the virtualization layer. This makes it suitable for scenarios where customization and adaptation of the operating system are feasible, such as development or specialized environments.
Resource Efficiency: Paravirtualization can lead to more efficient resource utilization than full virtualization. It allows greater control over resource allocation and can be beneficial in environments where optimizing resource usage is a priority.
Embedded Systems and Real-Time Applications: In situations where real-time performance is critical, paravirtualization can be a suitable choice. It provides low-latency communication between the guest and the hypervisor, making it suitable for real-time applications and embedded systems.

Hypervisors and Hardware-Based Virtualization
A hypervisor has a simple user interface that needs some storage space. It exists as a thin layer of software and, to establish a virtualization management layer, it performs hardware management functions. For the provisioning of virtual machines, device drivers and support software are optimized, while many standard operating system functions are not implemented. Essentially, this type of virtualization system is used to reduce the performance overhead inherent in coordinating multiple VMs that interact with the same hardware platform.
Hardware compatibility is another challenge for hardware-based virtualization. The virtualization layer interacts directly with the host hardware, which means that all the associated drivers and support software must be compatible with the hypervisor. Hardware device drivers available to other operating systems may not be available to hypervisor platforms. Moreover, host management and administration features may not contain the range of advanced functions that are common in operating systems.
It allows communication between nodes in a virtual network without routing of frames.
It restricts management traffic.
It enforces routing for communication between virtual networks.
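A hedged sketch of the RESTful style described in the IBM Cloud API notes above: resources are addressed by URL and manipulated with standard HTTP methods. The endpoint path and token below are placeholders, not a documented IBM Cloud URL; real services publish their own paths and authentication schemes.

    # Calling a (hypothetical) RESTful cloud API with a standard HTTP method.
    import json
    import urllib.request

    API_URL = "https://api.example.com/v1/instances"   # hypothetical resource collection
    TOKEN = "replace-with-a-real-access-token"         # hypothetical bearer token

    request = urllib.request.Request(
        API_URL,
        headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
        method="GET",   # GET lists resources; POST/PUT/DELETE would create/update/remove them
    )

    with urllib.request.urlopen(request, timeout=10) as resp:
        instances = json.loads(resp.read().decode())
        print(json.dumps(instances, indent=2))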
What is SQS (Amazon Simple Queue Service) in AWS?
Amazon Simple Queue Service (SQS) lets you send, store, and receive messages between various software components in any volume, without losing messages and without requiring other services to be available. Basically, Amazon SQS is a distributed queue system.
Queues are used to store textual information so it can be received and used by a consumer later on. The consumer is whatever retrieves a message from a queue; it can be anything that is able to make an API call to SQS (an application, a microservice, a human, etc.). Using this paradigm we implement decoupling. Decoupling allows incoming requests to be processed later on, so when the consumer is overloaded, it waits before getting another message. This way our applications become more fault-tolerant. Amazon Simple Queue Service (SQS) is a fully managed queue service in the AWS cloud.
AWS SQS Architecture
In a distributed messaging system, three primary elements are involved: the system components, the queue (which is distributed across Amazon SQS servers), and the messages stored within the queue. In this scenario, your system includes multiple producers (which send messages to the queue) and consumers (which retrieve messages from the queue). The queue stores messages (such as A, B, C, D, and E) across multiple Amazon SQS servers, ensuring redundancy and high availability.

Common Cloud Computing Challenges
Cloud computing is the provisioning of resources like data and storage on demand, that is, in real time. It has proven to be revolutionary in the IT industry, with the market valuation growing at a rapid rate. Cloud development has proved to be beneficial not only for huge public and private enterprises but for small-scale businesses as well, as it helps to cut costs. It is estimated that more than 94% of businesses will increase their spending on the cloud by more than 45%. This has also resulted in more and higher-paying jobs for cloud developers.
1. Data Security and Privacy: Data security is a major concern when switching to cloud computing. User or organizational data stored in the cloud is critical and private. Even if the cloud service provider assures data integrity, it is your responsibility to carry out user authentication and authorization, identity management, data encryption, and access control. Security issues in the cloud include identity theft, data breaches, malware infections, and more, which eventually decrease the trust amongst the users of your applications. This can in turn lead to potential loss of revenue alongside reputation and stature.
2. Cost Management: Even though almost all cloud service providers have a "pay as you go" model, which reduces the overall cost of the resources being used, there are times when huge costs are incurred by an enterprise using cloud computing. When resources are under-optimized, for example when servers are not being used to their full potential, the hidden costs add up.
3. Multi-Cloud Environments: Due to an increase in the options available to companies, enterprises not only use a single cloud but depend on multiple cloud service providers. Most of these companies use hybrid cloud tactics, and close to 84% are dependent on multiple clouds.
4. Performance Challenges: Performance is an important factor when considering cloud-based solutions. If the performance of the cloud is not satisfactory, it can drive away users and decrease profits. Even a little latency while loading an app or a web page can result in a huge drop in the percentage of users.
6. High Dependence on Network: Since cloud computing deals with provisioning resources in real time, it deals with enormous amounts of data transfer to and from the servers. This is only made possible by the availability of a high-speed network.
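A hedged producer/consumer sketch of the SQS flow described earlier in this section, using the boto3 library. It assumes boto3 is installed, AWS credentials are configured, and a queue already exists; the queue URL below is a placeholder, not a real queue.

    # Send one message to a queue, then poll for it and delete it.
    import boto3

    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"  # placeholder

    sqs = boto3.client("sqs", region_name="us-east-1")

    # Producer: send a message to the distributed queue.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody="order-created: id=42")

    # Consumer: poll for up to one message, process it, then delete it.
    response = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=5)
    for message in response.get("Messages", []):
        print("received:", message["Body"])
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])

Because the producer and consumer only share the queue, either side can be slow or temporarily offline without losing messages, which is the decoupling and fault-tolerance benefit described above.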
What is Microsoft Assessment and Planning (MAP) Toolkit?
Microsoft Assessment and Planning (MAP) Toolkit is a free utility IT can use to
determine whether its infrastructure is prepared for a migration to a new operating
system, server version or cloud-based deployment.
An IT professional can run MAP Toolkit on their device and take an inventory of the
devices, software, users and infrastructure associated with any networks they are
connected to. Microsoft now recommends that customers use Azure Migrate rather
than the MAP toolkit.
Microsoft Assessment and Planning Toolkit is made up of four main components,
as follows:
MAPSetup.exe contains MAP as well as the files IT administrators need to set up a
local SQL Server Database Engine.
readme_en.htm details what administrators need to run MAP Toolkit and known
issues.
MAP_Sample_Documents.zip provides examples of the types of reports and
proposals MAP Toolkit creates.
MAP_Training_Kit.zip explains how to use MAP Toolkit and provides a sample
database of the information MAP Toolkit can provide.
"IBM SmartCloud" was IBM's branding for its comprehensive suite of cloud
computing offerings that launched in 2011. It represented IBM's initial major push
into the cloud market, encompassing various services across different cloud models.
Essentially, IBM SmartCloud aimed to provide:
* Infrastructure as a Service (IaaS): Virtual servers (VMs) and storage, offered as
IBM SmartCloud Enterprise and IBM SmartCloud Enterprise+. This allowed
businesses to rent computing power and storage on demand.
* Platform as a Service (PaaS): For developers to build, run, and manage applications
without managing the underlying infrastructure.
* Software as a Service (SaaS): Pre-built, ready-to-use applications delivered over
the internet, such as IBM SmartCloud Notes (email) and IBM SmartCloud Docs
(collaboration).
* Hybrid Cloud Capabilities: Tools and solutions to help enterprises integrate their
on-premises IT infrastructure with IBM's public cloud.
* Management and Orchestration: Services like IBM SmartCloud Application
Performance Management and IBM SmartCloud Orchestrator focused on managing
and automating cloud environments and applications.
Current Status:
While "IBM SmartCloud" was a significant brand in the early 2010s, IBM has since
evolved its cloud strategy. The offerings under the "SmartCloud" banner have largely
been integrated into, or superseded by, the broader IBM Cloud platform.
Today, if you're looking for IBM's cloud services, you'd explore IBM Cloud, which is a
more unified and expanded platform offering a wider range of services including