Cloud Computing Important Questions From Pyp and MQP
1. Differentiate between cluster, grid, and cloud computing. Discuss the key enabling
technologies used in network-based systems.
1. Cloud: Several clients can simultaneously access different services, storage, and applications via the Internet.
2. Grid: The significant difference between cloud and grid computing is that grid computing solves complicated tasks, whereas cloud computing provides users access to particular services at a low cost.
3. Cluster: A group of interconnected computers that work together and can be viewed as a single system.
Now let's take a look at the core differences between cloud, grid, and cluster:
| Cloud | Grid | Cluster |
|---|---|---|
| Consolidation of resources | Segregation of resources | Aggregation of resources |
| Single system made up of many systems | Collection of systems that act together like a single system | Group of nodes that are connected to each other |
| Follows centralized architecture | Follows distributed architecture | Follows centralized architecture |
| Job execution is self-managed | Scalability of execution allows for the transfer of a job's execution to an available processor | The scheduling of jobs affects execution; jobs therefore wait until their designated runtime |
2. Describe the software environments used for distributed systems and cloud platforms.
3. Describe the different system models for distributed and cloud computing.
System models describe how computing systems are organized, structured, and interact. In the
context of distributed and cloud computing, the primary models include:
1. Architectural Models
a) Client-Server Model
Description: A central server provides resources/services to multiple client machines.
Example: Web browsers (clients) interacting with a web server.
Use Case: Traditional web applications, file servers.
b) Peer-to-Peer (P2P) Model
Description: Each node acts as both a client and a server. Resources are shared among peers.
Example: BitTorrent, blockchain networks.
Use Case: File sharing, decentralized applications.
c) Layered (Multi-Tier) Model
Description: Functions are separated into layers: presentation, application logic, and data storage.
Example: A 3-tier web application with front-end, business logic, and database layers.
Use Case: Enterprise applications, web-based systems.
d) Service-Oriented Architecture (SOA)
Description: Services are loosely coupled and communicate via standard protocols (e.g., HTTP, SOAP).
Example: Microservices architecture in cloud apps.
Use Case: Scalable, flexible cloud applications.
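To make the client-server model concrete, here is a minimal sketch using only Python's standard library; the reply text and the use of `http.server` are illustrative choices, not part of any particular platform:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A central server provides a resource to any client that requests it.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from the central server"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Multiple clients (here, one sequential request) consume the same service.
url = f"http://127.0.0.1:{server.server_port}/"
reply = urllib.request.urlopen(url).read()
print(reply.decode())  # -> hello from the central server
server.shutdown()
```

The same pattern, with the server replaced by a web server and the client by a browser, is the traditional web application described above.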
2. Computing Models
a) Cluster Computing
Description: Tightly coupled, usually homogeneous computers on a local network that work together as a single system.
b) Grid Computing
Description: Loosely coupled, geographically distributed resources pooled to solve large-scale computational tasks.
c) Cloud Computing
Description: On-demand access to shared computing resources (IaaS, PaaS, SaaS) over the
internet.
Use Case: Web hosting, storage services, AI model deployment.
3. Virtualization Models
a) Full Virtualization
Entire hardware is emulated, allowing multiple OSes to run independently (e.g., VMware).
b) Paravirtualization
Guest OS is aware of virtualization and interacts with the hypervisor directly (e.g., Xen).
c) Containerization
Lightweight, OS-level virtualization where containers share the host OS kernel (e.g., Docker).
4. Cloud Service Models
a) IaaS (Infrastructure as a Service): Raw computing resources like servers and storage (e.g., AWS EC2).
b) PaaS (Platform as a Service): Platform for application development and deployment (e.g., Google App Engine).
c) SaaS (Software as a Service): Fully functional software delivered over the internet (e.g., Gmail, Microsoft 365).
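The responsibility split between these service models can be sketched as a small table in code; the six layer names below are a common textbook breakdown, assumed here for illustration:

```python
# Who manages each layer of the stack under IaaS, PaaS, and SaaS.
LAYERS = ["hardware", "virtualization", "OS", "runtime", "application", "data"]

PROVIDER_MANAGES = {
    "IaaS": {"hardware", "virtualization"},                   # e.g. AWS EC2
    "PaaS": {"hardware", "virtualization", "OS", "runtime"},  # e.g. App Engine
    "SaaS": set(LAYERS),                                      # e.g. Gmail
}

def user_manages(model):
    """Layers left to the customer under a given service model."""
    return [layer for layer in LAYERS if layer not in PROVIDER_MANAGES[model]]

for model in ("IaaS", "PaaS", "SaaS"):
    print(model, "-> user manages:", user_manages(model))
```

Running the loop shows the progression: under IaaS the user still manages the OS upward, under PaaS only the application and its data, and under SaaS nothing at all.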
What is Virtualization?
Virtualization is important because it lets you get the most out of your computer or server resources. Think of it as using one physical box as many smaller, independent "virtual" boxes: each virtual box runs its own programs and stores its own data, but all of them share the same physical hardware.
1. Efficient Resource Utilization
Instead of leaving numerous machines underused, virtualization enables you to host multiple programs or systems on one computer, which is more effective.
2. Cost Savings
Companies can save money on hardware, power, and maintenance by using less physical equipment.
3. Flexibility
Virtual machines can be easily deployed, relocated, and resized to suit changing requirements. If a virtual machine requires more power, it can obtain it rapidly without requiring new hardware.
4. Security
Virtualization isolates various applications or systems from each other, so if one of them has
an issue, it won't affect others.
5. Simple Recovery
In case something goes wrong, it's simple to back up or restore virtual machines, allowing
companies to return to work quickly after an issue.
Computing environments refer to the technology infrastructure and software platforms that
are used to develop, test, deploy, and run software applications. There are several types of
computing environments, including:
1. Mainframe: A large and powerful computer system used for critical applications
and large-scale data processing.
2. Client-Server: A computing environment in which client devices access
resources and services from a central server.
3. Cloud Computing: A computing environment in which resources and services
are provided over the Internet and accessed through a web browser or client
software.
4. Mobile Computing: A computing environment in which users access information
and applications using handheld devices such as smartphones and tablets.
5. Grid Computing: A computing environment in which resources and services are
shared across multiple computers to perform large-scale computations.
6. Embedded Systems: A computing environment in which software is integrated
into devices and products, often with limited processing power and memory.
Each type of computing environment has its own advantages and disadvantages, and the
choice of environment depends on the specific requirements of the software application
and the resources available.
In a world where almost every task is performed with the help of computers, computers have become a part of human life. Computing is the process of completing a task using computer technology, and it may involve computer hardware and/or software; in every case it uses some form of computer system to manage, process, and communicate information.
Computing Environments: When a computer solves a problem, it uses many devices, arranged in different ways, that work together. This constitutes a computing environment: a number of computing devices arranged in different ways to solve different types of problems. A computing environment consists of many computers, other computational devices, software, and networks that support processing, sharing information, and solving tasks. Based on how the devices are organized and how they communicate, multiple types of computing environments exist.
Each of these computing environments also has drawbacks:
1. Mainframe: High cost and complexity, with a significant learning curve for
developers.
2. Client-Server: Dependence on network connectivity, and potential security risks
from centralized data storage.
3. Cloud Computing: Dependence on network connectivity, and potential security
and privacy concerns.
4. Mobile Computing: Limited processing power and memory compared to other
computing environments, and potential security risks.
5. Grid Computing: Complexity in setting up and managing the grid infrastructure.
6. Embedded Systems: Limited processing power and memory, and the need for
specialized skills for software development
Cluster Computing
Classification of Clusters:
1. Open Cluster:
Every node needs its own IP address, and the nodes are accessed through the Internet or web. This type of cluster raises security concerns.
2. Closed Cluster:
The nodes are hidden behind a gateway node, which provides increased protection. Closed clusters need fewer IP addresses and are good for computational tasks.
Advantages of Cluster Computing
1. High Performance :
Cluster systems offer better performance than mainframe computer networks.
2. Easy to manage :
Cluster Computing is manageable and easy to implement.
3. Scalable :
Resources can be added to the clusters accordingly.
4. Expandability :
Computer clusters can be expanded easily by adding additional computers to the network.
Cluster computing can combine additional resources or networks with the existing computer system.
5. Availability :
When one node fails, the other nodes remain active and function as a proxy for the failed node, ensuring enhanced availability.
6. Flexibility :
It can be upgraded to the superior specification or additional nodes can be added.
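The availability property above, where surviving nodes stand in for a failed one, can be sketched as a toy failover loop; the node names and the simple first-healthy-node policy are illustrative assumptions:

```python
# Toy model of cluster failover: route a job to the first healthy node.
class Node:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def run(self, job):
        return f"{job} executed on {self.name}"

def submit(job, nodes):
    """Try nodes in order; a healthy node acts as proxy for failed ones."""
    for node in nodes:
        if node.healthy:
            return node.run(job)
    raise RuntimeError("no healthy nodes: cluster unavailable")

cluster = [Node("node-1"), Node("node-2"), Node("node-3")]
print(submit("job-A", cluster))  # -> job-A executed on node-1

cluster[0].healthy = False       # node-1 fails ...
print(submit("job-B", cluster))  # -> job-B executed on node-2
```

Real cluster managers use heartbeats and more sophisticated scheduling, but the principle is the same: the job is rerouted, so the service stays available.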
7. Write short notes on peer-to-peer network families and their relevance to cloud systems.
P2P describes a network structure in which every participant, known as a node, acts as both a client and a server. Rather than relying on a central server to supply resources or services, any node in the network can exchange resources and services with any other. Every node has an equal role and the same functionalities, which means the load is well shared across the network.
A peer-to-peer network is a simple network of computers that first came into existence in the late 1970s. Each computer acts as a node for file sharing within the formed network. Since every node acts as a server, there is no central server, which allows the sharing of huge amounts of data. Tasks are divided equally among the nodes, and each connected node shares an equal workload. Because every node works independently, the network stops working only if all nodes individually stop.
Unstructured P2P Networks: In this type of P2P network, each device makes an equal contribution. The network is easy to build because devices can be connected randomly, but being unstructured, it is difficult to locate content. Examples: Napster, Gnutella.
Structured P2P Networks: Software creates a virtual layer that places the nodes in a specific structure. These are harder to set up but give users easier access to content. Examples: P-Grid, Kademlia.
Hybrid P2P Networks: These combine features of both pure P2P networks and client-server architecture, for example using a central server to find a node.
These networks usually do not involve a large number of nodes, often fewer than 12. All the computers in the network store their own data, but this data is accessible to the group.
Unlike client-server networks, P2P nodes both use resources and provide them, so total resources increase as the number of nodes increases. P2P requires specialized software and allows resource sharing among the network.
Since the nodes act as clients and servers, there is a constant threat of attack.
In the P2P network architecture, the computers connect with each other in a workgroup to share files and access to the Internet and printers.
Each computer in the network has the same set of responsibilities and capabilities.
Each computer in the network has the ability to share data with other computers in
the network.
P2P Architecture
Let's understand the working of a peer-to-peer network through an example. Suppose a user wants to download a file through the peer-to-peer network; the download is handled in this way:
If the peer-to-peer software is not already installed, the user first has to install it on his computer.
The software then locates other computers on the network that hold the requested file and downloads it, often in parts from several peers at once.
Data is also sent from the user's computer to other computers in the network that ask for data that exists on the user's computer.
Thus, it can be said that in the peer-to-peer network the file transfer load is distributed
among the peer computers.
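The download flow above, in which the file-transfer load is distributed among peers, can be modelled in a few lines; the chunk layout, peer names, and three-chunk file are invented for illustration (each Peer object plays both the server and the client role):

```python
# Each peer both serves chunks it holds (server role) and requests
# chunks it is missing (client role).
class Peer:
    def __init__(self, name, chunks):
        self.name = name
        self.chunks = dict(chunks)       # chunk index -> data

    def serve(self, index):              # server role
        return self.chunks.get(index)

    def download(self, needed, others):  # client role
        for index in needed:
            for other in others:
                data = other.serve(index)
                if data is not None:
                    self.chunks[index] = data
                    break

# The file "abc" is split across two seeders; a new peer fetches
# different chunks from different sources, spreading the load.
seeder1 = Peer("seeder1", {0: "a", 1: "b"})
seeder2 = Peer("seeder2", {2: "c"})
newcomer = Peer("newcomer", {})
newcomer.download([0, 1, 2], [seeder1, seeder2])
print("".join(newcomer.chunks[i] for i in range(3)))  # -> abc
```

Once the newcomer holds chunks, it too can serve them to later peers, which is why the network's capacity grows with its membership.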
First, secure your network with privacy solutions. Below are some measures to keep a P2P network secure:
Share and Download Legal Files: Double-check the files that are being downloaded before
sharing them with other employees. It is very important to make sure that only legal files are
downloaded.
Design Strategy for Sharing: Design a strategy that suits the underlying architecture in order
to manage applications and underlying data.
Keep Security Practices Up-to-Date: Keep a check on the cyber security threats which might
prevail in the network. Invest in good quality software that can sustain attacks and prevent
the network from being exploited. Update your software regularly.
Scan All Downloads: Constantly check and scan all files for viruses before downloading them. This helps ensure that only safe files are downloaded; in case any file with a potential threat is detected, report it to the IT staff.
Proper Shutdown of P2P Networking After Use: It is very important to shut down the software correctly to avoid giving third parties unnecessary access to the files in the network. Even if the windows are closed after file sharing, while the software is still active an unauthorized user can still gain access to the network, which can be a major security breach.
P2P services have gained widespread popularity due to their decentralized nature, which eliminates
the need for a central authority or server. It allows users to directly interact with each other, reducing
costs by sharing resources like bandwidth and storage. P2P networks also offer better performance
for certain tasks, such as file sharing, where data can be retrieved from multiple sources.
File Sharing: P2P network is the most convenient, cost-efficient method for file sharing for
businesses. Using this type of network there is no need for intermediate servers to transfer
the file.
Blockchain: The P2P architecture is based on the concept of decentralization. A peer-to-peer network on a blockchain helps maintain a complete replica of the records on every node, ensuring the accuracy of the data while also providing security.
Direct Messaging: P2P network provides a secure, quick, and efficient way to communicate.
This is possible due to the use of encryption at both the peers and access to easy messaging
tools.
Collaboration: The easy file sharing also helps to build collaboration among other peers in
the network.
File Sharing Networks: Many P2P file sharing networks like G2 and eDonkey have popularized peer-to-peer technologies.
Content Distribution: In a P2P network, unlike the client-server model, clients can both provide and use resources. Thus, the content-serving capacity of a P2P network can actually increase as more users begin to access the content.
The first level is the basic level which uses a USB to create a P2P network between two
systems.
The second is the intermediate level which involves the usage of copper wires in order to
connect more than two systems.
The third is the advanced level which uses software to establish protocols in order to manage
numerous devices across the internet.
Some of the popular P2P networks are Gnutella, BitTorrent, eDonkey, Kazaa, Napster, and Skype.
Easy to Maintain: The network is easy to maintain because each node is independent of the
other.
Less Costly: Since each node acts as a server, therefore the cost of the central server is saved.
Thus, there is no need to buy an expensive server.
No Network Manager: In a P2P network, each user manages their own computer, so there is no need for a network manager.
Adding Nodes is Easy: Adding, deleting, and repairing nodes in this network is easy.
Less Network Traffic: In a P2P network, there is less network traffic than in a client/server network.
Data is Vulnerable: Because there is no central server, data is always vulnerable to loss, since there is no central backup.
Less Secure: It becomes difficult to secure the complete network because each node is
independent.
Slow Performance: In a P2P network, each computer may be accessed by other computers in the network, which slows down performance for its user.
Files Hard to Locate: In a P2P network, the files are not centrally stored, rather they are
stored on individual computers which makes it difficult to locate the files.
13. Discuss system attacks and threats to cyberspace resulting in different types of losses.
14. Explain the Platform Evolution of different computer technologies with a neat diagram.
15. Outline eight reasons to adopt the cloud for upgraded Internet applications and web
services.
18. Illustrate various system attacks and network threats to cyberspace, resulting in four types of
losses with a neat diagram.
1. Explain in detail about implementation levels of virtualization. Also, briefly discuss how
virtualization helps in automating data center operations.
2. Explain how migration of memory, files, and network resources happen in cloud computing.
8. Discuss the requirement of OS-level virtualization. Also state the advantages and
disadvantages of OS extensions.
9. Explain Full Virtualization and Para Virtualization.
11. Write steps for creating a virtual machine: configure and deploy a virtual machine with
specific CPU and memory requirements in Google Cloud OR write 5 commands and explore
AWS Cloud Shell.
12. Describe the concept of virtual clusters and resource management in data centers.
13. Explain the virtualization of I/O devices and its importance in cloud computing.
14. List and explain any four virtualization tools (e.g., VMware, Xen, KVM, Hyper-V, VirtualBox).
15. Describe the role of a hypervisor. Compare Type-1 and Type-2 Hypervisors.
1. Discuss IaaS, PaaS, and SaaS cloud service models at different service levels.
6. Write a note on the basic requirements for managing resources of a data center.
10. Compare the cloud services offered by Google App Engine (GAE), AWS, and Azure.
11. Explain the concept of inter-cloud resource management and its challenges.
12. Explain the architectural design of compute and storage clouds with a neat diagram.
13. Describe the data center network topologies and their significance.
14. With a neat diagram, build a cloud ecosystem with a private cloud.
4. Select four widely accepted fair information practices that consumer-oriented commercial
websites collecting personal data must comply with.
8. Explain the security risks faced by cloud users and cloud service providers.
12. Explain security risks posed by shared images and management OS.
14. What is Xoar? Discuss its role in providing trusted hypervisor support in cloud security.
15. Describe the process and significance of a privacy impact assessment in cloud computing.
16. Discuss various encryption techniques used for securing data in the cloud.
1. What are the various system issues for running a typical parallel program in either a
distributed or parallel manner?
2. With a neat diagram, explain the data flow in running a MapReduce job at various task
trackers using Hadoop Library.
12. Explain the steps involved in deploying a web application on App Engine with automatic
scaling.
13. Explain the key features of cloud and grid computing platforms and compare them.