Advantages of Cloud Computing
Cloud computing is a widely adopted technology, and many companies have moved their services to the cloud to accelerate their growth.
Here are some important advantages of cloud computing:
1) Back-up and restore data
Once data is stored in the cloud, it is easy to back it up and restore it using the cloud.
2) Improved collaboration
Cloud applications improve collaboration by allowing groups of people to quickly and easily share
information in the cloud via shared storage.
3) Excellent accessibility
The cloud allows us to quickly and easily access stored information from anywhere in the world, at any time, using an internet connection. An internet-based cloud infrastructure increases organizational productivity and efficiency by ensuring that our data is always accessible.
4) Low maintenance cost
Cloud computing reduces both hardware and software maintenance costs for organizations.
5) Mobility
Cloud computing allows us to easily access all cloud data from mobile devices.
6) Services in the pay-per-use model
Cloud computing offers Application Programming Interfaces (APIs) through which users access services on the cloud, and charges are billed according to how much of each service is used (a minimal billing sketch follows this list).
7) Unlimited storage capacity
The cloud offers a huge amount of storage capacity for keeping our important data, such as documents, images, audio, and video, in one place.
8) Data security
Data security is one of the biggest advantages of cloud computing. Cloud offers many advanced
features related to security and ensures that data is securely stored and handled.
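A minimal sketch of how a pay-per-use bill (advantage 6 above) might be computed, assuming hypothetical per-unit rates; real providers publish their own pricing and billing dimensions:

    # Hypothetical pay-per-use rates; real providers publish their own pricing.
    RATE_PER_COMPUTE_HOUR = 0.05  # dollars per instance-hour (assumed)
    RATE_PER_GB_MONTH = 0.02      # dollars per GB stored per month (assumed)

    def monthly_bill(compute_hours, gb_stored):
        """Charge only for what was actually used."""
        return compute_hours * RATE_PER_COMPUTE_HOUR + gb_stored * RATE_PER_GB_MONTH

    print(monthly_bill(compute_hours=720, gb_stored=100))  # 36.0 + 2.0 = 38.0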
Disadvantages of Cloud Computing
Some disadvantages of cloud computing are listed below:
1) Internet Connectivity
In cloud computing, all data (images, audio, video, etc.) is stored in the cloud and accessed over an internet connection. If you do not have good internet connectivity, you cannot access this data, and there is no other way to reach it.
2) Vendor lock-in
Vendor lock-in is one of the biggest disadvantages of cloud computing. Organizations may face problems when transferring their services from one vendor to another: because different vendors provide different platforms, moving from one cloud to another can be difficult.
3) Limited Control
As we know, cloud infrastructure is completely owned, managed, and monitored by the service provider,
so the cloud users have less control over the function and execution of services within a cloud
infrastructure.
4) Security
Although cloud service providers implement the best security standards to store important information, before adopting cloud technology you should be aware that you will be sending all of your organization's sensitive information to a third party, i.e., a cloud computing service provider. While the data is being sent to the cloud, there is a chance that your organization's information could be intercepted by hackers.
2)
There are five essential characteristics of cloud computing.
1. On-demand self-services:
Cloud computing services do not require human administrators; users themselves can provision, monitor, and manage computing resources as needed.
2. Broad network access:
Computing services are provided over standard networks and are accessible from heterogeneous client devices.
3. Rapid elasticity:
IT resources should be able to scale out and in quickly, on an as-needed basis. Whenever users require a service it is provided to them, and resources are scaled back in as soon as the requirement ends (see the scaling sketch after this list).
4. Resource pooling:
IT resources (e.g., networks, servers, storage, applications, and services) are shared across multiple applications and tenants in a non-dedicated manner. Multiple clients are served from the same physical resources.
5. Measured service:
Resource utilization is tracked for each application and tenant, providing both the user and the resource provider with an account of what has been used. This is done for purposes such as monitoring, billing, and effective use of resources.
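As referenced under rapid elasticity above, here is a minimal sketch of the scale-out/scale-in decision, assuming a hypothetical target load per server; real autoscalers use provider-specific policies:

    import math

    TARGET_LOAD_PER_SERVER = 100  # requests/sec one server handles (assumed)

    def desired_servers(current_load, min_servers=1):
        """Scale out when load rises; scale back in when the requirement ends."""
        needed = math.ceil(current_load / TARGET_LOAD_PER_SERVER)
        return max(needed, min_servers)

    print(desired_servers(950))  # -> 10 (scale out)
    print(desired_servers(50))   # -> 1  (scale in)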
1: Top-of-the-Line Perimeter Firewall
Most firewalls are very simple: they typically inspect a packet's source and destination, and that's all. Some more advanced firewalls feature stateful packet inspection, which tracks connection state and checks the integrity of packets prior to approving or rejecting them.
Top-of-the-line firewalls, such as Palo Alto Networks' perimeter firewall solution, check the contents of the packet to determine the file type in addition to the source, destination, and integrity. Such granularity is necessary to thwart today's most advanced persistent threats.
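A highly simplified sketch of the basic source/destination filtering described above; the rules and addresses are hypothetical, and real firewalls (especially content-inspecting ones) work far below this level:

    # Hypothetical rules: (source-address prefix, destination port, action).
    RULES = [
        ("10.0.0.", 22, "deny"),  # block SSH from an internal test range
        ("", 443, "allow"),       # allow HTTPS from anywhere
    ]

    def filter_packet(src_ip, dst_port):
        """Return the action of the first matching rule; deny by default."""
        for src_prefix, port, action in RULES:
            if src_ip.startswith(src_prefix) and dst_port == port:
                return action
        return "deny"

    print(filter_packet("203.0.113.9", 443))  # -> allow
    print(filter_packet("10.0.0.5", 22))      # -> deny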
2: Intrusion Detection Systems with Event Logging
Numerous IT security compliance standards require businesses to have a means of tracking and
recording intrusion attempts. So, for any business that wants to meet compliance standards such as
PCI or HIPAA, using IDS event logging solutions is a must.
Some cloud providers offer monitoring for IDS, and will update their security rules for their firewalls
to counter threat signals and malicious IP addresses that they detect for all of their cloud users.
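A minimal sketch of intrusion-event logging using Python's standard logging module; the event fields are hypothetical, and real IDS products define their own formats:

    import logging

    # Compliance standards such as PCI require a durable record of intrusion attempts.
    logging.basicConfig(filename="ids_events.log",
                        format="%(asctime)s %(levelname)s %(message)s",
                        level=logging.INFO)

    def log_intrusion_attempt(src_ip, signature):
        """Record a detected attempt so it can be audited later."""
        logging.warning("intrusion attempt from %s matching signature %r",
                        src_ip, signature)

    log_intrusion_attempt("198.51.100.7", "SQL injection probe")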
3: Internal Firewalls for Individual Applications and Databases
While having a strong perimeter firewall can block external attacks, internal attacks are still a major
threat. Infrastructures that lack internal firewalls to restrict access to sensitive data and applications
cannot be considered secure.
For example, a compromised employee user account can give hackers a way to bypass the perimeter
firewall almost entirely. Or, a disgruntled ex-employee with a valid account may try to abuse their
access privileges.
In either case, internal firewalls that keep individual applications and databases separated can help limit the damage an inside attack can do.
4: Data-at-Rest Encryption
Encrypting the data that is stored on your cloud infrastructure can be an effective way to keep your
most sensitive information from being accessed by the wrong party.
Strong encryption can minimize the risk of stolen data being used against your company or your
customers/clients before you have a chance to alert them so they can take steps to protect their
identities.
It’s better to have time to warn customers than to let hackers profit off of the stolen information
immediately.
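A minimal sketch of encrypting data before it is written to storage, using the Python cryptography package (an assumption; any strong symmetric cipher would do). Key management, the hard part in practice, is omitted:

    from cryptography.fernet import Fernet

    # In practice the key lives in a key-management service, never in code.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    plaintext = b"customer records: sensitive"
    ciphertext = cipher.encrypt(plaintext)  # this is what gets stored at rest

    # Only a holder of the key can recover the original data.
    assert cipher.decrypt(ciphertext) == plaintext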
5: Tier IV Data Centers with Strong Physical Security
The physical hardware used to run a cloud environment represents one last opportunity for hackers and industrial spies to steal your most important data. When allowed direct access to the hardware that runs the cloud, hackers have free rein to steal data or upload malware directly to your systems. Hackers should never have this opportunity.
Tier IV data centers help protect cloud environments by restricting access to the physical systems that
run the cloud environment. A secure Tier IV facility will use measures such as:
• Armed security patrols
• Controlled access checkpoints with biometric security controls
• 24/7 CCTV monitoring
These security measures are critical for keeping unauthorized users from directly accessing the
hardware that runs your cloud.
3)
[Link]
4)
Cloud Storage:
Cloud storage means storing our data with a cloud service provider rather than on a local system. As with other cloud services, we access the data stored in the cloud via an Internet link. Cloud storage has a number of advantages over traditional data storage: if we store our data in the cloud, we can access it from any location that has Internet access.
At the most rudimentary level, a cloud storage system just needs one data server connected to the
Internet. A subscriber copies files to the server over the Internet, which then records the data.
When a client wants to retrieve the data, he or she accesses the data server with a web-based interface, and the server then either sends the files back to the client or allows the client to access and manipulate the data itself.
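A minimal sketch of that interaction, assuming a hypothetical storage server that accepts plain HTTP PUT and GET; real providers each define their own interface:

    import requests

    BASE = "https://storage.example.com"  # hypothetical data server

    # The subscriber copies a file to the server over the Internet.
    with open("report.pdf", "rb") as f:
        requests.put(f"{BASE}/files/report.pdf", data=f)

    # Later, the client retrieves the file through the web-based interface.
    response = requests.get(f"{BASE}/files/report.pdf")
    with open("report_copy.pdf", "wb") as f:
        f.write(response.content)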
Cloud storage systems utilize dozens or hundreds of data servers. Because servers require
maintenance or repair, it is necessary to store the saved data on multiple machines, providing
redundancy. Without that redundancy, cloud storage systems couldn’t assure clients that they
could access their information at any given time. Most systems store the same data on servers
using different power supplies. That way, clients can still access their data even if a power
supply fails.
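A minimal sketch of that redundancy idea: the same object is written to several servers so that a single failure does not make it unreachable (the server names and write function are hypothetical stand-ins):

    # Hypothetical replicas running on independent power supplies.
    REPLICAS = ["server-a.example.com", "server-b.example.com",
                "server-c.example.com"]

    def store_with_redundancy(key, data, write):
        """Write the same object to every replica; a read succeeds if any one survives."""
        for server in REPLICAS:
            write(server, key, data)

    store_with_redundancy("photo.jpg", b"...bytes...",
                          write=lambda server, key, data:
                              print(f"stored {key} on {server}"))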
Storage as a Service
The term Storage as a Service (another Software as a Service, or SaaS, acronym) means that a third-party provider rents space on its storage to end users who lack the budget or capital to pay for it on their own. It is also ideal when technical personnel are not available or have inadequate knowledge to implement and maintain that storage infrastructure. Storage
service providers are nothing new, but given the complexity of current backup, replication,
and disaster recovery needs, the service has become popular, especially among
small and medium-sized businesses. Storage is rented from the provider using a cost-per-
gigabyte-stored or cost-per-data-transferred model. The end user doesn’t have to pay for
infrastructure; they simply pay for how much they transfer and save on the provider’s
servers.
A customer uses client software to specify the backup set and then transfers data across a WAN.
When data loss occurs, the customer can retrieve the lost data from the service provider.
Providers
There are hundreds of cloud storage providers on the Web, and more seem to be added each day.
Not only are there general-purpose storage providers, but there are some that are very specialized
in what they store.
• Google Docs allows users to upload documents, spreadsheets, and presentations to Google's
data servers. Those files can then be edited using a Google application.
• Web email providers like Gmail, Hotmail, and Yahoo! Mail store email messages on their
own servers. Users can access their email from computers and other devices connected to the
Internet.
• Flickr and Picasa host millions of digital photographs. Users can create their own online photo
albums.
• YouTube hosts millions of user-uploaded video files.
• Hostmonster and GoDaddy store files and data for many client web sites.
• Facebook and MySpace are social networking sites and allow members to post pictures and
other content. That content is stored on the company’s servers.
• MediaMax and Strongspace offer storage space for any kind of digital data.
Security:
To secure data, most systems use a combination of techniques:
• Encryption A complex algorithm is used to encode information. To decode the encrypted files,
a user needs the encryption key. While it’s possible to crack encrypted information, it’s very
difficult and most hackers don’t have access to the amount of computer processing power they
would need to crack the code.
• Authentication processes These require a user to create a user name and password.
• Authorization practices The client lists the people who are authorized to access information stored on the cloud system. Many corporations have multiple levels of authorization: for example, a front-line employee might have limited access to data stored on the cloud, while the head of the IT department has complete and free access to everything (a minimal sketch follows this list).
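As referenced above, a minimal sketch of multi-level authorization; the roles and levels here are hypothetical:

    # Hypothetical clearance level per role.
    ACCESS_LEVELS = {"front-line": 1, "manager": 2, "it-head": 3}

    # Minimum level each stored resource requires.
    REQUIRED_LEVEL = {"public-docs": 1, "payroll": 2, "all-systems": 3}

    def is_authorized(role, resource):
        """Allow access only if the role's level meets the resource's requirement."""
        return ACCESS_LEVELS.get(role, 0) >= REQUIRED_LEVEL.get(resource, 3)

    print(is_authorized("front-line", "payroll"))  # -> False
    print(is_authorized("it-head", "payroll"))     # -> True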
Reliability
Most cloud storage providers try to address the reliability concern through redundancy, but the
possibility still exists that the system could crash and leave clients with no way to access their
saved data.
Advantages
Cloud storage is becoming an increasingly attractive solution for organizations. That’s because
with cloud storage, data resides on the Web, located across storage systems rather than at a
designated corporate hosting site. Cloud storage providers balance server loads and move data
among various datacenters, ensuring that information is stored close to where it is used.
Storing data in the cloud is advantageous because it helps protect our data in case of a disaster. We may keep backup files of our critical information, but if a fire or a hurricane wipes out our organization, having the backups stored locally doesn't help.
Amazon S3 is the best-known storage solution, but other vendors might be better for large enterprises. For instance, vendors that offer service level agreements and direct access to customer support can be critical for a business moving storage to a service provider.
A lot of companies take the “appetizer” approach, testing one or two services to see how well
they mesh with their existing IT systems. It’s important to make sure the services will provide
what we need before we commit too much to the cloud.
Cloud Storage Providers
Amazon and Nirvanix are the industry's current top storage providers.
Amazon Simple Storage Service (S3)
The best-known cloud storage service is Amazon’s Simple Storage Service (S3), which
launched in 2006.
Amazon S3 is designed to make web-scale computing easier for developers. Amazon S3
provides a simple web services interface that can be used to store and retrieve any amount of
data, at any time, from anywhere on the Web. It gives any developer access to the same highly
scalable data storage infrastructure that Amazon uses to run its own global network of web
sites. The service aims to maximize benefits of scale and to pass those benefits on to
developers.
Amazon S3 is intentionally built with a minimal feature set that includes the following functionality (illustrated in the sketch after this list):
• Write, read, and delete objects containing from 1 byte to 5 gigabytes of data each. The
number of objects that can be stored is unlimited.
• Each object is stored and retrieved via a unique developer-assigned key.
• Objects can be made private or public, and rights can be assigned to specific users.
• Uses standards-based REST and SOAP interfaces designed to work with any Internet development toolkit.
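A sketch of those write/read/delete operations using boto3, the current AWS SDK for Python (the SDK postdates this description, the bucket name is hypothetical, and credentials are assumed to be configured in the environment):

    import boto3

    s3 = boto3.client("s3")
    bucket = "example-bucket"  # hypothetical bucket name

    # Write: store an object under a developer-assigned key.
    s3.put_object(Bucket=bucket, Key="notes/hello.txt", Body=b"Hello, S3!")

    # Read: retrieve the object by the same key.
    obj = s3.get_object(Bucket=bucket, Key="notes/hello.txt")
    print(obj["Body"].read())

    # Delete: remove the object.
    s3.delete_object(Bucket=bucket, Key="notes/hello.txt")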
Design Requirements
Amazon built S3 to fulfill the following design requirements:
• Scalable Amazon S3 can scale in terms of storage, request rate, and users to support an
unlimited number of web-scale applications.
• Reliable Store data durably, with 99.99 percent availability. Amazon says it does not allow any
downtime.
• Fast Amazon S3 was designed to be fast enough to support high-performance applications.
Server-side latency must be insignificant relative to Internet latency. Any performance
bottlenecks can be fixed by simply adding nodes to the system.
• Inexpensive Amazon S3 is built from inexpensive commodity hardware components. As a
result, frequent node failure is the norm and must not affect the overall system. It must be
hardware-agnostic, so that savings can be captured as Amazon continues to drive down
infrastructure costs.
• Simple Building highly scalable, reliable, fast, and inexpensive storage is difficult. Doing so in
a way that makes it easy to use for any application anywhere is more difficult. Amazon S3 must
do both.
Design Principles
Amazon used the following principles of distributed system design to meet Amazon S3
requirements:
• Decentralization It uses fully decentralized techniques to remove scaling bottlenecks and single points of failure.
• Autonomy The system is designed such that individual components can make decisions based on local information.
• Local responsibility Each individual component is responsible for achieving its consistency; this is never the burden of its peers.
• Controlled concurrency Operations are designed such that no or limited concurrency control is required.
• Failure toleration The system considers the failure of components to be a normal mode of operation and continues operation with no or minimal interruption.
• Controlled parallelism Abstractions used in the system are of such granularity that parallelism can be used to improve performance and robustness of recovery or the introduction of new nodes.
• Small, well-understood building blocks Do not try to provide a single service that does everything for everyone, but instead build small components that can be used as building blocks for other services.
• Symmetry Nodes in the system are identical in terms of functionality and require no or minimal node-specific configuration to function.
• Simplicity The system should be made as simple as possible, but no simpler.
How S3 Works
S3 stores arbitrary objects of up to 5 GB in size, each accompanied by up to 2 KB of metadata. Objects are organized into buckets. Each bucket is owned by an AWS account, and buckets are identified by a unique, user-assigned key.
Buckets and objects are created, listed, and retrieved using either a
REST-style or SOAP interface. Objects can also be retrieved using
the HTTP GET interface or via BitTorrent.
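As a sketch of the HTTP GET path, a public object can be fetched with an ordinary HTTP request; the bucket and key below are hypothetical, and the URL follows S3's virtual-hosted addressing style:

    import requests

    # Hypothetical public bucket and key; S3 serves objects over plain HTTP(S).
    url = "https://example-bucket.s3.amazonaws.com/notes/hello.txt"

    response = requests.get(url)
    if response.status_code == 200:
        print(response.content)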
[Link]