Chapter 15
Security and Privacy
“The ultimate security is your understanding of reality.”
—H. Stanley Judd
We leave security to the end of the book not because it is an afterthought but
because it is a cross-cutting concern. Indeed, it is important for every aspect of
how you work with the cloud. In this chapter we start with the general question
of what the cloud provider provides you as a security baseline and what aspects
of security are your responsibility. Then we turn to the question of how best
to protect your data, computations, and services. We showed in part I of this
book that managing data in the cloud is relatively easy, being supported by both
intuitive cloud portals and programming APIs. Here we look deeper into the best
practices of data protection. Similarly, we talked in part II about computing in
the cloud using VMs, containers, and clusters of both. However, we said virtually
nothing about securing those VMs or managing the networks that connect them.
In this chapter we introduce you to critical issues that you need to consider when
deploying these computing resources. Finally, we take a brief look at how to use
higher-level services securely.
15.1 Thinking about Security in the Cloud
One frequently cited reason for not using the cloud is concern about security. This
concern is understandable: after all, data on your own computer are clearly under
your control, while data on the cloud are somewhere unknown. But does this
difference in location mean that your data are more or less secure? In general, your
data should be more secure in the cloud than on your personal computer. This
observation may appear paradoxical, but remember: operators of cloud services
are information technology professionals whose livelihoods depend on preventing
intrusions. The same cannot be said of most, if not all, readers of this book.
But just because cloud data centers are secure does not necessarily mean that
your data are secure. In fact, your concerns about cloud security may be justified if
you do not take proper care. So let’s look at some best practices for cloud security.
The first point to remember is that any time that your computing infrastructure
is exposed to the Internet, security is a concern. That is, any service that accepts
and processes messages communicated over the network is potentially vulnerable to
attack. This caution applies to your personal, home, laboratory, and institutional
research systems as well as the cloud. Have you installed all the latest security
patches on your personal machines? Is your institution’s data center well managed?
If you have secured your own systems, you have closed off the most common way
for an intruder to access your cloud resources: the biggest risk in cloud computing
is probably someone breaking into your personal computer and then accessing your
cloud resources from there.
Securing your personal computers is out of scope for this book, so we now turn
to cloud security issues proper. Three main areas merit your attention, each of
which we consider in a subsequent section.
1. Secure data that you move to the cloud
2. Secure access to the virtual machines and containers that you create
3. Use cloud software services in a secure manner
In each of these three areas, much of the responsibility for the security of your
cloud data rests with you. As shown in figure 15.1 on the following page, the
cloud provider manages security of the cloud, implementing and operating security
measures that protect the cloud infrastructure. However, security in the cloud is
your responsibility: you are the one who defines the security mechanisms that you
deploy to protect your own content, platform, applications, systems, and networks.
The situation is much the same as if your applications were running in an on-site
data center, except that cloud data centers raise unique concerns.
Figure 15.1: Shared responsibility model, as defined by Amazon [55].
Bad security can be expensive: A story of unexpected charges. This
cautionary tale, unfortunately not unprecedented in cloud computing, is from Quora:
My AWS account was hacked and I have a $50,000 bill. How can I
reduce the amount I need to pay?
For years, my bill was never above $350/month on my single AWS instance. Then
over the weekend someone got hold of my private key and launched hundreds of
instances and racked up a $50,000 bill before I found out about it on Tuesday.
What happened? The cause here, and it turns out a surprisingly common cause
for such events, is that the user had included an Amazon access key (see section 3.2
on page 38) in code that the user pushed to a public GitHub repository. The bad
actor found the key via a scan of publicly accessible GitHub repositories and then
used it to perform Bitcoin mining.
Such stories usually have a happy ending, in that Amazon generally seems willing
to waive such charges if they are shown to result from fraud. But clearly you do
not want this misfortune to happen to you. The most important step that you can
take to avoid it is not to expose your Amazon access key(s), whether on GitHub or
elsewhere. Particularly if you are using Amazon only for simple testing, you may not
think of an access key as being especially valuable. But as this example shows, it is.
You need to protect it as if it were worth many thousands of dollars.
Many proactive steps can help you guard against unexpected charges. Amazon
describes a long list of best practices that you are well advised to follow.
Important ideas are: protect your keys, for example by enabling multifactor
authentication; never share keys with other users (instead, create new IAM users to
which you grant required permissions); follow the principle of least privilege when
creating IAM users (i.e., configure them to be able to perform only the actions
that you expect them to perform, such as only read storage or only access storage
in certain regions); and monitor usage and billing.
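The principle of least privilege can be made concrete with a policy document. The sketch below builds a read-only policy for a single bucket; the bucket name and user name are assumptions for illustration, and the boto3 calls in the comments are not executed here.

```python
import json

# A least-privilege policy: the new user may only list one (hypothetical)
# bucket and read the objects it contains -- nothing else.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:ListBucket", "s3:GetObject"],
        "Resource": ["arn:aws:s3:::datacont",
                     "arn:aws:s3:::datacont/*"],
    }],
}
policy_json = json.dumps(policy)

# With boto3 (not executed here), the user could be created and the
# inline policy attached roughly as follows:
#   import boto3
#   iam = boto3.client('iam')
#   iam.create_user(UserName='postdoc-reader')
#   iam.put_user_policy(UserName='postdoc-reader',
#                       PolicyName='s3-read-only',
#                       PolicyDocument=policy_json)
```

A user created this way can fetch data from the one bucket but cannot launch instances, create other users, or even write to the bucket.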
The Amazon CloudWatch service can be used to monitor a variety of metrics,
including billing, and to define thresholds upon which email alerts are generated.
Make sure to read your email! Another useful service, not for detecting illicit
behavior but for recovering afterwards, is Amazon CloudTrail. This service allows
you to obtain a history of Amazon API calls and related events for your account.
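As a sketch of how such a billing alert might be configured programmatically: the alarm below fires when estimated charges exceed $100. The alarm name, threshold, and notification topic are all assumptions; the boto3 call in the comments is not executed here.

```python
# Parameters for a hypothetical CloudWatch billing alarm. AWS publishes
# billing metrics in the us-east-1 region under the AWS/Billing namespace.
alarm = {
    "AlarmName": "billing-above-100-usd",
    "Namespace": "AWS/Billing",
    "MetricName": "EstimatedCharges",
    "Dimensions": [{"Name": "Currency", "Value": "USD"}],
    "Statistic": "Maximum",
    "Period": 21600,          # evaluate every six hours
    "EvaluationPeriods": 1,
    "Threshold": 100.0,
    "ComparisonOperator": "GreaterThanThreshold",
}

# With boto3 (not executed here), the alarm would be created like this,
# with an SNS topic that emails you when it fires:
#   import boto3
#   cw = boto3.client('cloudwatch', region_name='us-east-1')
#   cw.put_metric_alarm(**alarm,
#                       AlarmActions=['<your-sns-topic-arn>'])
```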
On Azure, you can use the Azure Security Center service to obtain an
analysis of all data and compute resources that you have deployed on your account.
This service can, for example, scan each of your data containers to determine their
encryption and access status. The Azure Threat Analytics service is designed
to detect abnormal behavior, malicious attacks, and other security issues in your
environment. Azure also provides tools to enable application whitelisting, with
which you declare which applications are allowed to access your resources.
Table 15.1 provides a complementary perspective to that depicted in figure 15.1,
showing the level of shared responsibility for security at different levels of
service: software as a service, platform as a service, infrastructure as a
service, and on-premises deployments. It shows for each level of service which
responsibilities are the cloud provider's, which are yours, and which are shared.
In an on-premises deployment, all responsibility rests with you (or, hopefully,
your system administrators). On the cloud, responsibilities shift increasingly from
you, or whoever is managing your cloud services for you, to the cloud provider,
Table 15.1: Microsoft's shared responsibility model [235].
                                       SaaS      PaaS      IaaS      On prem
Data classification & accountability   You       You       You       You
Client & endpoint protection           Shared    You       You       You
Identity & access management           Shared    Shared    You       You
Application-level controls             Provider  Shared    You       You
Network controls                       Provider  Provider  Shared    You
Host infrastructure                    Provider  Provider  Shared    You
Physical security                      Provider  Provider  Provider  You
as you move up the stack from IaaS to SaaS. At the highest level, you are always
responsible for ensuring that your data are correctly identified, labeled, and
classified. Client and endpoint protection, that is, ensuring that the devices that
connect to cloud services are correctly configured, is also usually your
responsibility. Identity and access management is solely your responsibility
when you are working with IaaS, as discussed earlier; for SaaS and PaaS, the
provider often handles access control, subject to policies that you define.
Responsibility for application-level controls, such as applying up-to-date OS
patches, configuring application software correctly, and ensuring that application
software has no security holes, rests solely with the SaaS provider, who has full
control over their application software. In the PaaS case, it is a shared responsibility
of you and the provider, since application code is a mix of your code and PaaS
code. In the IaaS case, responsibility rests solely with you.
The bottom three levels in the table are concerned with correctly configuring
networks and providing necessary controls, such as VPNs; correctly configuring
virtual machines, containers, storage systems, and other infrastructure elements;
and (a topic that you may never have thought of, but one that is certainly
important for sensitive data) addressing the physical security of the devices on
which applications run. Responsibilities for these elements rest solidly with the
provider in the case of SaaS and PaaS; for IaaS, responsibilities are shared.
15.2 Role-based Access Control
In academic settings, faculty commonly want to allow trusted postdocs and students
to access their public cloud account, but in a way that grants those other individuals
only restricted rights. A security construct called a role is widely used in public
clouds for such purposes. Each time an account owner adds another individual to
a cloud account, the owner specifies the role(s) that the new user has; each role
defines something that the user is authorized to do.
In the case of Azure, the role-based access control (RBAC) system
allows you to control how different parties use resources under your account. Each
new user must have a role: either a general role like “Contributor” or a more specific
role like “Data Lake Analytics Developer” or “SQL DB Contributor.” For example,
a Contributor can use the resources but cannot grant access to another user; a
Data Lake Analytics Developer can use only data lake services. The account manager can
also monitor the usage made of your resources by different authorized individuals.
In the case of the Amazon cloud, the Identity and Access Management
(IAM) service that we introduced in section 7.6 on page 110, and discuss further
below, provides similar capabilities. You can create a variety of different IAM roles
and assign those roles to users, applications, and services. An IAM role defines
who the user is and what that user is authorized to do. It also gives you a way to
monitor the use by holders of different roles. Google’s cloud uses a similar IAM
system [1], as well as a system of access control lists.
15.2.1 Sharing Secrets among Containers in a Cluster
As we discussed in section 7.6.2 on page 112, containers present additional security
issues, especially when the container instance is a stateless microservice that
interacts with other services, because many instances of this container may then
be stopped and started as needed. The problem one encounters is that these
container instances may need to access various secrets, such as the API keys,
identities, and passwords of the services that they invoke. While you can pass the
keys to individual instances from the command line, that approach does not work
for instances that are managed dynamically. Leaving the keys in the container
Dockerfile is not secure, because they are then embedded in the Docker image.
As we have shown in section 7.6.4 on page 114, Amazon’s IAM role system
solves this problem for the Amazon container service, ECS. RBAC solves the
similar problem for Azure. When using the Docker Swarm services to manage a
collection of containers, you can use the docker secret create command to send
Docker a secret, which is transmitted securely to the swarm manager, where it is encrypted.
When an authorized microservice is launched, the Swarm manager sends that
secret to the microservice, where it is stored in an in-memory file system that is
deleted when the microservice container is deleted [181].
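On the consuming side, Docker mounts each secret granted to a service as a read-only file under /run/secrets inside the container. A microservice can therefore read its credentials at startup with a few lines of Python; the secret name used below is a hypothetical example.

```python
from pathlib import Path

def read_secret(name, root="/run/secrets"):
    """Return the value of a Docker secret, or None if it is absent.

    Inside a swarm service, each granted secret appears as a file
    named after the secret under /run/secrets (an in-memory tmpfs
    that vanishes with the container).
    """
    path = Path(root) / name
    if not path.is_file():
        return None
    return path.read_text().strip()

# Typical use at service startup (secret name is hypothetical):
db_password = read_secret("db_password")
```

Reading the secret at startup, rather than baking it into the image or environment, means that nothing sensitive survives in the image if you later push it to a repository.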
15.3 Secure Data in the Cloud
Commercial cloud vendors operate highly secure data centers. While intrusions
are certainly possible, such failures in operational security appear to be extremely
rare. Thus, the two principal vulnerabilities for data in the cloud are, first, data
in transit to and from the cloud and, second, unauthorized access due to failure
by the user to set proper access permissions. (We do not discuss the case of data
that are given up by a cloud vendor because of government court orders. Here we
encounter complex international legal issues and national data sovereignty laws
that are well beyond the scope of this book and the expertise of its authors. As
far as we know, entities such as the National Security Agency (NSA) do not have
backdoor access to cloud data centers. But what do we really know? If your level
of paranoia about the NSA’s interest in your data is not too high, read on.)
15.3.1 Secure Data in Transit
A potential weak link is the Internet path between you and the data center. For
example, when you create a VM, you can specify which ports you want to leave
open; if you are careless, you may leave some of these open to attack. The Python
SDKs that we introduced in chapter 3 use Transport Layer Security (TLS, formerly
Secure Socket Layer, or SSL) connections (or HTTPS) to transfer your data. This
is the same security mechanism that you use when you interact with online banking
services. Experts consider it to be secure, although as we see below, they also
recommend encrypting sensitive data for additional protection.
If moving data with Globus Transfer, you can request that data be
encrypted prior to transfer. When using the Python SDK, you need only to set
encrypt_data=True. In addition, Globus endpoints can be configured to force
encryption for all transfers involving that endpoint, whether as source or destination.
This option is always enabled for Amazon S3 endpoints, for example, as indicated
in the fine print at the bottom of figure 3.8 on page 53.
15.3.2 Control Who Can Access Your Data
A second potential source of unwanted access is incorrectly configured access
controls. When you upload data to your storage accounts, you are responsible for
managing who and what can access those data. As described previously, role-based
access control can allow you to restrict access from your team or your services to
the data storage system. However, you must use different mechanisms to restrict
access from external collaborators or the public to specific buckets. Typically, you
are able to control whether data are accessible to all, to no one except yourself,
or to only named individuals. A misconfigured access control specification (or,
as discussed previously, improper release of a key) can easily result in the wrong
people seeing your data.
Fortunately, access controls are easy to configure. In the case of Azure blob
and table storage, the storage account has two associated keys, prosaically named
key1 and key2. We typically think of key1 as the master key; that is what we
used in the APIs, as discussed in section 3.3 on page 42. You should never share
the master key with others. As we have previously described, you can give key2 to
collaborators, who then have full access to the storage account. You can regenerate
either key if you want to terminate access. Within a storage account you create
containers, and for each container you can grant different types of public access:
no public access, public read access to all blobs in the container including listing
them, or public access to blobs by name only. If you want to provide just a single
individual with access to a container, you can use a shared access signature (SAS).
This is a powerful mechanism for granting limited access to objects in your storage
account to others, without having to expose your account key. Generating a SAS
signature or setting these access controls is easy to do from the portal or the Azure
SDK or from the Azure Storage Explorer running on your PC or Mac.
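Under the covers, a SAS token is essentially a signed statement: a resource, a set of permissions, and an expiry time, bound together by an HMAC computed with the account key. The toy sketch below illustrates the idea only; real Azure SAS tokens use a richer, precisely specified string-to-sign, so generate them with the Azure SDK or Storage Explorer rather than by hand.

```python
import base64
import hashlib
import hmac
import time

def toy_sas(resource, permissions, account_key, lifetime_s=3600):
    """Illustrative signed token: NOT the real Azure SAS format."""
    expiry = int(time.time()) + lifetime_s
    string_to_sign = f"{resource}\n{permissions}\n{expiry}"
    sig = base64.b64encode(
        hmac.new(account_key, string_to_sign.encode(),
                 hashlib.sha256).digest()).decode()
    # The server, which also holds account_key, recomputes the signature
    # on each request and rejects the token if it differs or has expired.
    return f"sr={resource}&sp={permissions}&se={expiry}&sig={sig}"

token = toy_sas("datacont/test.jpg", "r", b"my-account-key")
```

Because the signature covers the permissions and expiry, the holder of the token cannot extend or broaden it, and you never reveal the account key itself.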
Amazon and Google have similar capabilities, differing only in the details, for
managing access to their cloud storage systems.
The Globus Auth service and API described in chapter 11 provide powerful
authorization mechanisms that are used by Globus Sharing, for example, to allow
user control over access to data at Globus endpoints.
15.3.3 Encrypt Your Data
You may sometimes want to go beyond the security provided by access controls.
In particular, if disclosure of your data would be especially damaging, as when
dealing with sensitive data pertaining to human subjects (see section 15.3.4 on
the following page), you may wish (or be required) to ensure that those data are
encrypted at rest. In other words, you want to ensure that your data are always
encrypted when on cloud storage and are decrypted only when they need to be
read, retrieved, or used for computation. In so doing, you can protect against
disclosure due to mistakes made when setting access controls or breaches in cloud
data-center security. Amazon and Azure both support two ways to encrypt data
at rest: server-side encryption and client-side encryption.
Server-side encryption allows you to ask that the cloud vendor automatically
encrypt data on arrival in the cloud and then decrypt that data automatically each
time that you access them. For example, Amazon S3 allows you to request, when
uploading data to S3, that server-side encryption be performed; Amazon then
performs that encryption (and decryption on subsequent accesses) transparently.
In Python, you simply add a third line to the code on page 41, as follows.
# Upload the file 'test.jpg' into the newly created bucket
s3.Object('datacont', 'test.jpg').put(
    Body=open('/home/mydata/test.jpg', 'rb'),
    ServerSideEncryption='AES256')
Amazon manages keys for you, encrypting each object with a unique key and
encrypting that key itself with a master key that it regularly rotates. (A variant
permits you to provide your own key with each upload and access request, so
that the key is on Amazon computers only while being used for encryption or
decryption.) As a further safeguard, you can obtain access to an audit trail of
when your key was used and by whom. So as long as you trust Amazon to manage
and apply your keys appropriately, this approach is highly secure.
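The customer-provided-key variant just described works by sending a key of your own with each request. A sketch of the idea follows; the bucket and file names are assumptions carried over from the earlier example, and the boto3 call in the comments is not executed here.

```python
import base64
import os

# Generate a random 256-bit key. With customer-provided keys, you (not
# Amazon) are responsible for storing this key safely and supplying it
# on every upload of and access to the object; lose it, lose the data.
customer_key = os.urandom(32)
customer_key_b64 = base64.b64encode(customer_key).decode()

# With boto3 (not executed here; names are hypothetical):
#   s3.Object('datacont', 'test.jpg').put(
#       Body=open('/home/mydata/test.jpg', 'rb'),
#       SSECustomerAlgorithm='AES256',
#       SSECustomerKey=customer_key_b64)
```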
Azure Storage Service Encryption provides similar capabilities for the
Azure Blob service; the Google Cloud Datastore service has similar functionality.
The services differ somewhat in how they allow users to control the application of
server-side encryption. Amazon allows the user to require that all data uploaded
to a container be encrypted; however, the encryption request must still be made on
individual uploads, as indicated previously. (An attempt to upload data without
the encryption parameter then raises an error.) Azure allows the user to enable
encryption at the level of a storage account, a construct that we introduced
in section 3.3 on page 42; once enabled, all data uploaded to that account are
encrypted. Google Cloud Datastore always encrypts.
Client-side encryption is useful when you want to ensure that the cloud
provider never has access to your unencrypted data. Amazon and Azure both
provide tools that you can use to encrypt data before they are sent over the
wire. You might use these tools, for example, to create a secure backup of data
otherwise maintained in on-premises storage, particularly if regulatory requirements
prevent unencrypted data from leaving your premises. But note that you are then
responsible for preserving the keys (as you are with server-side encryption, if you
provide the keys): if you lose a key, the data that it encrypted are also lost.
15.3.4 Complexities of Sensitive Data
If your work involves access to personal health data or other sensitive information,
then you are likely subject to various rules and regulations that will affect whether
and how you can use cloud resources. For example, in the U.S., work with protected
health information (PHI) must comply with the provisions of the Health
Insurance Portability and Accountability Act (HIPAA) and in particular its
Security Rule, which mandates administrative, physical, and technical safeguards
for electronic PHI. The processes by which a particular institution and application
are deemed to be HIPAA compliant are complex and beyond the scope of this
book. The important takeaway points are that (1) the major commercial cloud
vendors can all satisfy HIPAA physical security standards, but (2) this does not
mean that you can just put HIPAA-covered data in the cloud and consider yourself
compliant with HIPAA regulations. You must ensure that your entire end-to-end
computing infrastructure is compliant, and thus managing HIPAA data requires
your institution’s involvement and supervision.
One way to simplify the process of making a cloud-based computing
infrastructure HIPAA compliant is to bring the cloud inside your institution’s security
boundary. This task can be accomplished in various ways by the cloud vendors.
For example, Azure can create a special VPN that places a virtual secure partition
of the Azure cloud directly into your network. Those cloud resources share your IP
domain and can be accessed within your firewall. You will need your IT department
to work with the cloud provider to set this up.
15.4 Secure Your VMs and Containers
You launch a VM or container on a cloud by using the methods described in
chapter 4. What security threats do you need to be concerned about in this
situation, and what should you do to overcome them? We have already talked
about the need to protect the access key that you use to authenticate to your cloud
provider when creating VM instances or containers. We are concerned here with
what happens after that point. Figure 15.2 shows important activities performed
when using VMs, and associated security risks.
Figure 15.2: Hexagons showing four classes of risk associated with virtual machines, as im-
ages are retrieved from a repository; VM instances are created, execute, and communicate
on a cloud; and modified images are added to a repository.
15.4.1 Poisoned VM or Container Image
Any time that you run a VM image or launch a container that you did not create
yourself, the danger exists that the image contains unwanted code that may, for
example, make your private data accessible to others, participate in illicit activities
such as denial of service attacks on other computers, or corrupt your computational
results. Another concern is that a downloaded VM image may not be up to date
with security patches and thus is vulnerable to attacks.
We overcome these concerns in much the same way as when installing software
on a personal computer: we verify its source, ensure that it is up to date with any
patches, and run it within a secure environment.
Verification of the source of a VM or container requires understanding the
provenance of the image. In the case of VMs, each cloud vendor supplies a collection
of trusted images that you can deploy; the cloud vendor also provides free malware
tools that you may install once your image is running. In the case of containers,
another solution is to provide a secure hash along with the image. This hash
can be used to verify that the container image has not been tampered with.
You can then use the hash as part of the docker pull command. Mouat
provides an excellent overview of Docker security.
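The same idea can be applied locally: compute a SHA-256 digest of a downloaded image archive and compare it with the value the publisher advertises, or pin an exact image by digest with docker pull ubuntu@sha256:&lt;digest&gt;. A small helper for the digest computation:

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA-256 digest of a file, reading it in chunks so
    that even multi-gigabyte image archives do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

If the computed digest differs from the published one, the image has been altered (or corrupted) somewhere between the publisher and you, and should not be run.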
15.4.2 Illicit Access to Running VMs
Once created, a VM instance is little different from a computer running in your
home, laboratory, or institutional data center. Thus, the risks that you face when
running a VM on a cloud are essentially the same as those that you encounter when
working on a physical computer. So too are the controls that you should deploy
for protection. An important difference is that in the cloud case, you have more
responsibility for implementing the necessary controls. Lest that last statement
appear daunting, we point out that public cloud providers provide tools that make
it easy to do the right thing. The following are important steps.
Limit who can access the instance. You need to limit portal access, as anyone
with account management access can change instance properties. If you
wish to grant access only to the VM, then go to the VM, add the user with
sudo adduser (assuming it is Linux), and add their public key to the newly
created user’s .ssh directory.
Ensure that the credentials that allow access to the instance are not compro-
mised. Thus, for example, create the instance with a key pair (see section 5.2.1
on page 75), and make sure that the private key is well protected.
Ensure that the software running on an instance is up to date with all security
patches, bearing in mind that a VM image downloaded from a repository
may require updating before running, as noted above.
If you run web applications such as Jupyter or a web service, make sure the
network ports that they use are open and the software listening on those
ports is not subject to known exploits. Run Jupyter with a key pair and
password, as described in section 6.2. If you want more than one user to
have access to Jupyter, it is better to run the JupyterHub multiuser system
than to share one Jupyter instance and password, because anyone with access
to Jupyter can open a shell on your account.
15.4.3 Intercepted Communications
The best way to secure the virtual services that you manage in the cloud is to
remove them from the Internet by placing them in a virtual private network
(VPN). A VPN is a layer on top of an existing network, defined by point-to-point
encrypted tunnels or by a set of routes through a software-defined network that carry
encrypted packets. A VPN carries its own IP addresses and subnets that are not
recognized as being part of the Internet.
You can set up a VPN in a number of ways; the choice depends on the network
you need to create. For example, suppose you have a set of virtual machines in
the cloud that you want to be on a private network that includes only those VMs
and your laptop. Or, perhaps you have a private network in your lab or company
that you want to extend to include your cloud resources. Or you may want some
of your web services to be open to the public Internet, but you want those services
also to be able to connect via a VPN to other servers not visible to the Internet.
Each public cloud allows you to use its portal to create a VPN that solves
your specific problem. While the details of setting up a VPN are beyond the scope
of this book, each cloud vendor provides extensive tutorials to guide you through
the process. The process is similar in the case of OpenStack, depending on the
specific OpenStack deployment.
15.4.4 Information Leakage via VM Image
We all commonly share images with our colleagues, either directly or via public
repositories. As when pushing code to GitHub, you need to make sure that the
images that you share do not contain credentials or other confidential information.
It is even more important if you modify a public image and then push that image to
an image repository for others to use. Bugiel et al. tested 1,100 Amazon
EC2 images (AMIs) from Europe and the U.S. and found that about one-third contained an
SSH backdoor: a public key that allows remote access to the instance. They were
able to use this backdoor to extract AWS API keys, private keys, and credentials,
as well as private data from many instances. Amazon warns users about this
problem when Amazon discovers it, but you are well advised, whenever you clone
an image from any repository, to look for any .ssh/authorized_keys files in user home
directories and delete them.
15.5 Secure Access to Cloud Software Services
The role-based access control systems described above define how you can delegate
access and use of cloud-provided servers to your users, applications, and containers.
However, this mechanism does not address the issue of how to control access to
the services that you create for others and host in the cloud. These external users
are not the people on your team that built the service and that have authorization
to use your cloud account. These users are “customers” of the service, and you
want to authorize them individually.
In developing such mechanisms, SSL and HTTPS are certainly important, as
are passwords. You can create access control lists that can add some protection if
you have a way to authenticate your users. Another solution to the authentication
problem is to use a third-party authentication system. We have all seen online
services that allow us to log in using our Facebook or Google identity and password.
The Azure app service provides a simple tool that you can use to enable Facebook,
Google, or Microsoft as the authentication provider for your service.
The Globus tools can also help with authentication and authorization. We
described in chapter 11 how Globus Auth can be used to authenticate with a
variety of identity providers, including many university authentication systems,
and h ow access tokens can be used for authorization. We also described how the
Globus Auth SDK can be used to develop services and clients that apply these
mechanisms. The Globus Genomics system described in chapter 14 illustrates the
use of these mechanisms.
15.6 Summary
Cloud security is a major concern for many users, as it should be. It is also a
complex topic, touching as it does on many cloud capabilities. In this chapter, we have
presented only a light overview of the topic. If you are using the cloud, the first
important issue is controlling who can access your data. Two types of access are
involved here: people on your team to whom you have granted cloud accounts and
external people with whom you want to share your data. Different mechanisms
handle each case. IAM roles are good for the former case, and access keys and
secure signatures are good for the latter. Globus Sharing is another powerful tool
for sharing data securely with collaborators.
The second issue that we addressed is controlling access to your VMs and
containers. Major issues to consider here are poisoned images, illicit access,
intercepted communication, and information leakage. These concerns are handled
with a variety of mechanisms. For poisoned images, the key is ascertaining the
provenance of the image, deciding whether you trust it, and installing all provided
malware protection. For illicit access, you must make sure that the software
that you are running is free of exploits and that you manage access through
SSH by adding the correct public keys. If you are concerned about intercepted
communications, then you need to consider using a VPN. Information leakage
is similar to the illicit access problem. A common problem is SSH backdoors
on VMs that can easily be used to compromise your VM. You need to check the
authorized_keys file in the .ssh directory to make sure it is clean.
We also considered the problem of sharing secrets among containers. We
described how Amazon’s IAM role system, Azure’s RBAC, and Docker’s swarm
secret sharing systems make such sharing possible. Moreover, we described how
you can address security in the services that you create and expose to the world.
15.7 Resources
The NIST Cloud Computing Security Reference Architecture provides an
extensive review of cloud security issues and approaches. The Cloud Security
Alliance produces and collects much relevant material
on technologies and best practices. Chen et al. and Hashizume et al.
provide useful reviews of issues. Amazon and Azure review best practices
for VM security management on their systems. Huang et al. survey academic
research relating to IaaS security.
Illustrating the complexity of the security compliance environment, Amazon lists
more than 50 information sources and documents
relating to certification and attestation programs in which they participate; laws,
regulations, and privacy policies that apply in different contexts and countries;
and security-related frameworks. Few of these rules apply to most work on the
cloud, but this breadth of material emphasizes the importance of getting expert
guidance when working with sensitive data, whether in the cloud or elsewhere.