How to Use Amazon EC2 Instance Store Encryption to Protect Data at Rest

1. Create an Amazon S3 bucket

The S3 bucket stores the encrypted password file; the file system is encrypted with that password (key). When an Amazon EC2 instance boots, it copies the file, reads the encrypted password, decrypts it, and retrieves the plaintext password, which is then used to encrypt the file system on the instance store disk. In this first step, you create an S3 bucket to hold the encrypted password file and apply the necessary permissions. If you use an Amazon VPC endpoint for Amazon S3, you must also add permissions to the bucket that allow access from the endpoint.

  1. Sign in to the Amazon S3 console and select “Create Bucket”.
  2. Enter the bucket name in the “Bucket Name” box, then click “Create”.
  3. All the details of the newly created bucket will appear in the right pane.
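The console steps above can also be sketched with the AWS CLI; the bucket name and region here are placeholders, not values from the original walkthrough:

```shell
# Create the bucket that will hold the encrypted password file.
# "your-bucket-name" and the region are illustrative; pick your own values.
aws s3 mb s3://your-bucket-name --region us-east-1

# Verify the bucket now exists.
aws s3 ls | grep your-bucket-name
```

This requires an account with `s3:CreateBucket` permission and configured AWS credentials.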

2. Configure the IAM role and permissions for the created S3 bucket

The encrypted password file is read from S3 and then decrypted using AWS Key Management Service (KMS). The IAM policy configured in this step lets an instance assume a role with the right access permissions to the S3 bucket. “your-bucket-name” refers to the bucket used to store the password file.

  1. Sign in to the AWS Management Console and open the IAM console.
  2. In the navigation pane, select “Policies”.
  3. Afterward, click the “Create Policy” option.
  4. Then, select the “Create Your Own Policy” option.
  5. Enter a name and a description for the policy, then proceed to the next step.
  6. Copy and paste the following policy at this point.
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Stmt1478729875000",
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject"
                ],
                "Resource": [
                    "arn:aws:s3:::<your-bucket-name>/LuksInternalStorageKey"
                ]
            }
        ]
    }
  7. Then, select “Create Policy”.
  8. To elaborate on this policy: it grants read access to the bucket, so the encrypted password stored inside it can be read. The IAM role that EC2 will use with this policy now needs to be configured.

  9. In the IAM console, select “Roles”.
  10. Choose “Create New Role”.
  11. In the first step, “Role Name”, enter a name for the role and press “Next Step”.
  12. In the second step, “Select Role Type”, select “Amazon EC2” and press “Next Step”.
  13. In the third step, “Established Trust”, press “Next Step”.
  14. In the fourth step, “Attach Policy”, select the policy created earlier.
  15. In the fifth step, “Review”, review the configuration before finishing. The IAM role just created can now be used when launching new EC2 instances, giving them permission to access the encrypted password file stored in the S3 bucket.
  16. The newly created IAM role is then listed on the “Roles” page.
  17. Finally, select “Roles” and then select the newly created role.
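The same role setup can be sketched with the AWS CLI. The role name, policy ARN placeholders, and file names below are illustrative, not part of the original walkthrough:

```shell
# Trust policy letting EC2 assume the role (the standard EC2 trust relationship).
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Create the role, attach the S3 read policy created earlier, and wrap the
# role in an instance profile so EC2 can use it. Names are illustrative.
aws iam create-role --role-name EncFSInstanceRole \
    --assume-role-policy-document file://trust.json
aws iam attach-role-policy --role-name EncFSInstanceRole \
    --policy-arn arn:aws:iam::<account-id>:policy/<your-policy-name>
aws iam create-instance-profile --instance-profile-name EncFSInstanceRole
aws iam add-role-to-instance-profile --instance-profile-name EncFSInstanceRole \
    --role-name EncFSInstanceRole
```

Replace `<account-id>` and `<your-policy-name>` with your own values before running.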

3. Encrypt a secret password with KMS and store it in the S3 bucket

In order to accomplish this step, one has to use the AWS CLI. Fortunately, Amazon Linux EC2 instances include the AWS CLI by default. It can also be installed on Windows, Mac, or Linux systems.

  1. Type the following commands in the AWS CLI. They use KMS to encrypt the password. Replace “us-east-1” with your region, and note that creating keys and putting objects in S3 requires specific permissions, which must be in place before running these commands.
    aws --region us-east-1 kms encrypt --key-id 'alias/EncFSForEC2InternalStorageKey' --plaintext "ThisIs-a-SecretPassword" --query CiphertextBlob --output text | base64 --decode > LuksInternalStorageKey
    
    aws s3 cp LuksInternalStorageKey s3://<bucket-name>/LuksInternalStorageKey
  2. As a result of these commands, the file named “LuksInternalStorageKey” contains the encrypted password.
  3. The key alias, “EncFSForEC2InternalStorageKey”, is a name that is useful for identifying different keys.
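The commands above assume the key alias already exists. If it does not, a sketch of creating the KMS key and alias might look like the following; the key description is illustrative, and `kms:CreateKey` and `kms:CreateAlias` permissions are required:

```shell
# Create a KMS key and capture its ID.
KEY_ID=$(aws --region us-east-1 kms create-key \
    --description "Key for EC2 instance store file system" \
    --query KeyMetadata.KeyId --output text)

# Give the key the alias used by the encrypt command above.
aws --region us-east-1 kms create-alias \
    --alias-name alias/EncFSForEC2InternalStorageKey \
    --target-key-id "$KEY_ID"
```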

 

4. Make the KMS key accessible to the role

  1. In the navigation pane of the IAM console, choose “Encryption keys”.
  2. Then, choose the key alias named “EncFSForEC2InternalStorageKey”.
  3. To add the new role, scroll down to “Key Policy” and select “Add” under “Key Users”.
  4. Choose the newly created role and press “Attach”.
  5. The role now has access permission to the key.

 

5. Configure EC2 with the role and run the configuration

  1. Launch a new instance from the EC2 console. In the third step, “Configure Instance Details”, select the IAM role created earlier.
  2. Expand the “Advanced Details” section on the same screen.
  3. Inside “User Data”, keep “As text” checked (the default). Then, copy and paste the following script into the text box.
    #!/bin/bash
    ## Initial setup to be executed on boot
    ##====================================
    
    # Create an empty file. This file will be used to host the file system.
    # In this example we create a 2 GB file called secretfs (Secret File System).
    dd of=secretfs bs=1G count=0 seek=2
    
    # Lock down normal access to the file.
    chmod 600 secretfs
    
    # Associate a loopback device with the file.
    losetup /dev/loop0 secretfs
    
    # Copy the encrypted password file from S3. The password is used to configure LUKS later on.
    aws s3 cp s3://an-internalstoragekeybucket/LuksInternalStorageKey .
    
    # Decrypt the password from the file with KMS, save the secret password in LuksClearTextKey.
    LuksClearTextKey=$(aws --region us-east-1 kms decrypt --ciphertext-blob fileb://LuksInternalStorageKey --output text --query Plaintext | base64 --decode)
    
    # Encrypt storage in the device. cryptsetup will use the Linux
    # device mapper to create, in this case, /dev/mapper/secretfs.
    # Initialize the volume and set an initial key.
    echo "$LuksClearTextKey" | cryptsetup -y luksFormat /dev/loop0
    
    # Open the partition, and create a mapping to /dev/mapper/secretfs.
    echo "$LuksClearTextKey" | cryptsetup luksOpen /dev/loop0 secretfs
    
    # Clear the LuksClearTextKey variable because we don't need it anymore.
    unset LuksClearTextKey
    
    # Check its status (optional).
    cryptsetup status secretfs
    
    # Zero out the new encrypted device.
    dd if=/dev/zero of=/dev/mapper/secretfs
    
    # Create a file system and verify its status.
    mke2fs -j -O dir_index /dev/mapper/secretfs
    
    # List file system configuration (optional).
    tune2fs -l /dev/mapper/secretfs
    
    # Mount the new file system to /mnt/secretfs.
    mkdir /mnt/secretfs
    mount /dev/mapper/secretfs /mnt/secretfs
  4. On your account, enable CloudTrail.
  5. Finally, launch the EC2 instance. The instance will copy the password file from S3, use KMS to decrypt it, and configure an encrypted file system.

 

References

https://aws.amazon.com/blogs/security/how-to-protect-data-at-rest-with-amazon-ec2-instance-store-encryption/

http://searchhealthit.techtarget.com/definition/HIPAA

https://aws.amazon.com/s3/

https://digitalguardian.com/blog/what-data-encryption

https://en.wikipedia.org/wiki/Advanced_Encryption_Standard

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html

How does Amazon AWS deal with data encryption?

What is PCI DSS?

To begin with, the Payment Card Industry Data Security Standard (PCI DSS) provides a checklist that organizations handling online credit card payments must comply with. The list ensures organizations follow appropriate security standards to prevent breaches. Merchants who fail to comply face substantial financial penalties.

What is HIPAA?

The Health Insurance Portability and Accountability Act (HIPAA) is legislation concerned with keeping medical information as safe as possible by ensuring data privacy and other security provisions.

Cyber attacks and ransomware attacks target health insurance data, affecting both insurers and providers. Such attacks are of great concern to HIPAA, which aims to protect this sensitive data from breaches on compromised systems.

Why Data Encryption at Rest?



An essential requirement that both PCI DSS and HIPAA enforce is that sensitive data (cardholder data and health insurance data, respectively) be stored in an encrypted format.

Before we proceed, let’s get more insight into the notion of data encryption. Encrypting data means transforming it into another form, or code, so that access to the decryption key is required to understand the stored data.

Almost all organizations depend on this technique, since it has proved extremely popular and effective in securing data. In more detail, two sorts of encryption are commonly in use around the globe: asymmetric encryption (the public key encryption method) and symmetric encryption.

Symmetric encryption has an advantage over asymmetric encryption: speed. During the process, the encryption key must be exchanged between sender and recipient before the data can be decrypted.

Accordingly, companies must distribute and manage huge quantities of keys to use this encryption method. It has therefore become common to encrypt data with a symmetric algorithm and then use an asymmetric algorithm to exchange the secret key.
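This hybrid pattern can be demonstrated locally with OpenSSL. The file names, key size, and sample data below are purely illustrative, not part of the AWS setup described elsewhere in this article:

```shell
# Generate an RSA key pair (the asymmetric half).
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out hybrid_priv.pem 2>/dev/null
openssl pkey -in hybrid_priv.pem -pubout -out hybrid_pub.pem

# Encrypt the bulk data with a random symmetric key (fast).
openssl rand -hex 32 > session.key
echo "bulk data" > data.txt
openssl enc -aes-256-cbc -pbkdf2 -in data.txt -out data.enc -pass file:session.key

# Exchange the symmetric key by encrypting it with the RSA public key.
openssl pkeyutl -encrypt -pubin -inkey hybrid_pub.pem -in session.key -out session.key.enc

# Recipient side: unwrap the session key with the private key, then decrypt the data.
openssl pkeyutl -decrypt -inkey hybrid_priv.pem -in session.key.enc -out session.key.dec
openssl enc -aes-256-cbc -pbkdf2 -d -in data.enc -pass file:session.key.dec
```

The slow asymmetric operation touches only the small session key, while the fast symmetric cipher handles the bulk data.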

On the other hand, asymmetric encryption, or public-key cryptography, uses two different keys: one public and one private. While the public key may be known and shared by everyone, the private key is kept highly protected for security purposes.

One of the most widely used encryption algorithms is the Rivest-Shamir-Adleman (RSA) algorithm. Sensitive data can be secured with this algorithm, which relies on the public key cryptography technique. Insecure networks such as the internet are a natural place to apply it.

The confidentiality, integrity, authenticity, and non-repudiation of electronic communications and data are assured by this algorithm, which encrypts data using the public and private keys before sending it over an insecure network. Digital signatures are part of this process as well.
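A minimal digital-signature round trip with OpenSSL shows the two keys in action; the file names and message here are illustrative:

```shell
# Generate an RSA key pair and extract the public half.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out sig_priv.pem 2>/dev/null
openssl pkey -in sig_priv.pem -pubout -out sig_pub.pem

# Sign a message with the private key.
echo "wire transfer: 100 USD" > message.txt
openssl dgst -sha256 -sign sig_priv.pem -out message.sig message.txt

# Anyone holding the public key can verify authenticity and integrity.
openssl dgst -sha256 -verify sig_pub.pem -signature message.sig message.txt
```

Only the private-key holder could have produced the signature, but verification needs nothing secret; this is the non-repudiation property mentioned above.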

What is AES?



The Advanced Encryption Standard (AES), also known as Rijndael, is a means of encrypting data. The U.S. National Institute of Standards and Technology originally published the specification in 2001.

With this standard, the Data Encryption Standard (DES) became superseded and is no longer used for advanced encryption purposes by organizations seeking high levels of security. The US government also adopted AES for data encryption.

The standard uses a symmetric key encryption algorithm, which means the exact same key both encrypts and decrypts the data.
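A quick local round trip with OpenSSL illustrates the symmetric property; the passphrase, file names, and sample text are illustrative:

```shell
# Encrypt a file with AES-256 in CBC mode using one shared passphrase.
echo "cardholder data" > record.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -in record.txt -out record.enc -pass pass:SharedSecret123

# The very same passphrase decrypts the ciphertext.
openssl enc -aes-256-cbc -pbkdf2 -d -in record.enc -pass pass:SharedSecret123
```

Whoever holds the passphrase can both encrypt and decrypt, which is exactly why key distribution matters so much for symmetric schemes.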

What does Amazon S3 offer in this regard?


Amazon Simple Storage Service (Amazon S3) is a service that makes collecting, storing, and analyzing data of different formats and sizes possible and easy. Through Amazon Web Services (AWS), one can store data and retrieve it at any time.

Sources of such data vary from websites and mobile apps to corporate applications and data from sensors or Internet of Things (IoT) devices. Media storage and distribution can depend on Amazon S3, as can the “data lake” for big data analytics. Even serverless computing applications can utilize Amazon S3.

Photos, videos, and other data captured from mobile devices, backups of mobile and other devices, machine backups, machine-generated log files, IoT sensor streams, and high-resolution images can all make efficient use of Amazon S3.

It is then possible to configure Amazon S3 buckets for server-side encryption (SSE) using AES-256 encryption.
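For example, an individual upload can request SSE with AES-256 from the CLI; the bucket and file names below are placeholders:

```shell
# Ask S3 to encrypt the object at rest with AES-256 server-side encryption.
aws s3 cp report.csv s3://your-bucket-name/report.csv --sse AES256
```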

What can Amazon EC2 offer for decryption?

Amazon EC2 offers instance storage, which stores data temporarily. Information that changes frequently, such as buffers, caches, and scratch data, is typically kept on instance storage, but in an unencrypted format.

One could utilize Linux dm-crypt in this process; it is a Linux kernel-level encryption mechanism. An encrypted file system can be mounted and made available to the operating system, after which applications can use all files in the file system with no further interaction.

Dm-crypt basically resides between the physical disk and the file system; data is encrypted as the operating system writes it to the disk.

Finally, it is important to note that an application never knows anything about this encryption, because applications use a specific mount point to store and retrieve files while encryption happens as the data is stored on the disk. The data is therefore useless if the hard disk is stolen or lost.


References

https://aws.amazon.com/blogs/security/how-to-protect-data-at-rest-with-amazon-ec2-instance-store-encryption/

http://searchhealthit.techtarget.com/definition/HIPAA

https://aws.amazon.com/s3/

https://digitalguardian.com/blog/what-data-encryption

https://en.wikipedia.org/wiki/Advanced_Encryption_Standard

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/InstanceStorage.html

 

PCI and MFA – what it means to you



I just finished reading the PCI Guru blog post about multi-factor authentication (MFA) in the Payment Card Industry Data Security Standard (PCI DSS). PCI Guru Jeffrey Hall explains that requirement 8.3.1 doesn’t go into effect until February 1, 2018. However, it states that you should:

“Incorporate multi-factor authentication for all non-console access into the CDE for personnel with administrative access”

CDE stands for Cardholder Data Environment. In his blog post, Jeffrey states that several organizations already have MFA implemented across the entire network and therefore believe they are already compliant. Furthermore, Jeffrey cites a more particular snippet of requirement 8.3.1:

“If the CDE is segmented from the rest of the entity’s network, then an administrator needs to use multi-factor authentication when connecting to a CDE system from a non-CDE network. Multi-factor authentication is moreover implementable at the network level or at system/application level; therefore it does not have to be both. If the administrator uses MFA when logging into the CDE network, then, they do not also need to use MFA to log into a particular system or application within the CDE.”

Jeffrey further makes his point by saying:

“We need to remember what drove the development of requirement 8.3.1 was a lesson learned from the Target and similar breaches. In all of these breaches, system administrators were spear phished allowing the attackers to access the CDE in one way or another. Requirement 8.3.1 minimizes this threat by requiring MFA to gain access to the CDE. So even if an attacker obtains an administrator’s credentials or compromises an administrator’s system, that fact in and of itself would not compromise the CDE.

This is why the guidance for 8.3.1 puts the MFA border at the CDE. If you have MFA implemented in order to gain access to your network, how does that stop the threat of phishing? It does not. A spear phishing attack against such an MFA implementation defeats the MFA because it is already applied. The MFA in this scenario does not stop access to the CDE.”

Source:

https://pciguru.wordpress.com/2017/04/10/mfa-it-is-all-in-the-implementation/

----

OK, as much as I heavily use two-factor authentication myself and recommend it to customers, please don’t think it’s a silver bullet. Yes, attackers have been bypassing 2FA/MFA solutions for quite some time now.

Jeffrey, your recommendation isn’t wrong, but the logic of the explanation for why to use it for the CDE is. Yes, tell the customer to implement MFA for the cardholder data environment.

But rather than that, simply explain and insist that the customer use MFA to access the cardholder environment. We need not go knee-deep into explaining that a spear-phishing attack defeats an MFA implementation applied to the entire network, yet somehow would not affect the MFA implementation applied to the cardholder data environment.

 

Besides, the council has released some guidance on MFA, and you can look at it here:

I certainly liked this article from a technical perspective:

http://blog.securitymetrics.com/2016/10/2-things-know-about-32-multi-factor-authentication-updates.html

Similarly, I liked this article most from an Assessor/Security professional’s point of view:

https://blog.anitian.com/pci-dss-3-2-multi-factor-authentication-clash/

Particularly, I love how Adam Gaydosh finishes the blog post:

“While the PCI Community meeting is always good for keeping up on the latest issues, we find that now more than ever, the PCI-DSS needs pragmatism. The debates over MFA are interesting from an academic perspective but offered little practical insight, other than the fact that folks are quite to argue a position without understanding the details. MFA is still one of the best ways to shrink an attack surface area and increase security.

This debate also shows how PCI, like any complex standard, quickly devolves into nitpicking debates over minutiae. This is particularly why hands-on technology experience is such an important skill for any PCI assessor. Finally, your auditor must have the ability to translate the intent of the standard into the operational realities of your environment.”

Source:

https://blog.anitian.com/pci-dss-3-2-multi-factor-authentication-clash/