How to Use Amazon EC2 Instance Store Encryption to Protect Data at Rest?

1. Create an Amazon S3 bucket

The S3 bucket stores the encrypted password file; the file system is encrypted using that password or key. When an Amazon EC2 instance boots, it copies the file from S3, reads the encrypted password, decrypts it, and retrieves the plaintext password. That password is then used to encrypt the file system on the instance store disk. In this first step, create an S3 bucket to hold the encrypted password file, then apply the necessary permissions. If you use an Amazon VPC endpoint for Amazon S3, additional permissions on the bucket are necessary to enable endpoint access.

  1. Sign in to the Amazon S3 console and select “Create Bucket”.
  2. Enter a name in the “Bucket Name” box, then click “Create”.
  3. The details of the newly created bucket appear in the right pane.

2. Configure IAM roles and permissions for the created S3 bucket

After the encrypted password file is read from S3, it is decrypted using AWS Key Management Service (KMS). The IAM policy configured in this step lets the instance assume a role with the right access permissions to the S3 bucket. “your-bucket-name” is the bucket used to store the password file.

  1. Sign in to the AWS Management Console and open the IAM console.
  2. In the navigation pane, select “Policies”.
  3. Click “Create Policy”.
  4. Select “Create Your Own Policy”.
  5. Give the policy a name and a meaningful description, then proceed to the next step.
  6. Copy and paste the following policy.
        {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "Stmt1478729875000",
                    "Effect": "Allow",
                    "Action": [
                        "s3:GetObject"
                    ],
                    "Resource": [
                        "arn:aws:s3:::your-bucket-name/*"
                    ]
                }
            ]
        }
  7. Then, select “Create Policy”.
  8. To elaborate on the previous policy: it grants read access to the bucket, so the encrypted password can be read because it is stored inside that bucket. The IAM role now needs to be configured, since EC2 will use the previous policy.
  9. In the IAM console, select “Roles”.
  10. Choose “Create New Role”.
  11. In the first step, “Role Name”, enter a name for the role and then press “Next Step”.
  12. In the second step, “Select Role Type”, select “Amazon EC2” and then press “Next Step”.
  13. In the third step, “Established Trust”, press “Next Step”.
  14. In the fourth step, “Attach Policy”, select the policy created earlier. The following figure illustrates this point.
  15. In the fifth step, “Review”, check the configuration before finishing. The IAM role just created can now be used with any new launch of EC2 instances, granting them access to the encrypted password file stored in the S3 bucket.
  16. The newly created IAM role is then listed on the “Roles” page.
  17. Finally, select “Roles” and then select the newly created role, as illustrated in the following image.
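Before pasting the policy into the console, it can help to confirm locally that the document is valid JSON. A quick check (a sketch, assuming `python3` is available; “your-bucket-name” remains a placeholder):

```shell
# Write the policy to a file and validate it before pasting it into
# the IAM console ("your-bucket-name" is a placeholder).
cat > policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1478729875000",
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": ["arn:aws:s3:::your-bucket-name/*"]
        }
    ]
}
EOF
python3 -m json.tool policy.json > /dev/null && echo "policy is valid JSON"
```

A malformed paste (a stray comma, a missing brace) is one of the most common reasons the console rejects a policy, and this check catches it early.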

3. Encrypt a secret password with KMS and store it in the S3 bucket

To accomplish this step, use the AWS CLI. Amazon Linux EC2 instances include the AWS CLI by default, and it can also be installed on Windows, Mac, or Linux systems.

  1. Type the following commands in the AWS CLI. They use KMS to encrypt the password. Replace “us-east-1” with your region. In addition, creating keys and putting objects in S3 requires specific permissions that must be in place before running these commands.
    aws --region us-east-1 kms encrypt --key-id 'alias/EncFSForEC2InternalStorageKey' --plaintext "ThisIs-a-SecretPassword" --query CiphertextBlob --output text | base64 --decode > LuksInternalStorageKey
    aws s3 cp LuksInternalStorageKey s3://<bucket-name>/LuksInternalStorageKey
  2. The file “LuksInternalStorageKey” now contains the encrypted password, as produced by the previous commands.
  3. The key alias, “EncFSForEC2InternalStorageKey”, is a friendly name that helps identify different keys.
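The `--query CiphertextBlob --output text` output is base64 text, which is why the first command pipes it through `base64 --decode` before writing the binary ciphertext to disk. That decode step can be illustrated locally with a stand-in string (not real KMS output):

```shell
# KMS returns the ciphertext blob base64-encoded; decode it before storing.
# The blob below is a stand-in string, not real KMS output.
blob="VGhpc0lzLWEtU2VjcmV0UGFzc3dvcmQ="
printf '%s' "$blob" | base64 --decode
# → ThisIs-a-SecretPassword
```

Skipping the decode and storing the base64 text directly would later feed the wrong bytes to `kms decrypt`, which expects the raw ciphertext.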


4. Make the KMS key accessible to the role

  1. Open the IAM console and, in the navigation pane, choose “Encryption keys”.
  2. Then, choose the key alias named “EncFSForEC2InternalStorageKey”.
  3. To add the newly created role as a key user, scroll down to “Key Policy” and select “Add” under “Key Users”.
  4. Choose the newly created role and then press “Attach”.
  5. This grants the role access permission to the key.


5. Configure EC2 with the role and the user-data script

  1. Launch a new instance from the EC2 console. In the third step, “Configure Instance Details”, select the IAM role as shown in the following figure.
  2. Expand the “Advanced Details” section on the same screen.
  3. Under “User Data”, keep “As text” checked (the default), then copy and paste the following script into the text box.
    ## Initial setup to be executed on boot
    # Create an empty file. This file will be used to host the file system.
    # In this example we create a 2 GB file called secretfs (Secret File System).
    dd of=secretfs bs=1G count=0 seek=2
    # Lock down normal access to the file.
    chmod 600 secretfs
    # Associate a loopback device with the file.
    losetup /dev/loop0 secretfs
    # Copy the encrypted password file from S3. The password is used to configure LUKS later on.
    aws s3 cp s3://an-internalstoragekeybucket/LuksInternalStorageKey .
    # Decrypt the password from the file with KMS, save the secret password in LuksClearTextKey
    LuksClearTextKey=$(aws --region us-east-1 kms decrypt --ciphertext-blob fileb://LuksInternalStorageKey --output text --query Plaintext | base64 --decode)
    # Encrypt storage in the device. cryptsetup will use the Linux
    # device mapper to create, in this case, /dev/mapper/secretfs.
    # Initialize the volume and set an initial key.
    echo "$LuksClearTextKey" | cryptsetup -y luksFormat /dev/loop0
    # Open the partition, and create a mapping to /dev/mapper/secretfs.
    echo "$LuksClearTextKey" | cryptsetup luksOpen /dev/loop0 secretfs
    # Clear the LuksClearTextKey variable because we don't need it anymore.
    unset LuksClearTextKey
    # Check its status (optional).
    cryptsetup status secretfs
    # Zero out the new encrypted device.
    dd if=/dev/zero of=/dev/mapper/secretfs
    # Create a file system and verify its status.
    mke2fs -j -O dir_index /dev/mapper/secretfs
    # List file system configuration (optional).
    tune2fs -l /dev/mapper/secretfs
    # Mount the new file system to /mnt/secretfs.
    mkdir /mnt/secretfs
    mount /dev/mapper/secretfs /mnt/secretfs
  4. Enable CloudTrail on your account.
  5. Finally, launch the EC2 instance. The instance will copy the password file from S3, use KMS to decrypt it, and configure an encrypted file system.
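One line of the script deserves a note: `dd ... count=0 seek=2` creates a sparse 2 GB backing file without writing 2 GB of zeros, which is why it returns instantly. Unlike the `losetup`/`cryptsetup` steps, this part can be tried safely without root:

```shell
# Create a sparse 2 GB backing file: seek to the 2 GB mark, write nothing.
dd of=secretfs bs=1G count=0 seek=2 2>/dev/null
stat -c 'apparent size: %s bytes' secretfs   # 2147483648 bytes, ~0 on disk
chmod 600 secretfs    # lock down access before use
rm secretfs           # clean up the demo file
```

The blocks only consume real disk space as the encrypted file system writes to them.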



How to become PCI compliant and still be breached?

What do I need to know about PCI?

  • PCI DSS and PCI SSC:

To minimize the risk inherent in online card payments, there exists the Payment Card Industry Data Security Standard (PCI DSS), designed and maintained by the PCI Security Standards Council (PCI SSC). In a nutshell, it aims at securing online transactions to a great extent.

Credit card companies put these standards into action themselves. The standards compel every retailer that accepts any means of online payment to comply with a long checklist, ensuring that transactions operate in a safe environment. The checklist mainly strives to make sure that data and networks are highly secure.

  • Social Engineering:


Social engineering is, broadly, the psychological manipulation of people into performing actions or divulging information. A computer security professional has to understand such manipulations in order to determine who was responsible for an attack or a similar security incident.

The term comes up extensively in discussions of information security in general, because anyone in an organization could be manipulated into revealing confidential data. Those responsible for information security ought to detect and investigate such incidents.

In one way or another, many consider social engineering a confidence trick. Its purpose varies from information gathering to fraud or system access, and it is often one of many steps in a more complex fraud scheme. The concept appears across the social sciences, but computer security is its main domain.

There are plenty of techniques for carrying out a social engineering attack, including pretexting, diversion theft, phishing, spear phishing, water holing, baiting, quid pro quo, tailgating, and many others.

  • SSL/TLS/IPsec


To ensure secure transmission of data packets across a network, one can rely on three internet protocols that protect data in transit. Internet Protocol Security (IPsec) performs mutual authentication between agents when the session begins, and cryptographic keys are negotiated during the session. IPsec can operate between a pair of hosts (host-to-host), between a pair of security gateways (network-to-network), or between a security gateway and a host (network-to-host).

It is important to understand that IPsec works at the Internet Layer, while two other, similar protocols operate at higher layers: Transport Layer Security (TLS) at the Transport Layer, and Secure Shell (SSH) at the Application Layer.

An insight into the problem

While several organizations claim complete compliance with PCI DSS, many still suffer actual breaches and lose a great deal of money and investment. We have to dig into the roots of this problem in order to take accurate action accordingly.

First of all, let’s discuss the reasons for this state of affairs.

  1. One hundred percent security is unachievable by any means. Although the PCI DSS standards are excellent and lead to much more secure online payment methods, they can never be the end or the ultimate goal of an organization.

There can never be perfect security for an organization; that is why banks still experience robberies to this day, regardless of how secure they are. The advantage of such standards lies in the fact that the number of successful attacks becomes much smaller, but it never reaches zero.

  2. Several methods can be used to manipulate controls that are compliant with PCI standards, leading to a breach even when the organization is PCI compliant. The following points discuss these methods:
    1. Imagine that a professional attacker develops a fresh piece of malware and manages to get it through all the antivirus and anti-malware systems inside the organization. This is possible because new malware usually has no signature that anti-malware software can recognize. Consequently, even with an antivirus running on the organization’s network or system, new malware can initially pass through undetected.
    2. As noted, malware only has to find its way into the network, and the desired data can be collected over time. But how does the malware get into the network in the first place? The answer is social engineering, and in particular the spear phishing attack. This term refers to emails that seem to come from a friend or someone inside the organization, while the actual sender is the attacker, who is after personal data such as passwords, credit card numbers, bank account numbers, and the financial information on your personal computer (PC). One effective way to perform such an attack is to send a link from a handful of the organization’s email addresses to the addresses of other peers inside the same organization; when someone simply clicks the link, the malware spreads through the network. That is why security training is highly recommended to cut down the alarming number of such attacks.
    3. The problem here is that everything seems normal, with no sign of a threat from such malware. Why is that? When an attacker launches malware that scans a network for open ports or other vulnerabilities, the scans run very slowly so that they generate no heavy traffic, and the traffic is therefore treated as normal. By contrast, when a penetration tester scans a network for vulnerabilities, high traffic is generated and is detected as someone trying to scan the network.
    4. Furthermore, the backdoor software used by an attacker relies on protocols such as SSL/TLS/IPsec to encrypt its transmissions over ports 80 and 443, which are both open for internet access. Such encrypted packets are not usually recognizable as malware by antivirus programs.


How to apply additional security measures besides PCI DSS?

What do I need to know about PCI DSS?

  • PCI DSS and PCI SSC:

To minimize the risk inherent in online card payments, there exists the Payment Card Industry Data Security Standard (PCI DSS), designed and maintained by the PCI Security Standards Council (PCI SSC). In a nutshell, it aims at securing online transactions to a great extent.

Credit card companies put these standards into action themselves. The standards compel every retailer that accepts any means of online payment to comply with a long checklist, ensuring that transactions operate in a safe environment. The checklist mainly strives to make sure that data and networks are highly secure.

  • Spear phishing:


Spear phishing is mainly a type of attack built on the art of social engineering. In this type of phishing, a few end users receive customized emails in an attempt to obtain their private information fraudulently.

A philosophical question to ask is this: if spear phishing describes the behavior just explained, what then is the difference between normal phishing and spear phishing?

Whereas phishing targets a large group of people with untailored emails and no prior research, expecting that a small number of recipients will respond, spear phishing targets a specific group of people with customized emails, sent after careful research on that group. They are targeted with the right message, one they are expected to respond to positively and subsequently be tricked by.

Phishing attacks reach a significant number of people but achieve a very small success rate relative to the number of links they send. Spear phishing attacks, by contrast, do not have such a large target group, but around half of that group click the sent links or open the attachments.

  • Tokenization


Whenever sensitive data is handled, tokenization has to be mentioned as a way to secure it. Fundamentally, the valuable information is replaced with a token: a number that makes no sense to an attacker, or even to whoever uses it, and serves no purpose outside this process. The token is mapped back to the specific piece of valuable information associated with it.

This process requires a tokenization system through which tokens can be requested, generated, and detokenized back into the original data. Data protected this way is secured to a great degree, but one aspect still needs attention: the security of the tokenization system itself. The system must be secured following best practices such as standards for sensitive data protection, secure storage, audit, authentication, and authorization.
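The replace-and-map-back flow can be sketched in a few lines of bash (a toy in-memory vault, not a production tokenization system; the card number and token format are made up for illustration):

```shell
#!/usr/bin/env bash
# Toy tokenization vault: swap a card number for a random token and keep
# the mapping server-side. A real system adds storage, auth, and audit.
declare -A vault

tokenize() {                 # stores the mapping, sets $TOKEN
  TOKEN="tok_$(od -An -N8 -tx1 /dev/urandom | tr -d ' \n')"
  vault["$TOKEN"]="$1"
}

detokenize() {               # maps a token back to the original value
  printf '%s\n' "${vault[$1]}"
}

tokenize "4111111111111111"
echo "token:     $TOKEN"                  # meaningless to an attacker
echo "recovered: $(detokenize "$TOKEN")"  # → 4111111111111111
```

Note that the token carries no information about the card number at all; the only way back is through the vault, which is exactly why the vault itself must be the most protected component.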

  • Jump Server


It is essential to understand the concept of a jump server when talking about network security and securing the data flow within a network. Devices in a separate security zone can be managed through such a server. One of the most common examples of such a zone is the demilitarized zone (DMZ), which trusted networks or computers can manage through a jump server.

A jump server must have specific administrators with authorized credentials in order to grant access to the DMZ, for instance. All other access attempts from non-authorized users have to be logged for later audit. The server can act as a single audit point for traffic, securing the data inside the DMZ to the greatest extent.
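In practice, a jump server is often wired in through the SSH client configuration. A minimal fragment (hostnames and user name are placeholders for illustration):

```
# ~/.ssh/config — all DMZ hosts are reached only via the jump server,
# so the jump server becomes the single audit point for admin traffic.
Host dmz-*
    ProxyJump admin@jump.example.com
    User admin
```

With this in place, `ssh dmz-web01` transparently hops through the jump server, and firewall rules can then deny direct SSH to the DMZ from anywhere else.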

How to apply additional security measures besides PCI DSS?

  1. An organization’s administration should first recognize that a breach is possible at any moment. Security standards can be set very high and followed very strictly, yet a security incident can still occur. What, then, is the benefit of security controls? They are mainly meant to keep the number of such events as low as possible. Besides, such controls make the probability of an attacker obtaining sensitive data very low. How? Imagine that an organization is breached: such restrictions play a significant role in identifying a risk or an attack before the attacker gets the desired information from the network.
  2. Highly sensitive data should no longer be saved or stored in the system, because as long as it is there, there is always a vulnerability that could be exploited to obtain it. If, on the other hand, an organization or merchant is obliged to keep such data, then tokenization is the perfect solution, since the data itself is not saved on the system.
  3. Isolate any sensitive data inside the network or the organization’s system. In this regard, approaches like Forrester’s “Zero Trust” model or McGladrey’s Ultra Secure approach can be followed to ensure a very high level of security for sensitive data.


  4. Another attractive solution is to minimize the number of people authorized to access sensitive data. Accordingly, whenever an incident occurs, information security staff have only a small focus group on which to apply social engineering analysis.
  5. A “jump box” or “jump server” should be used to force users to log in to that server before getting any access to sensitive data. The cardholder data environment is thereby restricted to those who can correctly log in to the jump box. This can be coupled with separate credentials required to gain access to the data. With full instrumentation of the jump box, all activity performed on it can be captured, and the box can then be monitored for suspicious access.
  6. Internet Protocol (IP) addresses reachable from inside an organization should be limited. While HTTP and HTTPS traffic should remain open to satisfy business needs, it cannot be left unrestricted to any desired IP address or URL. The solution is to apply proper IP address white lists or black lists, so that an attacker cannot simply operate from any IP address or URL to play around with the network.
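The white-list idea in the last point can be sketched with a small shell check (the addresses below are documentation examples, and a real deployment would enforce this at the proxy or firewall rather than in a script):

```shell
# Egress allow-list: permit a destination only if it appears in the list.
cat > allowlist.txt <<'EOF'
203.0.113.10
198.51.100.25
EOF

check_dest() {
  if grep -qxF "$1" allowlist.txt; then
    echo "allow $1"
  else
    echo "deny $1"
  fi
}

check_dest 203.0.113.10   # → allow 203.0.113.10
check_dest 192.0.2.99     # → deny 192.0.2.99
```

The default-deny shape is the important part: anything not explicitly listed is refused, so an attacker’s command-and-control address is blocked without anyone having to anticipate it.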



How does Amazon AWS deal with data encryption?

What is PCI DSS?

To begin with, the Payment Card Industry Data Security Standard (PCI DSS) provides a checklist with which organizations dealing with online credit card payments have to comply. The list ascertains that organizations follow appropriate security standards to prevent breaches from occurring. Merchants who refuse to comply face great financial penalties.

What is HIPAA?

The Health Insurance Portability and Accountability Act (HIPAA) is legislation concerned with keeping medical information as safe as possible by ensuring data privacy and other security provisions.

Cyber attacks and ransomware attacks are deployed against health insurance data, affecting both insurers and providers of the service. Such attacks are of great concern to HIPAA, which aims to protect this sensitive data from potential breaches on compromised systems.

Why Data Encryption at Rest?


An essential requirement that both PCI DSS and HIPAA enforce inside an organization’s system is that sensitive data, cardholder data or health insurance data respectively, be kept in an encrypted format.

Before we proceed, let’s get some more insight into the notion of data encryption. Encrypting data means transforming it into another form, or code, such that access to the decryption key is required to understand the stored data.

Almost all organizations depend on this technique, since it has proven extremely popular and effective in securing data. In more detail, two sorts of encryption are commonly used around the globe: asymmetric encryption, also called the public key encryption method, and symmetric encryption.

Symmetric encryption has the advantage over asymmetric encryption of speed. During the process, the encryption key must be exchanged between sender and recipient before the data can be decrypted.

Accordingly, huge quantities of keys have to be distributed and managed by companies in order to use this encryption method. Therefore, it has become usual for companies to use a symmetric algorithm to encrypt data and an asymmetric algorithm to exchange the secret key.

On the other hand, asymmetric encryption, or public-key cryptography, uses two different keys: one public and one private. While the public key is known and can be shared with everyone, the private key is highly protected for security purposes.

One of the most widely used encryption algorithms is the Rivest-Shamir-Adleman (RSA) algorithm. One can secure sensitive data with this algorithm, which relies on the public key cryptography technique. An insecure network such as the internet is a perfect place to harness it.

The confidentiality, integrity, authenticity, and non-repudiation of electronic communications and data are assured by this algorithm, which encrypts data using the public and private keys before sending it over an insecure network. Digital signatures are used within this process as well.
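The public/private split can be seen end to end with the OpenSSL command line (a local sketch, assuming the `openssl` CLI is installed; file names and the message are arbitrary):

```shell
# Generate an RSA key pair, encrypt with the public key,
# and decrypt with the private key.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 \
        -out priv.pem 2>/dev/null
openssl pkey -in priv.pem -pubout -out pub.pem

printf 'card ending in 1111' > msg.txt
openssl pkeyutl -encrypt -pubin -inkey pub.pem -in msg.txt -out msg.enc
openssl pkeyutl -decrypt -inkey priv.pem -in msg.enc
# → card ending in 1111
```

Anyone holding `pub.pem` can produce `msg.enc`, but only the holder of `priv.pem` can read it, which is exactly the property that makes RSA usable on an insecure network.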

What is AES?


The Advanced Encryption Standard (AES), also known as Rijndael, is a means of encrypting data. The U.S. National Institute of Standards and Technology established the specification in 2001.

With the adoption of this standard, the Data Encryption Standard (DES) was superseded and is no longer used for encryption in organizations seeking high levels of security. The US government also adopted AES and uses it for data encryption.

The standard adopts a symmetric key encryption algorithm, meaning that encrypting and decrypting the data both use the exact same key.
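A round trip with the OpenSSL CLI shows that single-key property (a sketch assuming `openssl` 1.1.1+ is installed; the password and plaintext are made up):

```shell
# The same password-derived key both encrypts and decrypts: symmetric.
printf 'cardholder data' | \
  openssl enc -aes-256-cbc -pbkdf2 -pass pass:s3cret -out data.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:s3cret -in data.enc
# → cardholder data
```

Change one character of the password on the decrypt side and the operation fails, which is the whole point: possession of the key is possession of the data.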

What does Amazon S3 offer in this regard?


Amazon Simple Storage Service (Amazon S3) is a service where collecting, storing, and analyzing data of different formats and sizes is possible and easy. Through Amazon Web Services (AWS), one can store data and retrieve it back.

Sources of such data vary from websites and mobile apps to corporate applications and data from sensors or Internet of Things (IoT) devices. Media storage and distribution can depend on Amazon S3, as can the “data lake” for big data analytics; even serverless computing applications can utilize it.

Photos, videos, and other data captured by mobile devices, backups of mobile and other devices, machine backups, machine-generated log files, streams created by IoT sensors, and high-resolution images can all efficiently make use of Amazon S3.

It is then possible to configure Amazon S3 buckets for server-side encryption (SSE) using AES-256 encryption.

What can Amazon EC2 offer for decryption?

One can use instance storage on Amazon EC2, which allows data to be stored temporarily. Information that changes frequently, such as buffers, caches, and scratch data, is most commonly kept on such instance storage, but in an unencrypted format.

One can utilize Linux dm-crypt in this process. It is essentially a Linux kernel-level encryption mechanism with which an encrypted file system can be mounted and made available to the operating system. Applications can then work with all files in the file system with no further interaction needed.

Dm-crypt basically resides between the physical disk and the file system. Data is encrypted as the operating system writes it to the disk, as shown in the following figure.

Finally, it is important to note that an application never knows anything about this encryption, because applications use a specific mount point to store and retrieve files while encryption occurs as the data is written to the disk. Therefore, the data is of no use if the hard disk is stolen or lost.




How to Address the PCI DSS Requirements for Data Encryption in Transit Using Amazon VPC?


The Payment Card Industry Data Security Standard (PCI DSS) provides a checklist with which all organizations dealing with online credit card payments have to comply. Companies follow the list in order to ascertain appropriate security standards that prevent breaches from occurring. Merchants who refuse to comply face great financial penalties.


One of the essential points of this compliance list is to encrypt data in transit. Amazon Virtual Private Cloud (Amazon VPC) provides a means for organizations to store their data in the cloud without physical interference with it. Furthermore, PCI DSS regards Amazon VPC as a qualified private network.


Throughout this article, I will discuss, in a few clear points, how to deal with Amazon VPC, particularly when data moves into or out of it, in order to meet the PCI DSS requirements for data encryption in transit.

  1. Understand the security of Amazon VPC

The first step of the solution is to fundamentally grasp the thorough isolation provided by Amazon VPC. Several restrictions apply to an Amazon VPC:

  • Hosts outside an Amazon VPC cannot communicate with Amazon Elastic Compute Cloud (EC2) instances inside it without an internet gateway or a virtual private gateway.
  • Mapping services, offered by Amazon Web Services (AWS) layer 2 networking features, prevent any guest from entering the Amazon VPC address zone.
  • Data inside an Amazon VPC is entirely isolated from all other VPCs through firm restrictions, so even within this private cloud, each VPC’s data is isolated from the others.
  • Security groups and network access control lists (NACLs) monitor and control inbound and outbound traffic into and out of an Amazon VPC. This supports the recommendations of a PCI Qualified Security Assessor (QSA) in a simple manner.
  2. Understand what PCI DSS says about encryption

    PCI DSS requirements clearly state that transmitted card details or other sensitive data must be encrypted while passing through open, public networks.

  • Through public networks, Wide Area Networks (WANs) connect an organization’s networks to the internet or to partner networks.
  • Public networks also allow inbound and outbound traffic, in this case through physical and referenced gateways on customer-premises equipment (CPE).
  • However, this is not the case for Amazon VPC, which is principally a software-defined cloud.
  • Such a private cloud can abstract the underlying physical hardware and still keep the data in the cloud isolated.
  • PCI DSS does not require encryption when transmitting data through a private network with specifications like those of Amazon VPC.
  • Therefore, it is not strictly necessary to encrypt the data when using Amazon VPC, thanks to all the aforementioned security options and procedures.
  3. Encrypt the data, but consider the overhead

  • Be aware that in order to encrypt data before transferring it into or out of the cloud, you will need Transport Layer Security (TLS) between the original host and the Amazon VPC.
  • Such end-to-end encryption comes with a performance penalty, because it slows down the processes in operation while transmitting data. Organizations that intend to apply this end-to-end encryption must balance this overhead.
  • Look at the following example, which illustrates the slowdown. Consider a standard web application designed with Elastic Load Balancing (ELB) to include up to five encryption/decryption points.
  • Now, consider adding two more points for the proposed end-to-end encryption, depending on the use of a web application firewall (WAF). This results in seven encryption/decryption points.
  • Whenever there is a new connection to an AWS service or any other application, extra encryption/decryption points are added. The overhead becomes more complicated, and performance suffers more and more.
  • Make sure to balance all of this overhead against the performance actually needed for the application using Amazon VPC.
  4. Increase the Amazon VPC isolation further

Despite the enormous isolation discussed above, there are ways to increase it and strengthen it further.

  • Security groups and NACLs should all be configured; this also supports the PCI DSS checks.
  • A PCI QSA or other PCI consultants should always be available inside the organization for emergencies and security precautions; any incident could occur, and they have to issue solutions on the spot.
  • There should be a minimal number of public subnets, comparable to the demilitarized zone (DMZ) in PCI DSS terms.
  • Configure Network Address Translation (NAT) in the public subnet for outbound traffic, while the rest of the hosts should each be located in private subnets.
  • Network traffic isolation of instances can be further enhanced by enabling source/destination checks on them.
  • Either the WAF layer or the front-end ELB layer should be the termination point of the TLS connections inside the public subnet. Private subnets should, on the other hand, communicate without TLS connections.
  5. Use code to encrypt sensitive data (optional)

This is not a mainstream approach, as it needs some creative coding. However, some organizations really do adopt this method to secure their most sensitive data.

  • Allow only specific application servers to access the encryption keys. These servers are the ones harnessed for the decryption process.
  • A stricter method encrypts the data with public keys before it passes through the web server. The private keys are known only to certain application servers, ensuring that the data stays encrypted through the entire transmission journey.
  • The advantage of this method lies fundamentally on the performance side: no extra encryption/decryption points are added.
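The public-key variant described above can be sketched with OpenSSL. This is a minimal illustration, not the method any particular organization uses; the key files, file names, and the sample PAN are all placeholders:

```shell
# Sketch: encrypt card data with a public key at the web tier, so that only
# application servers holding the private key can ever decrypt it.

# One-time setup: generate a keypair; the web tier receives only public.pem.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out private.pem 2>/dev/null
openssl pkey -in private.pem -pubout -out public.pem

# Web server: encrypt the sensitive field before it travels any further.
printf 'PAN=4111111111111111' > card.txt
openssl pkeyutl -encrypt -pubin -inkey public.pem -in card.txt -out card.enc

# Application server: the only place the private key exists.
openssl pkeyutl -decrypt -inkey private.pem -in card.enc
```

Because the web server never holds the private key, a compromise of the web tier exposes only ciphertext, which is the property the bullet list above relies on.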

In a nutshell, Amazon VPC is arguably one of the most isolated private cloud offerings from AWS. Not only does it provide a secure place for maintaining sensitive data, it also helps fulfill PCI DSS requirements. Adding encryption/decryption points can increase security, but the application's performance must be balanced against it. Alternative approaches, such as application-level encryption in code, can be used to achieve stronger data encryption.



How to Achieve PCI Compliance?

Throughout this article, I will highlight the main steps an organization can follow in order to attain PCI compliance.

  1. Understand the importance of the matter

Online shopping is a growing trend, judging by the buying and selling tendencies witnessed nowadays. Many buyers, though, remain in great doubt when dealing with e-commerce websites or online shops, because not all such sites are secure enough. Credit card details are always at risk of compromise in online transactions. What if a retailer simply abuses the credit card data or fails to secure it appropriately?

For the sake of minimizing this risk, there exists the Payment Card Industry Data Security Standard (PCI DSS). The PCI Security Standards Council (PCI SSC) designs and sets these standards, which, in a nutshell, aim at securing online transactions. The credit card companies enforce these standards themselves. Every retailer ought to comply with a long checklist before it accepts any means of online payment, ensuring that transactions operate in a completely safe environment. The checklist mainly strives to ensure that both the data and the networks are secure.

  1. Know what is meant by PCI Compliance

There are two types of credit card data that need security, based on their sensitivity and vitality. Relatively low-sensitivity data include the cardholder name, the card's expiry date, the service code, and the Primary Account Number (PAN). Highly sensitive data, on the other hand, include PIN blocks, the data on the magnetic stripe or equivalent chip, and CAV2/CVC2/CVV2/CID.
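To make the sensitivity split concrete, here is a minimal shell sketch of the common PCI DSS display rule of masking a PAN down to its last four digits. The function name and the sample card number are ours, purely for illustration:

```shell
# Hypothetical helper: mask a PAN so only the last four digits remain
# visible, as is standard practice when displaying cardholder data.
mask_pan() {
  pan=$1
  # Build one '*' per hidden digit, then append the final four digits.
  stars=$(printf '%*s' $(( ${#pan} - 4 )) '' | tr ' ' '*')
  last4=$(printf '%s' "$pan" | tail -c 4)
  printf '%s%s\n' "$stars" "$last4"
}

mask_pan 4111111111111111   # prints ************1111
```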

Several recent security incidents have occurred involving both small and large organizations that handle credit card data. While some incidents stem from poor security practices, others result from deliberate attacks:

  • Some large retailers have had their networks hacked, exposing millions of credit card details that were then compromised.
  • Some small organizations keep credit card details unencrypted on an old personal computer (PC), without paying the slightest attention to maintaining any level of security for such sensitive data.
  • Other, relatively large retailers store credit card details on an advanced server, yet neglect to encrypt the data.


  1. Determine PCI Compliance Levels

There is only one case where PCI compliance is not necessary: when Software as a Service (SaaS) is in use and the merchant never gets any access to credit card details, it does not have to comply with PCI standards. Otherwise, it does not matter how big or small an organization is, or whether a retailer runs an on-premise or self-hosted cloud commerce solution. Retailers who accept online payments must meet and comply with PCI standards.

Now, there are the PCI compliance levels, with level one being the strictest; as the level number increases, the strictness decreases, so level four is the least strict. Determining a level relies on the combined volume of online payment gateway and in-store point-of-sale transactions, counting both credit and debit cards. For example, even if a retailer's online transactions are comparatively low while its total transaction count is very high, the seller still has to conform to the highest applicable level of PCI standards.

In this regard, I can divide these levels into three essential categories:

  • More than one million transactions per year place an organization in level one or two
  • More than twenty thousand transactions make an organization follow level three
  • Fewer than twenty thousand transactions place an organization in level four
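The three buckets above can be expressed as a small shell function. The function name and the hard-coded thresholds simply mirror the list; real level assignment involves the card brands and the acquiring bank:

```shell
# Map annual transaction volume to the level buckets listed above.
pci_level() {
  if [ "$1" -gt 1000000 ]; then
    echo "level 1 or 2"
  elif [ "$1" -gt 20000 ]; then
    echo "level 3"
  else
    echo "level 4"
  fi
}

pci_level 2000000   # prints level 1 or 2
pci_level 50000     # prints level 3
pci_level 5000      # prints level 4
```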
  1. Avoid any non-compliance penalty

A very common dilemma that small and medium businesses (SMBs) encounter is that PCI compliance costs them money even though they sit at level three or four, while non-compliance can lead to a breach, which in turn results in a heavy penalty the following year. For instance, violating businesses are automatically treated as level one for a year, costing them a tremendous amount of money.

In other cases, these merchants are charged steep fines or forced to pay for costly judicial audits. Banks may even decide to cut off a non-compliant business or apply additional fines.

  1. Hire someone qualified for PCI compliance

This step is necessary if an organization needs to comply accurately with PCI standards. Not only will this provide real insight into the measures to take in order to achieve PCI compliance, it will also spare the company any fines or unneeded charges. Since most organizations follow level three or four, the role of a PCI consultant is of great importance. The consultant will assist the organization with three main security tasks:

  • Evaluate the existing security team and investigate to what extent credit card details are secure.
  • Issue methods to overcome any existing security issues related to credit card details. For instance, it is often advisable to get assistance from a qualified third-party company that stores such critical information instead of the organization itself.
  • Report all remediation work carried out in the organization's security structure to the concerned banks and card brands.
  1. Contact the merchant bank and know what documents need to be submitted according to the type of business

  2. Complete the Self-Assessment Questionnaire (SAQ)

The type of SAQ issued depends in the first place on the previous step. For organizations at level three or four, the hired PCI consultant can help complete the SAQ. Being a short survey of five to six pages, it tempts SMBs to fabricate the answers; they ought to answer the questions honestly in order to ensure PCI compliance is genuinely achieved, and likewise to avoid the hassle of the troublesome penalties discussed earlier.





Encrypting data at rest is essential for protecting sensitive data stored on disks, and likewise essential for regulatory compliance. The data cannot be read without a valid key. HIPAA and PCI DSS are compliance regulations that require encryption of data at rest throughout the data lifecycle.
Amazon Web Services supplies options for data-at-rest encryption and key management to sustain the encryption process. Amazon enables encryption of EBS volumes and lets you configure S3 buckets for server-side encryption (SSE) using AES-256. Amazon also supports Transparent Data Encryption (TDE).

Further, instance storage provides temporary block-level storage for Amazon EC2 instances. The storage resides on physical disks attached to the host computer. Most noteworthy, instance storage is meant for temporary storage of frequently changing data, such as caches, buffers, and scratch data. By default, however, this data is not encrypted.
This section shows the encryption method applied to Linux EC2 instances. The encryption is transparent and effectively protects confidential information: applications that use the data cannot detect the disk-level encryption.

File system and Disk Encryption

In summary, there are two encryption methods for instance stores. One is file-system-level encryption, where only files and directories are encrypted; it is portable across operating systems and operates above the file system.
Next is disk encryption, which encrypts a block of the disk or the entire disk using one or more encryption keys. This approach operates below the file system, hides file and directory metadata such as names and sizes, and is OS-agnostic.
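As a quick illustration of the file-level approach, a single scratch file can be encrypted and decrypted with OpenSSL. The passphrase and file names here are placeholders, and this sketch stands apart from the dm-crypt setup described later:

```shell
# Encrypt one file with a passphrase-derived AES-256 key.
printf 'scratch data' > scratch.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:ExamplePassphrase \
    -in scratch.txt -out scratch.enc

# The ciphertext on disk is unreadable without the passphrase;
# decrypting recovers the original bytes.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:ExamplePassphrase \
    -in scratch.enc
```

Note how this protects individual files only; disk encryption, by contrast, would also hide the file's name and size.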

Linux dm-crypt Infrastructure

The Linux kernel-level encryption method features dm-crypt, which allows users to mount an encrypted file system. Mounting a file system means attaching it to a mount point, a directory that makes it available to the OS. The file system is then available to applications with no added interaction required, and its files are encrypted when stored on disk.

Moving on, Device Mapper, found in the Linux 2.6 and 3.x kernels, is an infrastructure that provides a way to build block devices out of layers. The crypt target of this infrastructure performs transparent encryption using the kernel crypto API. Arguably, dm-crypt combined with a disk-backed file system, mapped through the Logical Volume Manager (LVM), is the solution of choice.
The diagram, for this purpose, shows the dm-crypt relationship with the application and the file system: it sits between the file system and the physical disk, and data written from the OS to the disk is encrypted.

Most of all, the application cannot detect the disk-level encryption because it is never made aware of it. When applications use a directory or mount point to retrieve files, the files are encrypted as they are stored to disk. This, for that reason, renders the files useless if the drive is stolen or lost.


Meanwhile, we create a new dm-crypt-encrypted file system named “secretfs.” It uses LVM and LUKS (Linux Unified Key Setup) for the encryption, and it resides on the EC2 instance's disk storage.

A diagram traces the newly encrypted file system located on the EC2 internal disk storage. Applications that need to save confidential data temporarily will use “secretfs” as the mount point (/mnt/secretfs) to save their scratch files.



Especially relevant, this solution requires three sets of actions for it to work. First, perform the setup in the EC2 launch configuration, because the file system is created at boot time. Full control over every step, such as creating the file system or accessing keys, should be granted and revoked by an administrator.
Next, log every decryption and encryption request using AWS CloudTrail. This is critical both when creating keys and when unlocking an encrypted file system.

Lastly, integrate the other AWS services into the solution. Four services are involved; this section describes each of them.
First, the AWS Key Management Service (KMS) enables the creation of encryption keys and controls the keys used to encrypt data. This service uses envelope encryption, which layers a master key on top of a data key. Most noteworthy, the master key can encrypt and decrypt up to 4 KB of data.
Second, AWS CloudTrail records and logs requests to and from KMS. This data is used for auditing later on, and it likewise helps monitor API calls for the account.
Third, Amazon S3 is the storage feature of AWS; it saves the password for the encrypted file system.
Fourth, AWS Identity and Access Management (IAM) enables control over secure access to AWS services. It allows access to the S3 bucket (reading the encrypted password) and to KMS (decrypting the password).
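The envelope-encryption pattern that KMS uses can be sketched locally with OpenSSL. Here a local RSA key merely stands in for the KMS master key, and every file name is illustrative; this is a teaching sketch, not a KMS replacement:

```shell
# Envelope encryption, local sketch: a random 256-bit data key encrypts the
# payload; a "master key" (a local RSA key standing in for KMS) wraps only
# the small data key.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out master.pem 2>/dev/null
openssl pkey -in master.pem -pubout -out master_pub.pem

printf 'cardholder record' > payload.txt
openssl rand -hex 32 > datakey.hex                       # plaintext data key
openssl enc -aes-256-cbc -pbkdf2 -pass file:datakey.hex \
    -in payload.txt -out payload.enc                     # bulk data under data key
openssl pkeyutl -encrypt -pubin -inkey master_pub.pem \
    -in datakey.hex -out datakey.enc                     # wrap the data key
rm datakey.hex                                           # keep only the wrapped copy

# To read the data back: unwrap the data key, then decrypt the payload.
openssl pkeyutl -decrypt -inkey master.pem -in datakey.enc > datakey.hex
openssl enc -d -aes-256-cbc -pbkdf2 -pass file:datakey.hex -in payload.enc
```

The point of the pattern is that the master key only ever touches the small data key, which is why KMS caps master-key operations at 4 KB while the data key can encrypt arbitrarily large payloads.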

To implement the solution:

1. Initially, create an S3 bucket. This bucket stores the file that contains the encrypted password, which is used to encrypt the chosen file system.
Sign into the S3 console and click Create Bucket. Type your chosen bucket name in the box and press Create. The right pane will then show the new bucket you created.

2. Next, configure an IAM policy to grant permission to the S3 bucket, that is, the bucket you created to store the encrypted password. To start, you need to create the IAM policy.


Subsequently, open the IAM console, select Roles, and select Create New Role. Type the Role Name and hit Next Step. Then click Next Step on the Establish Trust page. Under Attach Policy, choose the IAM policy you set up.

Certainly, launching EC2 instances requires the use of the newly created IAM role, which grants the permission for accessing the encrypted password in the S3 bucket. The newly created role should display on the Roles page.


3. At this point, use KMS to encrypt a password. Encrypting text with KMS requires the AWS CLI, which is installed by default on Amazon Linux EC2 instances and is also available for Windows, Mac, and Linux.
aws --region us-east-1 kms encrypt --key-id alias/EncFSForEC2InternalStorageKey --plaintext "ThisIs-a-SecretPassword" --query CiphertextBlob --output text | base64 --decode > LuksInternalStorageKey

aws s3 cp LuksInternalStorageKey s3:///LuksInternalStorageKey

Next, type these commands in the AWS CLI, replacing the --region value with your region name. Ensure you have the right permissions to create keys and to save to the S3 bucket. The file generated by the first command is then copied to the S3 bucket by the second. The key alias that identifies the key is EncFSForEC2InternalStorageKey.

4. Now, choose Encryption keys from the navigation pane of the IAM console. Select the key alias generated earlier to add a new role that can access the key. Then scroll down to the Key Policy section and choose Add.


Select the new role you created earlier. Next, click on Attach. This role is granted permission to use the key.

5. Finally, launch a new instance in the EC2 console. On the Configure Instance Details page, choose the IAM role created earlier.


You will see an Advanced Details section in the bottom pane. Choose As Text and paste the following script into User Data; it will execute at boot time of the EC2 instance.

#!/bin/bash

## Initial setup to execute on boot.

# Create an empty file. This file is used to host the file system.
# In this example we create a 2 GB file called secretfs (Secret File System).
dd of=secretfs bs=1G count=0 seek=2
# Lock down normal access to the file.
chmod 600 secretfs
# Associate a loopback device with the file.
losetup /dev/loop0 secretfs
# Copy the encrypted password file from S3. The password is used to configure LUKS later on.
aws s3 cp s3://an-internalstoragekeybucket/LuksInternalStorageKey .
# Decrypt the password from the file with KMS, and save the secret password in LuksClearTextKey.
LuksClearTextKey=$(aws --region us-east-1 kms decrypt --ciphertext-blob fileb://LuksInternalStorageKey --output text --query Plaintext | base64 --decode)

# Encrypt storage in the device. cryptsetup will use the Linux
# device mapper to create, in this case, /dev/mapper/secretfs.
# Initialize the volume and set an initial key.
echo "$LuksClearTextKey" | cryptsetup -y luksFormat /dev/loop0
# Open the partition, and create a mapping to /dev/mapper/secretfs.
echo "$LuksClearTextKey" | cryptsetup luksOpen /dev/loop0 secretfs
# Clear the LuksClearTextKey variable because we don't need it anymore.
unset LuksClearTextKey

# Check its status (optional).
cryptsetup status secretfs
# Zero out the new encrypted device.
dd if=/dev/zero of=/dev/mapper/secretfs
# Create a file system and verify its status.
mke2fs -j -O dir_index /dev/mapper/secretfs
# List file system configuration (optional).
tune2fs -l /dev/mapper/secretfs
# Mount the new file system to /mnt/secretfs.
mkdir /mnt/secretfs
mount /dev/mapper/secretfs /mnt/secretfs

Remember to enable CloudTrail; it will help you monitor and audit access to the KMS key. Launch the EC2 instance. At boot, the password file is copied from S3 and then decrypted by KMS. The script then configures the encrypted file system and mounts it at /mnt/secretfs.
Every file saved under the mount point is encrypted when stored on the disk. Applications handling sensitive data need to use the mount point in order to use the encrypted file system. The rest of the file system, outside the mount point, is not encrypted.


PCI and MFA – what it means to you



I just finished reading the PCI Guru blog post about multi-factor authentication (MFA) in the Payment Card Industry Data Security Standard (PCI DSS). PCI Guru Jeffrey Hall explains that requirement 8.3.1 doesn't go into effect until February 1st, 2018. However, it states that you should:

“Incorporate multi-factor authentication for all non-console access into the CDE for personnel with administrative access”

CDE stands for Cardholder Data Environment. Jeffrey states in his blog post that several organizations already have MFA implemented across the entire network and therefore believe that they are already compliant. Furthermore, Jeffrey cites a more particular snippet of requirement 8.3.1:

“If the CDE is segmented from the rest of the entity’s network, then an administrator needs to use multi-factor authentication when connecting to a CDE system from a non-CDE network. Multi-factor authentication is moreover implementable at the network level or at system/application level; therefore it does not have to be both. If the administrator uses MFA when logging into the CDE network, then, they do not also need to use MFA to log into a particular system or application within the CDE.”

Jeffrey further makes his point by saying:

“We need to remember what drove the development of requirement 8.3.1 was a lesson learned from the Target and similar breaches. In all of these breaches, system administrators were spear phished allowing the attackers to access the CDE in one way or another. Requirement 8.3.1 minimizes this threat by requiring MFA to gain access to the CDE. So even if an attacker obtains an administrator’s credentials or compromises an administrator’s system, that fact in and of itself would not compromise the CDE.

This is why the guidance for 8.3.1 puts the MFA border at the CDE. If you have MFA implemented in order to gain access to your network, how does that stop the threat of phishing? It does not. A spear phishing attack against such an MFA implementation defeats the MFA because it is already applied. The MFA in this scenario does not stop access to the CDE.”



OK, as much as I'm someone who heavily uses two-factor authentication and recommends it to customers, please don't think it's a silver bullet. Attackers have been bypassing 2FA/MFA solutions for quite some time now.
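For readers who want to see what one common second factor actually computes, here is a minimal HOTP/TOTP sketch per RFC 4226/6238. It uses the RFC's published test key; the helper name is ours, and this is an illustration, not a production authenticator:

```shell
# Minimal HOTP (RFC 4226) in bash; TOTP (RFC 6238) is HOTP over a time counter.
# Key below is the RFC 4226 test key "12345678901234567890", hex-encoded.
key_hex=3132333435363738393031323334353637383930

hotp() {
  local counter_hex hmac offset
  counter_hex=$(printf '%016X' "$1")                       # 8-byte counter
  hmac=$(printf '%s' "$counter_hex" | xxd -r -p \
        | openssl dgst -sha1 -mac HMAC -macopt "hexkey:$key_hex" \
        | awk '{print $NF}')
  offset=$((16#${hmac: -1}))                               # dynamic truncation
  printf '%06d\n' $(( (16#${hmac:offset*2:8} & 0x7fffffff) % 1000000 ))
}

hotp 1                         # RFC 4226 test vector: prints 287082
hotp $(( $(date +%s) / 30 ))   # the current 30-second TOTP value
```

The code changes every 30 seconds precisely because the counter is derived from the clock, which is why a phished static password alone is not enough against an MFA-protected login.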

Jeffrey, your recommendation isn't wrong; the logic of the explanation of why to use it for the CDE is. Yes, it is especially relevant to tell the customer to implement MFA for the cardholder data environment.

Rather, explain and insist that the customer use MFA to access the cardholder environment. We need not go knee-deep into explaining that if an attacker mounts a spear-phishing attack against an MFA implementation applied to the entire network, the phishing will somehow not affect an MFA implementation applied to the cardholder data environment.


Besides, the council has released some guidance on MFA, and you can look at it here:

I certainly liked this article from a technical perspective:

Similarly, I liked this article most from an Assessor/Security professional’s point of view:

Particularly, I love how Adam Gaydosh finishes the blog post:

“While the PCI Community meeting is always good for keeping up on the latest issues, we find that now more than ever, the PCI-DSS needs pragmatism. The debates over MFA are interesting from an academic perspective but offered little practical insight, other than the fact that folks are quick to argue a position without understanding the details. MFA is still one of the best ways to shrink an attack surface area and increase security.

This debate also shows how PCI, like any complex standard, quickly devolves into nitpicking debates over minutiae. This is particularly why hands-on technology experience is such an important skill for any PCI assessor. Finally, your auditor must have the ability to translate the intent of the standard into the operational realities of your environment.”