PCI Scope Reduction by Using Tokenization

Tokenization techniques are rapidly evolving to support PCI scope reduction efforts and to secure cardholder data from breaches.

PCI scope reduction is integral to simplifying PCI compliance and reducing overall risk in the environment. Scope reduction minimizes the attack surface and limits the number of systems that must be assessed against the PCI standards. Regardless of how payments are made, any business that accepts credit cards will always face the issue of PCI scope and of maintaining PCI compliance.

Scope Reduction Potential of Tokenization

Tokenization solutions do not eliminate the need to maintain and validate PCI DSS compliance, but they may simplify a merchant’s validation efforts by reducing the number of system components for which PCI DSS requirements apply. – PCI Security Standards Council (PCI SSC)

Replacing sensitive cardholder data, such as the primary account number (PAN), with tokens in business systems eliminates the possibility of a hacker gaining access to database tables that contain both customer information and the corresponding PANs. With tokenization in place, a breach exposes only customer information and meaningless tokens. In common tokenization implementations, PANs are stored securely in a vault system that contains the encrypted credit card information. The only meaningful attacks, therefore, are to breach the tokenizing system and look up the PAN for each token, or to induce the system into responding to a query for a PAN, and both are extremely difficult for a hacker to achieve. If the tokenization system is properly segmented, then the tokenization system and the systems that “connect” to it are the only systems in scope, which can make PCI compliance a much easier task. This is why tokenization makes PCI scope reduction possible.

How Tokenization Works

When the merchant receives cardholder data that needs to be stored, the system initially processing the data calls a tokenization system, supplying the cardholder data as a parameter. The tokenization system may be local or remote to the merchant’s systems. A pseudo call might look like this:

Token = CallFunctionGiveMeaToken(PANnumber);

The tokenization system generates a token from the PANnumber parameter passed to CallFunctionGiveMeaToken and returns it to the calling system, which uses the token from that point on. The tokenization system takes the sensitive data and either stores it, associating the stored record with the token (called “vault” storage), or uses an encrypted value of the sensitive data as the token itself.

To retrieve cardholder data, a system queries the tokenization system by providing the token as a parameter. The tokenization system then returns the sensitive data: either the data is retrieved from the tokenization system’s secured storage, or the tokenization system decrypts the token, derives the data, and returns it to the calling system.
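
To make the flow concrete, here is a minimal vault-style sketch in Python. Everything in it is illustrative: the class and method names are assumptions, the key would live in an HSM or key-management service in practice, and a real vault would authenticate and authorize every caller. The tokenize method plays the role of CallFunctionGiveMeaToken above; an encryption-based system would instead derive the token by encrypting the PAN directly rather than storing anything.

import secrets
from cryptography.fernet import Fernet

class TokenVault:
    """A toy vault: maps random tokens to encrypted PANs."""

    def __init__(self):
        # Illustrative only; production keys belong in an HSM/KMS.
        self._cipher = Fernet(Fernet.generate_key())
        self._store = {}  # token -> encrypted PAN ("vault" storage)

    def tokenize(self, pan: str) -> str:
        # The token is random, so it carries no information about the PAN.
        token = secrets.token_urlsafe(16)
        self._store[token] = self._cipher.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        # A real system authenticates and authorizes the caller first.
        return self._cipher.decrypt(self._store[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # store the PAN, get a token back
print(token)                                # safe to keep in business systems
print(vault.detokenize(token))              # only the vault recovers the PAN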

Tokenization systems are complex, and secure operation is key to success in tokenization. Clearly, any system that queries for the sensitive data must be authenticated and authorized. Additionally, the tokenization system, its communications channels, the querying systems, the encryption, the key rotation, and the data store must all be secure. Many security factors must be properly addressed beyond just the application logic, which is why most tokenization systems are produced and managed by third parties that specialize in creating these systems.

Tokenization Concepts

  • Tokenization – replaces cardholder data records with a “token” that refers to the sensitive data. The sensitive data is then either stored encrypted in a separate, secured location called a “vault,” or the token itself is an encrypted representation of the data and is stored by the systems that use tokens.
  • Vault-Based Tokenization – stores encrypted data that can be indexed by the corresponding tokens.
  • Encryption-Based Tokenization – mathematically reversible tokenization that provides a secured function to encrypt the PAN and return a token, or to accept a token and return the PAN data. No sensitive or encrypted data is stored; the tokenization system is an “encryption engine.”
  • Tokenization Reduces Cardholder Data Exposure:
    • Tokens can be used in most areas of the environment, predominantly in lieu of cardholder PAN data.
    • All other systems use a token in lieu of the sensitive cardholder data. This greatly reduces the attack surface area.
    • Cardholder data may still be gathered and processed when entered by the customer, but a token is used thereafter as an alias for the cardholder data until the actual sensitive data is needed again.
    • Cardholder data may be retrieved from the tokenization system for business operations.
    • If a vault is used, it holds the most sensitive data and thereby becomes the primary attack target. Because vaults must be highly secure and very carefully designed, they are often outsourced to teams dedicated to implementing this technology.
    • The communications with the vault must be secure, authenticated and authorized.
    • The risk surface areas are reduced greatly, but still exist.
    • The PCI scope includes all systems that may ever process a card (input, storage, retrieval, etc.) plus any systems that can “connect” into the cardholder data environment, which makes network segmentation a key component of tokenization. Segmentation remains the key to success in reducing PCI scope.
    • Applications will need to be adjusted to handle tokens in lieu of cardholder data.
    • Cardholder data retrieval processes and procedures will need to be defined and implemented.
    • Rekeying tokens (re-encrypting the underlying data under new keys) must be supported to be PCI compliant. This can be very difficult technically.
    • If using multiple “vault” architectures, duplicate-token issues must be considered (two tokenizing systems could issue the same token for different data, which then causes problems).
    • Switching tokenization systems can be extremely time consuming and costly. Once a tokenization system is chosen it is very difficult to switch it out.
    • A “single-use token” represents PAN data in a single transaction. A multi-use token represents the same PAN across multiple transactions.
    • Per PCI DSS, sensitive authentication data (CVV, track, and PIN data) cannot be stored after authorization, so it cannot be tokenized and remain compliant.
    • Tokens cannot look like PANs. They can be the same length but are not allowed to pass a Luhn check (see the sketch after this list).
    • A tokenization system must have a method for deleting old data (tokens and PANs) in line with PCI data retention requirements.
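
To illustrate the Luhn point above, the sketch below (a hypothetical generator, not a mandated algorithm) produces a PAN-length numeric token and, if necessary, nudges its final digit so the token is guaranteed to fail the Luhn check:

import secrets

def luhn_ok(number: str) -> bool:
    # Standard Luhn check: double every second digit from the right.
    total = 0
    for i, ch in enumerate(reversed(number)):
        digit = int(ch)
        if i % 2 == 1:
            digit *= 2
            if digit > 9:
                digit -= 9
        total += digit
    return total % 10 == 0

def numeric_token(length: int = 16) -> str:
    digits = [secrets.randbelow(10) for _ in range(length)]
    if luhn_ok("".join(map(str, digits))):
        # Bump the last digit so the token can never pass as a valid PAN.
        digits[-1] = (digits[-1] + 1) % 10
    return "".join(map(str, digits))

print(luhn_ok("4111111111111111"))  # True: a well-known test PAN
print(luhn_ok(numeric_token()))     # False by construction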

Tokenization Implementation Types

  1. Local Merchant Tokenization – cardholder data is stored by the merchant, not on the tokenization service provider’s systems.
  • A system can retrieve PAN data by submitting the token to the local tokenization API through a “request” integrated with the merchant’s applications.
  • API calls must be authenticated and authorized properly and performed over a secured internal communications channel.
  • Local tokenization provides more merchant control over the processes and cardholder data storage.
  • Communication outside the organization is not required, thereby reducing risk and exposure.
  • The ideal architecture is to have one tokenization system, which limits the risk areas and simplifies secured-information synchronization.
  • The merchant’s development processes are considered in scope.
  • The merchant can change the tokenization systems at will.
  2. Tokenization Service Provider (TSP) – utilizes an external service provider to store and process the PAN.
  • Upon submittal of cardholder data, the external tokenizing entity returns a “token” (often called a “Reference ID”) to the business entity, which needs to be able to reference the cardholder data but does not want to store it. A rough sketch of this exchange follows this list.
  • The merchant or business entity uses the token in their applications.
  • For later business operations, the business entity can optionally request the cardholder data by submitting the token in a request to the remote tokenizing service provider.
  • Communication outside the organization to the TSP is required and must be secure.
  • Authentication and authorization must be secure, which adds complexity.
  • The merchant has to monitor and adapt to the third-party tokenization system’s changes and updates.
  • The TSP must provide good implementation and management documentation.
  • The merchant is locked into the third party because it is very difficult to change tokenization systems once the merchant’s systems are replete with tokens in lieu of cardholder data.
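
To ground the request/response pattern described in both implementation types, the sketch below submits a PAN and stores only the returned Reference ID. The endpoint, authentication scheme, and JSON field names are assumptions for illustration, not any particular provider’s API; a local deployment would make the same style of call to an internal host, typically over mutually authenticated TLS.

import requests

TSP_URL = "https://api.tsp.example/v1"  # hypothetical provider endpoint
API_KEY = "provisioned-by-the-tsp"      # placeholder credential

def tokenize(pan: str) -> str:
    # Submit the PAN once; store only the returned Reference ID.
    resp = requests.post(
        f"{TSP_URL}/tokens",
        json={"pan": pan},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["reference_id"]

def detokenize(reference_id: str) -> str:
    # Retrieve the PAN only when a business operation truly needs it.
    resp = requests.get(
        f"{TSP_URL}/tokens/{reference_id}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["pan"]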

Tokenization Conclusions

Reducing PCI scope by shrinking the cardholder data attack surface through tokenization is recommended. Tokenization must still be used in tandem with proper network segmentation to yield significant PCI efficiencies and a more secure network. Although tokenization systems are complex, they usually have an exceptional return on investment because they reduce PCI compliance effort while minimizing loss in the event of a breach. Finally, it sure would be nice to only have to worry about a few systems at night as opposed to an entire data center.

Written by: John Elliott
John has been a security consultant and security software programmer since graduating with honors from the University of Colorado. Since joining DirectDefense, John has provided world-class security consulting services to DirectDefense clients while lending his development skills to our application assessment services.

Prior to joining DirectDefense, John worked for Accuvant and Trustwave LABS, where he focused his efforts on application security by performing comprehensive PA-DSS application assessments, code reviews, and PCI DSS assessments. Before that, John worked for Hewlett Packard, PentaSafe (acquired by NetIQ), and Peregrine Systems, designing and developing security tools.
