The Limits of Payment Tokenization

Nov 19, 2014 | Payments

Kill Bill is an open-source subscription billing and payments platform. It is fully extensible: you can build your business logic on top of it for a customized billing and payments solution.

Since the introduction of ApplePay, payment tokenization has become a very hot topic. Merchants have in fact used payment tokenization for a long time (either with in-house solutions or by relying on third-party systems such as payment gateways) in order to comply with PCI DSS rules.

However, ApplePay (which is based on the EMVCo specification) uses a different form of tokenization, where tokens are created very early in the payment chain and merchants only ever see the tokens. This new form of tokenization has some theoretical advantages and some practical advantages, but also some serious limitations:


The advantages are:

  • With the rise of mobile payments, tokens can be used instead of real PANs, so in the event that the device gets compromised, those tokens can be deactivated and replaced right away, leading to a much easier consumer experience.

  • The EMVCo specification states that tokens can be restricted to work only for a specific merchant, decreasing their intrinsic value when they get compromised; however, note that this is already not the case for ApplePay’s tokens, which are global across merchants since Apple plays the role of a Token Requester.
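To make the merchant-restriction idea concrete, here is a minimal sketch of how a token vault might enforce it. The names (`TokenRecord`, `validate_token`, the merchant identifiers) are illustrative assumptions, not part of the EMVCo specification:

```python
# Sketch: merchant-restricted vs. global tokens, assuming a hypothetical
# token vault that records which merchant (if any) a token was issued for.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TokenRecord:
    token: str
    merchant_id: Optional[str]  # None models a global token (ApplePay today)

def validate_token(record: TokenRecord, requesting_merchant: str) -> bool:
    """A merchant-restricted token only works at the merchant it was issued
    for; a global token is accepted anywhere, so a stolen one is worth more."""
    if record.merchant_id is None:
        return True  # global token: usable at any merchant
    return record.merchant_id == requesting_merchant

restricted = TokenRecord(token="tok_123", merchant_id="acme-store")
print(validate_token(restricted, "acme-store"))   # True
print(validate_token(restricted, "other-store"))  # False
```

The point of the restriction is visible in the last line: a compromised restricted token is useless outside its home merchant, whereas a global token carries its full value wherever it leaks.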

The limitations are:

  • In theory, in a world where a merchant only sees tokens, there is no (or less) need to build costly PCI compliance infrastructure, and the risk of a breach is also reduced. In practice, since merchants will probably have to deal with both tokens and credit card numbers, this is just a theoretical advantage: they need to continue to implement their own vault or delegate to third-party providers (payment gateways, …).
  • Those tokens are what we call ‘high value tokens’: since they can be used to perform transactions, they can be monetized by fraud rings. This is especially true when tokens are not associated with a specific merchant. The PCI token specification clearly states: “Additionally, tokens that can be used to initiate a transaction might be in scope for PCI DSS, even if they cannot directly be used to retrieve PAN or other cardholder data.”

  • Cryptograms. Along with the token, each transaction may be associated with a cryptogram, which should be a ‘transaction-unique value’. So in theory, a breach of tokens would not have much value unless the attacker was also able to create additional cryptograms for future transactions. The interesting part is that most processors allow ‘recurring’ transactions, and in those cases they only require the cryptogram to be associated with the initial transaction (see page 10, for example). This mechanism, needed to reduce friction and/or to allow merchants to do recurring billing, breaks the security feature associated with the cryptograms.

  • Orthogonal to tokenization comes encryption. Unless there is strong encryption right at the start, there is always the possibility of a breach before tokenization happens; fraudsters have already compromised POS systems to grab the card data before tokenization took place.
  • Then comes the question of the liability shift, which seems backward: “If a contact chip card is presented to a merchant that has not adopted, at minimum, contact chip terminals, then liability for counterfeit fraud may shift from the card issuer to the merchant’s acquirer.” So the merchant is penalized if he does not spend the extra money to upgrade his system, but what are the benefits if he does?
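The cryptogram weakness described above can be sketched as a toy authorization rule. This assumes, for illustration only, a processor that checks the cryptogram solely on the first transaction of a recurring series; the function and parameter names are hypothetical:

```python
# Sketch of the cryptogram rule for recurring transactions, under the
# assumption (per the limitation above) that processors only require a
# cryptogram on the initial transaction of a recurring series.
from typing import Optional

def authorize(token: str, cryptogram: Optional[str],
              is_recurring: bool, is_first_in_series: bool) -> bool:
    """One-time payments need a transaction-unique cryptogram. For recurring
    billing, only the initial transaction is checked -- which is exactly the
    weakness: a stolen token alone can replay later installments."""
    if not is_recurring or is_first_in_series:
        return cryptogram is not None  # fresh cryptogram required
    return True  # follow-up recurring charge: token alone suffices

print(authorize("tok_123", "crypt_abc", is_recurring=True, is_first_in_series=True))   # True
print(authorize("tok_123", None, is_recurring=True, is_first_in_series=False))         # True (the weakness)
print(authorize("tok_123", None, is_recurring=False, is_first_in_series=True))         # False
```

The middle case is the problem: once a recurring agreement exists, the token alone is enough, so the ‘transaction-unique value’ guarantee no longer protects those charges.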

In addition to those limitations, there is a big question about who (which player in the payment chain) will end up owning the precious consumer credit card data. The card networks (behind the EMVCo specification), along with TCH (which also has a proposal for tokenization), are pushing hard and waving the security and simplicity cards associated with that effort. But is that really what it is about?
