We've talked a lot about how tokenization can reduce PCI scope. As discussions in the industry focus on the future of data security and PCI compliance, I want to revisit this topic, using some thoughts from Rob McMillon at RSA as a foundation.
In a post about this topic in May, we defined tokenization as a technology that replaces sensitive cardholder data with a randomized token that stands in for it. We also talked about how tokenization eliminates a merchant's storage of actual cardholder data. From a merchant's perspective, if the cardholder data is never stored, it's far less likely to be stolen. In addition, a large portion of a merchant's computer systems is removed from the scope of a PCI DSS compliance audit, since those systems no longer process or store cardholder data.
Rob talks about a couple of things that are very interesting. The one I would like to focus on is that using tokens reduces the size of the cardholder data environment. The key point in the definition of a token is that there is no direct relationship between a credit card number and the associated token: the token is not derived from the card number, so it cannot be reversed back into one.
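To make that "no direct relationship" point concrete, here is a minimal sketch of the idea in Python. The function names and the in-memory vault are purely illustrative assumptions, not any particular provider's API; in practice the vault lives with the tokenization provider, outside the merchant's systems.

```python
import secrets

# Hypothetical token vault. In a real deployment this mapping is held by
# the tokenization provider, not by the merchant.
_vault = {}

def tokenize(pan: str) -> str:
    """Return a token for a card number (PAN).

    The token is generated randomly, so nothing about it is derived from
    the PAN. The only link between the two is the lookup table in the vault.
    """
    token = secrets.token_hex(8)   # random value, unrelated to the card number
    _vault[token] = pan            # the mapping exists only inside the vault
    return token

def detokenize(token: str) -> str:
    """Recover the original PAN -- something only the vault holder can do."""
    return _vault[token]

# The merchant stores and passes around only the token.
token = tokenize("4111111111111111")
print(token)   # e.g. 'a3f19c...' -- reveals nothing about the underlying card
```

Because the merchant's systems hold only these random values, an attacker who compromises them gets nothing that can be turned back into a card number without also breaching the provider's vault.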
According to Verisign, the leading reason companies fail their PCI assessment is failure to protect cardholder data. Every computer system and every application that uses or stores sensitive card data is part of the overall cardholder data environment (CDE) and in scope for PCI. Add tokenization into the equation and you shrink that environment, because any system that handles tokens in lieu of credit card data is no longer part of the merchant's CDE. This is very powerful for merchants: reducing PCI scope also reduces the cost of PCI compliance. We'll certainly continue to hear more on this topic, and I look forward to more and more merchants benefiting from tokenization.