Tokenization

A relatively new solution for removing sensitive data from business processes, applications, and user interaction is tokenization. This method is commonly presented as a way to reduce PCI DSS scope and the business risk associated with storing credit card numbers. Tokenization is the process of generating a surrogate representation of data, called a token, and substituting the token wherever the original data would be used. A database maps the original data to the token value, so the original can be retrieved when needed and each token remains tied to a real value.
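To make the mapping concrete, here is a minimal sketch of such a token store, assuming an in-memory dictionary stands in for the real mapping database; the class name TokenVault, the token length, and the method names are illustrative assumptions, not a specific product's design.

```python
import secrets

# Minimal sketch of a token vault (hypothetical; a real deployment would
# use a hardened, access-controlled database, not an in-memory dict).
class TokenVault:
    def __init__(self):
        self._token_to_value = {}  # token -> original data
        self._value_to_token = {}  # original data -> token

    def tokenize(self, value: str) -> str:
        """Return the existing token for a value, or generate a new one."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it carries no mathematical
        # relationship to the original data it stands in for.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Look up the original value; only the vault can perform this."""
        return self._token_to_value[token]
```

Because the token is generated randomly rather than derived from the data, a system that holds only tokens has nothing sensitive to lose; recovering the original value requires access to the vault itself.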

An example is credit card numbers entered at the point of sale and then sent on for authorization. Once authorization occurs there are ...
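Continuing the hypothetical TokenVault sketch above, the point-of-sale flow might look like the following; the authorization step is only indicated as a comment, since its details are not specified here.

```python
vault = TokenVault()

card_number = "4111111111111111"  # a common test card number

# The real card number is used once for authorization...
# authorize(card_number)  # hypothetical call to the payment processor

# ...and afterward only the token is stored and passed to
# downstream systems, keeping them out of PCI DSS scope.
token = vault.tokenize(card_number)
print(token)                    # e.g. 'f3a91c0d2b7e4a55'
print(vault.detokenize(token))  # only the vault recovers the original
```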
