Tokenization
Tokenization is the process of removing sensitive data from your business systems by replacing it with an undecipherable token and storing the original data in a secure, access-restricted location (often called a token vault). As a result, the sensitive data (which may also be personal data under the GDPR) no longer exists in the various systems, applications and networks across your organization, so those systems no longer need to protect it. Tokens themselves do not need protection; only the original data to which each token corresponds must be protected. This is what distinguishes tokenization from encryption: encrypted data is still stored and accessed within applications and systems, and therefore continues to require protection.
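To make the idea concrete, here is a minimal sketch (not a production design) of tokenize and detokenize operations against an in-memory vault. The class name, storage structure and token length are illustrative assumptions; a real vault would be a hardened, access-controlled service, typically backed by an HSM or a locked-down database.

```python
import secrets

class TokenVault:
    """Illustrative only: a real vault is a separate, access-restricted service."""

    def __init__(self):
        self._store = {}    # token -> original value (the only copy that needs protection)
        self._reverse = {}  # original value -> token (gives multi-use token behaviour)

    def tokenize(self, sensitive_value: str) -> str:
        # Return the existing token for a repeated value (multi-use behaviour).
        if sensitive_value in self._reverse:
            return self._reverse[sensitive_value]
        # The token is random, so it is mathematically unrelated to the original data.
        token = secrets.token_urlsafe(16)
        self._store[token] = sensitive_value
        self._reverse[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the original value.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
print(token)                    # opaque value, safe to store in business systems
print(vault.detokenize(token))  # original value, available only via the vault
```

Because the token is drawn at random rather than derived from the input, possessing the token reveals nothing about the original value without access to the vault.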
Tokenization is widely used to meet PCI DSS and GDPR requirements. Among other benefits, it provides:
- PCI DSS scope reduction
- GDPR compliance
- Risk reduction, because the sensitive data itself is removed from business systems
- The ability to work with de-identified data in business systems
- Support for multiple data sets
- Protection for data in transit and at rest
- No cryptographic key management
- Tokens that are mathematically unrelated to the original data
Tokenization solutions also typically offer a choice of token types and formats:
- Multi-use tokens
- Single-use tokens
- Format-preserving tokens (illustrated in the sketch after this list)
- Custom token formats
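As one example of a format-preserving token, the sketch below replaces a card number with random digits while keeping its length, separators and last four digits. The function name and the choice to retain the last four digits are illustrative assumptions; a production scheme would also guard against the generated token colliding with, or validating as, a real card number.

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Return a random token with the same length, digit layout and
    last four digits as the original card number, so downstream systems
    that validate or display the field keep working unchanged."""
    digits = [c for c in pan if c.isdigit()]
    keep = digits[-4:]                                       # last four digits retained for display
    random_part = [str(secrets.randbelow(10)) for _ in digits[:-4]]
    tokenized = random_part + keep
    # Re-insert any separators (spaces, dashes) from the original format.
    out, i = [], 0
    for c in pan:
        if c.isdigit():
            out.append(tokenized[i])
            i += 1
        else:
            out.append(c)
    return "".join(out)

print(format_preserving_token("4111 1111 1111 1234"))       # e.g. "9023 5561 0487 1234"
```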