Tokenization is the process of removing sensitive data from your business systems by replacing it with an undecipherable token and storing the original data in a secure location with restricted access. As a result, sensitive data (which may include personal data covered by the GDPR) no longer exists within the systems, applications and networks in your organization, so you no longer need to protect it there. Tokens do not need protection; only the original data to which each token corresponds does. Tokenization differs from encryption: when sensitive data is encrypted, it is still stored and accessed within applications and systems, and therefore continues to require protection.
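A minimal sketch of the idea, assuming an in-memory vault and randomly generated tokens (a real deployment uses a hardened, separately secured vault service with strict access controls):

```python
import secrets

# Hypothetical in-memory "vault": maps tokens to the original sensitive values.
# In practice this lives in a separately secured, access-controlled data store.
_vault = {}

def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with an opaque, random token."""
    token = secrets.token_urlsafe(16)   # random, mathematically unrelated to the input
    _vault[token] = sensitive_value     # the original data only lives in the vault
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only privileged code paths should call this."""
    return _vault[token]

card = "4111111111111111"
token = tokenize(card)
print(token)              # opaque value, safe to store in business systems
print(detokenize(token))  # '4111111111111111', requires vault access
```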
Tokenization is widely used to help meet PCI DSS and GDPR requirements. Among other things, it offers:
- PCI scope reduction
- GDPR compliance
- Risk reduction – sensitive data removed
- Facilitates use of de-identified data in business systems
- Support for multiple data sets
- Protection for data in transit and at rest
- No key management
- Mathematically unrelated to original data
- Multi-use tokens
- Single-use tokens
- Format preserving tokens (see the sketch after this list)
- Custom token formats
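As a hedged illustration of a format-preserving token, the sketch below replaces all but the last four digits of a card number with random digits, so the result still looks like a PAN to downstream systems while carrying no relationship to the original. The function name and the choice to keep the last four digits are illustrative assumptions, not a standard; commercial tokenization services offer their own (often configurable) formats.

```python
import secrets

def format_preserving_token(pan: str, keep_last: int = 4) -> str:
    """Generate a token shaped like a card number: same length, digits only,
    with the last few digits preserved for display and reconciliation."""
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(len(pan) - keep_last))
    return random_part + pan[-keep_last:]

print(format_preserving_token("4111111111111111"))  # e.g. '7302915586431111'
```

A multi-use scheme would additionally keep a deterministic mapping in the vault so the same input always yields the same token, whereas a single-use scheme issues a fresh token for each transaction.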
FAQs
What is the difference between on-premises and cloud-based tokenization?
On-premises tokenization gives you more control over how tokenization is applied, but it provides limited scope reduction and carries higher risk and cost because sensitive data still resides in your environment. With cloud-based tokenization you benefit from a reduction in scope and reduced risk, since sensitive data is removed from your environment. It provides platform-focused security and lowers the costs that would otherwise be required to maintain the system, such as cyber insurance, PCI audits and technical maintenance.
Why do I need tokenization if I use encryption throughout?
Encryption is a very useful method for securing data in transit and at rest. However, the data still persists on the systems, applications and networks that use it, so those systems cannot be ignored or considered out of scope. Tokenization removes the data and replaces it with a meaningless token; systems, networks and applications consume the token, not the data.
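To make the contrast concrete, the sketch below (assuming the third-party `cryptography` package for the encryption side) shows that an encrypted value still travels through your systems along with the need to manage its key, whereas a tokenized value is only an opaque reference that can be resolved solely through the vault.

```python
from cryptography.fernet import Fernet  # pip install cryptography

pan = b"4111111111111111"

# Encryption: the ciphertext, and ultimately the key, still live in your
# environment, so the systems handling them remain in scope for protection.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(pan)
print(Fernet(key).decrypt(ciphertext))  # anyone holding the key recovers the PAN

# Tokenization: the application only ever holds an opaque token; recovering the
# PAN requires a call to the separately secured vault, not a locally held key.
```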