(1) Replacing sensitive data with a placeholder (the token) to reduce fraud. For example, in EMV-based credit and debit card systems, tokenization creates a unique number for each transaction (the token), which is used in place of the customer's primary account number (PAN). See
EMV and
token.
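A minimal Python sketch of the payment-card sense: the PAN is swapped for a random, opaque token, and only a token vault can map back. The in-memory dictionary and function names here are illustrative assumptions; real systems use a hardened, access-controlled vault.

```python
import secrets

# Illustrative token vault: maps token -> PAN. A real vault is a
# secured service, not an in-memory dict.
_vault: dict[str, str] = {}

def tokenize(pan: str) -> str:
    """Replace a primary account number (PAN) with a random token."""
    token = secrets.token_hex(8)  # opaque value, mathematically unrelated to the PAN
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Recover the PAN; only the vault can perform this mapping."""
    return _vault[token]

token = tokenize("4111111111111111")
assert token != "4111111111111111"            # the token reveals nothing
assert detokenize(token) == "4111111111111111"
```

Because the token has no mathematical relationship to the PAN, intercepting it yields nothing usable without access to the vault.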
(2) In AI training and inference, tokenization splits text into units (tokens) that are mapped to numeric IDs the model can process. See
AI token and
AI training vs. inference.
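The AI sense can be sketched with a toy word-level tokenizer; the vocabulary here is a made-up example, and production systems typically use subword schemes such as byte-pair encoding rather than whole words.

```python
# Toy vocabulary mapping words to numeric IDs (illustrative only).
vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}

def tokenize(text: str) -> list[int]:
    """Map each word to its ID; unknown words fall back to <unk>."""
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

print(tokenize("The cat sat"))  # [0, 1, 2]
```

The model never sees raw text, only these numeric IDs, which are then turned into vectors for training and inference.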
(3) Representing a physical or digital item as a token on a blockchain. See
NFT.