Tokenization

Tokenization is the process by which an AI model splits raw text into discrete units, called tokens, that the model can process. Depending on the tokenizer, a token may be a whole word, a subword fragment, or a single character; for example, the word "tokenization" might be split into the subwords "token" and "ization".
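As a minimal sketch of the idea (production models typically use subword algorithms such as byte-pair encoding rather than this simple scheme), the hypothetical Python example below splits text into word and punctuation tokens, then maps each unique token to an integer ID, which is the form a model actually consumes:

```python
import re

def tokenize(text: str) -> list[str]:
    # Split into runs of word characters or single punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text)

def build_vocab(tokens: list[str]) -> dict[str, int]:
    # Assign each unique token a stable integer ID in order of first appearance.
    vocab: dict[str, int] = {}
    for token in tokens:
        vocab.setdefault(token, len(vocab))
    return vocab

text = "Tokenization splits text into processable units."
tokens = tokenize(text)
vocab = build_vocab(tokens)
ids = [vocab[t] for t in tokens]

print(tokens)  # ['Tokenization', 'splits', 'text', 'into', 'processable', 'units', '.']
print(ids)     # [0, 1, 2, 3, 4, 5, 6]
```

The key takeaway is the two-step shape shared by real tokenizers: text is first segmented into tokens, and each token is then looked up in a fixed vocabulary to produce the integer IDs the model operates on.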