Natural language processing is rooted in the “token” (道元, literally “Dao-origin”), an ideographic unit that takes the place of consciousness in representing all things and embodies both abstract and concrete meanings. Tokens are central to AI algorithms, enabling the processing of text, images, video, and audio. Their generation and inference mirror Laozi’s principle: “The Tao generates One, One generates Two, Two generates Three, and Three generates all things.” Large language models apply l’écart (间距, the gap or spacing between things) to deconstruct tokens into bits, learning from vast datasets to generate new tokens with evolving meanings. By incorporating probabilistic recombination, AI attains an unpredictability and uniqueness akin to human behavior. Drawing on Laozi’s philosophy and the diversity of Chinese characters, AI can develop a more holistic and innovative framework.
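The “probabilistic recombination” described above corresponds, at the implementation level, to sampling each next token from a learned probability distribution rather than always taking the single most likely one. The minimal sketch below (plain Python; the toy vocabulary, scores, and temperature values are illustrative assumptions, not drawn from the source) shows temperature-scaled sampling: at low temperature the choice is nearly deterministic, while at higher temperatures the same context yields varied, less predictable continuations.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Sample one token index from raw scores (logits) using a
    temperature-scaled softmax: higher temperature flattens the
    distribution and makes the choice less predictable."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Probabilistic recombination: the same context can produce
    # different continuations on different runs.
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

# Toy vocabulary and next-token scores for a fixed context (illustrative only).
vocab = ["道", "one", "two", "three", "things"]
logits = [2.1, 1.3, 0.7, 0.2, -0.5]

for t in (0.2, 1.0, 2.0):
    picks = [vocab[sample_next_token(logits, temperature=t)] for _ in range(5)]
    print(f"temperature={t}: {picks}")
```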