Upload files or paste text to count tokens and visualize context window usage. By Damian

Tokens: 0
Characters: 0
Words: 0
File Size: 0 B
Token Usage (max 200,000): 0%
Tokens per character: 0
Tokens per word: 0
Note: Token counts are computed with the cl100k_base tokenizer (used by GPT-4). Prices are estimates and may change.
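The counters above can be sketched in a few lines. This is a minimal approximation, not the tool's actual implementation: instead of running the cl100k_base tokenizer, it estimates tokens with the common rule of thumb of roughly 4 characters per token, and the function name and field names are illustrative.

```python
MAX_TOKENS = 200_000  # context window size shown by the tool


def text_stats(text: str) -> dict:
    """Approximate the stats the tool displays for a piece of text.

    Token count is a rough heuristic (~4 characters per token);
    the real tool tokenizes with cl100k_base instead.
    """
    chars = len(text)
    words = len(text.split())
    size_bytes = len(text.encode("utf-8"))  # "File Size" in bytes
    # Heuristic token estimate; never report 0 tokens for nonempty text.
    tokens = max(1, round(chars / 4)) if chars else 0
    return {
        "tokens": tokens,
        "characters": chars,
        "words": words,
        "file_size_bytes": size_bytes,
        "usage_pct": round(100 * tokens / MAX_TOKENS, 2),
        "tokens_per_char": round(tokens / chars, 2) if chars else 0,
        "tokens_per_word": round(tokens / words, 2) if words else 0,
    }
```

For example, `text_stats("hello world")` reports 11 characters, 2 words, and an estimated 3 tokens; for an exact count you would swap the heuristic for a real tokenizer such as OpenAI's tiktoken library.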