Using the GPT-4 tokenizer (cl100k_base) yields 349,371 tokens.
Recent Google and Anthropic models don't ship local tokenizers; ridiculously, you have to call their APIs to count tokens, so no idea about those.
Just thought that was interesting.