Hugging Face Tokenizers (https://github.com/huggingface/tokenizers), now used by default in their Transformers Python library, is built with PyO3 and became popular on the pitch that it encodes text an order of magnitude faster with zero config changes.
It lives up to that claim. (I initially had issues with return-object typing when crossing the Python/Rust boundary, but the types are more consistent now.)
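For context, here's a minimal sketch of what calling the Rust-backed tokenizer from Python looks like. It assumes the `tokenizers` package is installed and that `bert-base-uncased` is available from the Hugging Face Hub:

```python
# Minimal sketch: using the Rust-backed tokenizers library from Python.
# Assumes `pip install tokenizers` and access to the Hugging Face Hub.
from tokenizers import Tokenizer

# Load a pretrained tokenizer; the heavy lifting happens in Rust via PyO3.
tokenizer = Tokenizer.from_pretrained("bert-base-uncased")

# encode() returns an Encoding object whose fields come back as plain Python types.
encoding = tokenizer.encode("Rust makes this fast with zero config changes.")
print(encoding.tokens)  # list of subword strings
print(encoding.ids)     # corresponding token ids
```

From the Python side it reads like any other library; the Rust internals and the PyO3 boundary are invisible unless you go looking for them.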