Yoctol Natural Language Tokenizer

# tokenizer-hub

yoctol's badass tokenizer collection (乂卍oO Oo卍乂)

All tokenizers share the same interface as Jieba:

```python
from tokenizer_hub import XXX_tokenizer

tokenizer = XXX_tokenizer()
tokenizer.lcut('我来到北京清华大学')
# > ['我', '来到', '北京', '清华大学']
```
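For comparison, the equivalent call with Jieba itself looks like the sketch below (this assumes the `jieba` package is installed; it is not a dependency documented here):

```python
import jieba

# Jieba's list-cut interface, which tokenizer-hub mirrors:
# lcut() returns the segmented tokens as a Python list.
tokens = jieba.lcut('我来到北京清华大学')
print(tokens)
# ['我', '来到', '北京', '清华大学']
```

Because the method name and return type match, a `tokenizer_hub` tokenizer can generally be dropped into code that already calls `lcut()` on a Jieba instance.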
