
tinysegmenter3 0.0.3

Super compact Japanese tokenizer


TinySegmenter -- a super-compact Japanese tokenizer -- was originally created
(c) 2008 by Taku Kudo for JavaScript, under the terms of a new BSD license.
For details, see [here]

tinysegmenter for Python 2.x was written by Masato Hagiwara.
For more information about him, see [here]

This version of tinysegmenter has been modified to support both Python 3.x and Python 2.x, and is distributed by Tatsuro Yasukawa.

See more info about [tinysegmenter]


pip install tinysegmenter3


from tinysegmenter import TinySegmenter

# Create a segmenter instance and tokenize a mixed Japanese/English string.
segmenter = TinySegmenter()
statement = '私はpython大好きStanding Engineerです.'
tokenized_statement = segmenter.tokenize(statement)
print(tokenized_statement)
# ['私', 'は', 'python', '大好き', 'Standing', ' Engineer', 'です', '.']
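Since `tokenize` returns a plain list of strings, downstream processing is ordinary Python list handling. As a minimal sketch (using the token list from the example above hard-coded, so it runs without the package installed), the tokens can be joined back into the original text or counted:

```python
from collections import Counter

# Token list as produced by TinySegmenter in the example above.
tokens = ['私', 'は', 'python', '大好き', 'Standing', ' Engineer', 'です', '.']

# Concatenating the tokens reconstructs the original statement,
# since TinySegmenter preserves all characters, including spaces.
reconstructed = ''.join(tokens)
print(reconstructed)  # 私はpython大好きStanding Engineerです.

# Tokens are ordinary strings, so e.g. frequency counting works directly.
counts = Counter(tokens)
print(counts['私'])  # 1
```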
File: tinysegmenter3-0.0.3.tar.gz (md5)
Type: Source
Uploaded on: 2014-07-14
Size: 10KB
  • Downloads (All Versions):
  • 0 downloads in the last day
  • 29 downloads in the last week
  • 119 downloads in the last month