
py311-tokenizer

Version
3.5.2
Category
textproc

Tokenizer for Icelandic text

Tokenizer is a tokenizer for Icelandic text. Tokenization is a necessary first step in many natural language processing tasks, such as word counting, parsing, spell checking, corpus generation, and statistical analysis of text.
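As a rough illustration of the first step described above, the sketch below splits text into word and punctuation tokens with a regular expression. This is a hypothetical stand-in, not the API of the mideind/Tokenizer package itself (which provides its own, much more capable tokenize function with special handling for Icelandic abbreviations, dates, and amounts).

```python
import re

# Hypothetical illustration only -- not the mideind/Tokenizer API.
# Matches runs of word characters, or single non-space punctuation marks.
TOKEN_RE = re.compile(r"\w+|[^\w\s]")

def simple_tokenize(text):
    """Yield word and punctuation tokens from text, in order."""
    for match in TOKEN_RE.finditer(text):
        yield match.group()

print(list(simple_tokenize("Hún settist niður.")))
# → ['Hún', 'settist', 'niður', '.']
```

Note that `\w` matches Unicode letters by default in Python 3, so accented Icelandic characters such as ú and ð stay inside their words rather than splitting them.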

$ pkg install py311-tokenizer
github.com/mideind/Tokenizer
Origin
textproc/py-tokenizer
Size
625KiB
License
MIT
Maintainer
otis@FreeBSD.org
Dependencies
1 package
Required by
0 packages

Dependencies (1)