First minimum viable module - word tokenization
Pre-release
Features
- newmm word segmentation (dictionary-based maximal matching)
- Custom dictionary support
- Published on PyPI
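To illustrate what dictionary-based segmentation with a custom dictionary means, here is a simplified pure-Python sketch. This is not the Rust newmm implementation (real newmm is maximal matching constrained by Thai Character Clusters); it only shows the core idea of cutting text into the longest dictionary words available:

```python
# Simplified greedy longest-matching segmenter -- an illustration of
# dictionary-based word segmentation, NOT the actual Rust newmm code.
def segment_longest_match(text: str, dictionary: set) -> list:
    """Greedily take the longest dictionary word at each position."""
    max_len = max((len(w) for w in dictionary), default=1)
    words, i = [], 0
    while i < len(text):
        # Try the longest candidate first, shrinking down to one character.
        for length in range(min(max_len, len(text) - i), 0, -1):
            candidate = text[i:i + length]
            if candidate in dictionary:
                words.append(candidate)
                i += length
                break
        else:
            # Character not covered by the dictionary: emit it on its own.
            words.append(text[i])
            i += 1
    return words

custom_dict = {"สวัสดี", "ชาว", "โลก"}
print(segment_longest_match("สวัสดีชาวโลก", custom_dict))
# → ['สวัสดี', 'ชาว', 'โลก']
```

A greedy matcher like this can mis-segment ambiguous Thai text; newmm's maximal-matching search avoids many of those errors, which is why the Rust module implements it rather than this naive approach.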
Installation
pip install pythainlp-rust-modules
How to use
from oxidized_thainlp import segment, load_dict
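A minimal usage sketch, guarded so it degrades gracefully when the package is not installed. The argument lists for load_dict and segment below are assumptions inferred from the import line; consult the package documentation for the exact signatures:

```python
# Hedged usage sketch for oxidized_thainlp. The call signatures and the
# dictionary file path below are assumptions, not confirmed API.
try:
    from oxidized_thainlp import load_dict, segment

    load_dict("words_th.txt", "default")        # hypothetical dictionary file and name
    print(segment("สวัสดีชาวโลก", "default"))    # segment Thai text with that dictionary
except ImportError:
    # Package not installed; install with: pip install pythainlp-rust-modules
    segment = None
```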