tokenizer-hub

View on PyPI · Reverse Dependencies (0)

0.0.1 tokenizer_hub-0.0.1-py3-none-any.whl

Wheel Details

Project: tokenizer-hub
Version: 0.0.1
Filename: tokenizer_hub-0.0.1-py3-none-any.whl
Download: [link]
Size: 13851 bytes
MD5: 60c7e39ed4ded7ad29f08410fa32d87f
SHA256: d22b00d0890736d21b983b201ba4b25532c982fdb64bf610281ede7802872812
Uploaded: 2018-08-07 06:45:51 +0000
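
The digests above can be used to verify a downloaded copy of the wheel before installing it. A minimal sketch using only the standard library, assuming the file sits in the current directory under the filename listed above:

    import hashlib

    EXPECTED_SHA256 = "d22b00d0890736d21b983b201ba4b25532c982fdb64bf610281ede7802872812"

    # Hash the wheel in binary mode and compare against the published digest.
    with open("tokenizer_hub-0.0.1-py3-none-any.whl", "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    assert actual == EXPECTED_SHA256, "checksum mismatch; do not install"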

dist-info

METADATA

Metadata-Version: 2.1
Name: tokenizer-hub
Version: 0.0.1
Summary: Yoctol Natural Language Tokenizer
Author: Solumilken
Home-Page: https://github.com/Yoctol/tokenizer-hub
License: MIT
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Requires-Python: >=3.5
Requires-Dist: jieba (==0.39)
Requires-Dist: nltk (==3.3.0)
Requires-Dist: purewords (==0.1.1)
[Description omitted; length: 232 characters]
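
The description is omitted above and this listing documents no public API, so the following is only a hedged usage sketch: the class name ChineseCharTokenizer and its tokenize() method are hypothetical, inferred from the module tokenizer_hub/chinese_char_tokenizer.py in the RECORD section below.

    # Hypothetical usage sketch: the class and method names are assumptions
    # inferred from the module names in RECORD, not documented exports.
    from tokenizer_hub.chinese_char_tokenizer import ChineseCharTokenizer

    tokenizer = ChineseCharTokenizer()
    print(tokenizer.tokenize("今天天氣真好"))  # expected: a list of tokens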

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.31.1)
Root-Is-Purelib: true
Tag: py3-none-any
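
Root-Is-Purelib: true together with the py3-none-any tag means the wheel contains pure Python only and installs into purelib on any Python 3 interpreter and platform. Whether the running interpreter accepts that tag can be checked with the third-party packaging library; this is an illustrative sketch, not part of the wheel itself:

    from packaging.tags import parse_tag, sys_tags

    # Tag taken from the WHEEL file above.
    wheel_tags = set(parse_tag("py3-none-any"))

    # sys_tags() yields the tags the current interpreter accepts, most
    # specific first; py3-none-any matches any CPython 3 environment.
    print("installable here:", any(t in wheel_tags for t in sys_tags()))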

RECORD

Path Digest Size (bytes)
tokenizer_hub/__init__.py sha256=PsUxP5qN9iHZgZIuYedWpv9NcZuv6jDn5jDR9-H50Ao 846
tokenizer_hub/__version__.py sha256=sXLh7g3KC4QCFxcZGBTpG2scR7hmmBsMjq6LqRptkRg 22
tokenizer_hub/add_words.py sha256=JVUqVFFDpMLprSsTCiadbj__1pxXamjSNWCCkWSQYdg 994
tokenizer_hub/base_tokenizer.py sha256=0sT4pJFviEz9XhdsYYlNpfKcCOLS1ZNN5gs0FLZyaJg 765
tokenizer_hub/chinese_char_tokenizer.py sha256=PYMhBLVM0XjopLnv7PhHf8Y9hF1EW5ECDOY00etnA8g 1393
tokenizer_hub/custom_jieba_tokenizer.py sha256=Zg8zyyNihEuMsHIUlLrT3e7z-ftheafq1y3MEWq4FoA 2739
tokenizer_hub/nltk_custom_jieba_tokenizer.py sha256=r0TAI05xxLDOXtWxhP873mcw0j0PVzcNS9Z-zVgIYUo 2938
tokenizer_hub/nltk_tokenizer.py sha256=_Ime97_B8nNKl3Qdip2OVFhlNSgXd3li_78BqngOidc 1316
tokenizer_hub/parallel_jieba_tokenizer.py sha256=Y4msOGfF9saOu970xem_-1RHXDaINYzyffoCg9Minkk 781
tokenizer_hub/pure_char_tokenizer.py sha256=I8ajypjmCZNUdIwnvHYHGixbAz61frNPqfXwq_tl384 1923
tokenizer_hub/purewords_tokenizer.py sha256=-EhfK2nLFlN85q2a8Ah8vseAfAXmxDM98K8C8nTNZkA 2200
tokenizer_hub/tests/__init__.py sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
tokenizer_hub/tests/test_chinese_char_tokenizer.py sha256=5N5IyN9PmoD4Qu_6BLE97uAz2jrrNexYWVY5QCGk8Ig 2891
tokenizer_hub/tests/test_custom_jieba_tokenizer.py sha256=Z2uPFSA35JPCv4YpILzs1JOKtXAOka2fnkImg_coOM0 3656
tokenizer_hub/tests/test_nltk_custom_jieba_tokenizer.py sha256=NA2j3k6emyqFqTHaKGj_RVaN7_ZzSidRZ7DpiqZhJUc 2731
tokenizer_hub/tests/test_nltk_tokenizer.py sha256=T10t-Lvck_LRpmOsaOS0SfXHJZ6qZXPaPPafhkQS7uM 2666
tokenizer_hub/tests/test_pure_char_tokenizer.py sha256=PEwlrj2X-Kh3D_oQK_291fu-v_880bucQNPn-FfviEU 3330
tokenizer_hub/tests/test_purewords_tokenizer.py sha256=ttEZgqjjpyuK7g_WPqvq9ZpaBfzyCi8iXtuGSkErJCE 1309
tokenizer_hub-0.0.1.dist-info/METADATA sha256=JJqZZLmED7KwbP5BnI-bxPgGyI8vPntOTolDZnQEf84 745
tokenizer_hub-0.0.1.dist-info/RECORD
tokenizer_hub-0.0.1.dist-info/WHEEL sha256=NzFAKnL7g-U64xnS1s5e3mJnxKpOTeOtlXdFwS9yNXI 92
tokenizer_hub-0.0.1.dist-info/top_level.txt sha256=f5IRIFUGivJyZEMpGhhrD6kn7Td_7zmd4hhitqoksmg 14
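
Each RECORD row is path, digest, size; per the wheel format, the digest is an unpadded urlsafe-base64 SHA256, and the RECORD file itself is listed without one because it cannot hash itself (hence the bare entry above). A minimal verification sketch, assuming it runs from the directory into which the wheel was unpacked:

    import base64
    import csv
    import hashlib

    def record_digest(path):
        # RECORD stores digests as "sha256=" + unpadded urlsafe-base64.
        with open(path, "rb") as f:
            raw = hashlib.sha256(f.read()).digest()
        return "sha256=" + base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

    with open("tokenizer_hub-0.0.1.dist-info/RECORD", newline="") as f:
        for path, digest, size in csv.reader(f):
            if not digest:  # the RECORD file's own entry has no digest
                continue
            assert record_digest(path) == digest, f"digest mismatch: {path}"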

top_level.txt

tokenizer_hub