hebrew-tokenizer

View on PyPI
Reverse Dependencies (1)

1.1.0 hebrew_tokenizer-1.1.0-py3-none-any.whl

Wheel Details

Project: hebrew-tokenizer
Version: 1.1.0
Filename: hebrew_tokenizer-1.1.0-py3-none-any.whl
Download: [link]
Size: 7725 bytes
MD5: bbc566e7dd83483060fd67f39adeefc9
SHA256: 2b64ca7d2e6e7d6c477d2f9a17bc1430c72dc265c473824c3ad5ce80b34da712
Uploaded: 2020-06-11 14:41:48 +0000
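The MD5 and SHA256 values above can be verified against a downloaded copy of the wheel. A minimal stdlib sketch (the local path in the usage comment is an assumption):

```python
import hashlib

def wheel_digests(data: bytes) -> tuple[str, str]:
    """Return (MD5, SHA256) hex digests of raw wheel bytes,
    in the same form as the Wheel Details listed above."""
    return hashlib.md5(data).hexdigest(), hashlib.sha256(data).hexdigest()

# Usage (assumed local filename):
#   with open("hebrew_tokenizer-1.1.0-py3-none-any.whl", "rb") as fh:
#       md5_hex, sha256_hex = wheel_digests(fh.read())
```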

dist-info

METADATA

Metadata-Version: 2.1
Name: hebrew-tokenizer
Version: 1.1.0
Summary: A very simple python tokenizer for Hebrew text
Author: Yonti Levin
Author-Email: therealyontilevin@gmail.com
Home-Page: https://github.com/yontilevin/hebrew_tokenizer
License: MIT
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Operating System :: OS Independent
Requires-Python: >=2.6, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*
[Description omitted; length: 30 characters]
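The Requires-Python field above says the package supports Python 2.6+ but excludes 3.0 through 3.3. A hand-rolled sketch of how an installer might evaluate such a specifier list; this helper is hypothetical (real tools use full PEP 440 parsing) and handles only the `>=` and `!=X.Y.*` clause forms that appear in this field:

```python
def satisfies(version: str, specifiers: str) -> bool:
    """Check a dotted version string against a comma-separated
    specifier list like the Requires-Python field above.
    Supports only '>=' and '!=' with a trailing '.*' wildcard."""
    ver = tuple(int(p) for p in version.split("."))
    for clause in (c.strip() for c in specifiers.split(",")):
        if clause.startswith(">="):
            bound = tuple(int(p) for p in clause[2:].split("."))
            if ver < bound:
                return False
        elif clause.startswith("!=") and clause.endswith(".*"):
            # '!=3.0.*' excludes any version whose leading components match 3.0
            prefix = tuple(int(p) for p in clause[2:-2].split("."))
            if ver[:len(prefix)] == prefix:
                return False
    return True

spec = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
print(satisfies("3.8", spec))  # True
print(satisfies("3.2", spec))  # False
```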

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.34.2)
Root-Is-Purelib: true
Tag: py3-none-any
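The `py3-none-any` tag also appears in the wheel filename, which PEP 427 defines as `name-version-pythontag-abitag-platformtag.whl`. A sketch of splitting it apart (ignores the optional build-tag segment, which this wheel does not have):

```python
def parse_wheel_filename(filename: str) -> tuple[str, str, str, str, str]:
    """Split a wheel filename into its PEP 427 fields:
    (distribution, version, python tag, abi tag, platform tag)."""
    stem = filename[: -len(".whl")]
    name, version, pytag, abi, plat = stem.split("-")
    return name, version, pytag, abi, plat

print(parse_wheel_filename("hebrew_tokenizer-1.1.0-py3-none-any.whl"))
# ('hebrew_tokenizer', '1.1.0', 'py3', 'none', 'any')
```

`none`/`any` together with `Root-Is-Purelib: true` mark this as a pure-Python wheel installable on any platform.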

RECORD

Path  Digest  Size (bytes)
hebrew_tokenizer/__init__.py sha256=KTpaz5-FVgm83gnBpPUNDH4ZaJ6PGwXqg3cwDj3SQt4 123
hebrew_tokenizer/api.py sha256=V2YkklJgh3m-t3_3PKI0v_TK6357IWdVd8Te3dknVTg 404
hebrew_tokenizer/groups.py sha256=n7A5Qh_4pvOoVtwJeJospns1e3ZPItkkxZQ7ThoGuSQ 231
hebrew_tokenizer/lexicon.py sha256=SxAI_4anEiG053o8V3uX-yp7iX1mizL716_iSCjkGcw 1652
hebrew_tokenizer/tokenizer.py sha256=C68_3Fpw2zOdiHH9CLfTUPjU3e70Z8_1K1xrYcofgE4 2738
tests/__init__.py sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
tests/tokenizer_tests.py sha256=Sm-yb3J7k7qKVSjavGfJVXOsSkEOjR6FlNwe2uXCngs 11941
hebrew_tokenizer-1.1.0.dist-info/LICENSE sha256=KcxVaw4BJ1UvtF6BPAGY_wP943lqU5U5Nnu059MtRq4 1068
hebrew_tokenizer-1.1.0.dist-info/METADATA sha256=GFqHDtlFOzY1UXVZ4wzVny1p1ZmYk4o84MnQ8Gr0WbA 498
hebrew_tokenizer-1.1.0.dist-info/WHEEL sha256=g4nMs7d-Xl9-xC9XovUrsDHGXt-FT0E17Yqo92DEfvY 92
hebrew_tokenizer-1.1.0.dist-info/top_level.txt sha256=inGwCjllcsPHNs7yjcd-WgztuWzOY921CnPSJryuZUQ 23
hebrew_tokenizer-1.1.0.dist-info/RECORD
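Each RECORD digest above is a SHA256 hash encoded as URL-safe base64 with the `=` padding stripped, per the wheel spec (PEP 427, building on PEP 376). A stdlib sketch that reproduces the format:

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """Digest in wheel RECORD format: 'sha256=' plus the
    urlsafe-base64 SHA256 of the file, '=' padding removed."""
    raw = hashlib.sha256(data).digest()
    return "sha256=" + base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

# An empty file (like tests/__init__.py, size 0 above) reproduces its RECORD entry:
print(record_digest(b""))  # sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```

The RECORD file itself is listed without a digest, since it cannot contain its own hash.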

top_level.txt

hebrew_tokenizer
tests