meguru-tokenizer

View on PyPI · Reverse Dependencies (0)

0.3.1 meguru_tokenizer-0.3.1-py3-none-any.whl

Wheel Details

Project: meguru-tokenizer
Version: 0.3.1
Filename: meguru_tokenizer-0.3.1-py3-none-any.whl
Download: [link]
Size: 14101 bytes
MD5: a132a01711db50ac322cae6c9b2ba93d
SHA256: db94b3f78b6d04ab65b033d91a5fdb266e0c0dbe39fa5b9383cbe7524eeb2d40
Uploaded: 2020-09-20 05:21:38 +0000
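The MD5 and SHA256 fields above let you check that a downloaded copy of the wheel is intact. A minimal sketch of computing both digests with Python's standard `hashlib`, streaming the file in chunks (the local path is a placeholder, not part of this listing):

```python
import hashlib

def file_digests(path):
    """Compute the MD5 and SHA256 hex digests of a file, streaming in chunks."""
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            md5.update(chunk)
            sha256.update(chunk)
    return md5.hexdigest(), sha256.hexdigest()

# Compare the results against the values listed above, e.g.:
# md5, sha256 = file_digests("meguru_tokenizer-0.3.1-py3-none-any.whl")
# assert sha256 == "db94b3f78b6d04ab65b033d91a5fdb266e0c0dbe39fa5b9383cbe7524eeb2d40"
```

Streaming keeps memory use constant regardless of file size, which matters more for large wheels than this 14 KB one.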

dist-info

METADATA

Metadata-Version: 2.1
Name: meguru-tokenizer
Version: 0.3.1
Summary: simple tokenizer for tensorflow 2.x and PyTorch
Author: MokkeMeguru
Author-Email: meguru.mokke[at]gmail.com
Home-Page: https://github.com/MokkeMeguru/meguru_tokenizer
License: MIT license
Keywords: tensorflow,pytorch,tokenizer,nlp
Classifier: Development Status :: 2 - Pre-Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Requires-Python: >=3.5
Requires-Dist: ginza (>=4.0.0)
Requires-Dist: sentencepiece
Requires-Dist: neologdn
Requires-Dist: nltk
Requires-Dist: spacy (>=2.2.4)
Requires-Dist: sudachidict-full
Requires-Dist: torch
Requires-Dist: tensorflow (>=2.2.0)
Description-Content-Type: text/markdown
[Description omitted; length: 2531 characters]

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.34.2)
Root-Is-Purelib: true
Tag: py3-none-any
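The tag `py3-none-any` marks this as a pure-Python wheel: any Python 3 interpreter, no ABI requirement, any platform (consistent with `Root-Is-Purelib: true`). The tag is also embedded in the filename, whose structure PEP 427 defines as `{distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl`. A small sketch of splitting a filename into those parts, assuming the common five-part form without the optional build tag:

```python
def parse_wheel_filename(name):
    """Split a wheel filename into its PEP 427 components.

    Assumes the common 5-part form (no optional build tag).
    """
    stem = name.removesuffix(".whl")
    dist, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "distribution": dist,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }

parse_wheel_filename("meguru_tokenizer-0.3.1-py3-none-any.whl")
# {'distribution': 'meguru_tokenizer', 'version': '0.3.1',
#  'python': 'py3', 'abi': 'none', 'platform': 'any'}
```

Note this is why distribution names in wheel filenames use underscores rather than hyphens: the hyphen is the field separator.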

RECORD

Path Digest Size (bytes)
meguru_tokenizer/__init__.py sha256=ipoZWIqt-ct9PabYbNnZrBskEkQlsQKzUNnJfcJ9BQ0 132
meguru_tokenizer/__version__.py sha256=mQkrGM2IugtKOb86B9fUMo1nPZ4sVH4EglwBV9dbUyM 87
meguru_tokenizer/base_tokenizer.py sha256=9yb_BhX3W1z0yJJnd5JAemcXPml7R9QxLf1JR5UDvJs 2704
meguru_tokenizer/sentencepiece_tokenizer.py sha256=5_irnFIQiKZF_Ywocr3Eu5P1ESHEuWJtVMi6kM_M9Rs 6882
meguru_tokenizer/sudachi_tokenizer.py sha256=KxmU7dUzRi8iM8-DdhVeE02lxaQeJ9t1vntstlyZ2mg 7498
meguru_tokenizer/vocab.py sha256=YrnIMfwdpXsa6OMjo8POtKHIC-TJqHc6krooUzoPnw0 4201
meguru_tokenizer/whitespace_tokenizer.py sha256=toa7XqanVkSIR8pYZzOxWA2CHcnLinwvUhODPxDDJ1c 5489
meguru_tokenizer/process/__init__.py sha256=1oLL20yLB1GL9IbFiZD8OReDqiCpFr-yetIR6x1cNkI 23
meguru_tokenizer/process/noise_pytorch.py sha256=Lvn1tWj5_4KJvTf6t5vIYbjrs3g-09oGMRP7LFjtoR0 2550
meguru_tokenizer/process/noise_tf.py sha256=bNfeM9MAqJvFh-rRR222hmwEVqu9E3t-3uTYig8KftA 6062
meguru_tokenizer-0.3.1.dist-info/METADATA sha256=-AHbRDUmo9hL_mpKYJZSg3a1BGMsuwiKrIqYzfsUz6k 3565
meguru_tokenizer-0.3.1.dist-info/WHEEL sha256=g4nMs7d-Xl9-xC9XovUrsDHGXt-FT0E17Yqo92DEfvY 92
meguru_tokenizer-0.3.1.dist-info/top_level.txt sha256=wU-LYeTzjTnFHdgMFsgqYggmyFUjx08_modxrkiNHIQ 17
meguru_tokenizer-0.3.1.dist-info/RECORD
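Although rendered above with spaces, the RECORD file inside the `.dist-info` directory is CSV: one `path,digest,size` row per installed file, with the digest given as `sha256=` followed by urlsafe-base64 data, and with the RECORD entry itself left blank (it cannot hash itself). A minimal sketch of parsing that CSV form with the standard library:

```python
import csv

def parse_record(text):
    """Parse a wheel RECORD file (CSV rows of path, digest, size)."""
    rows = []
    for path, digest, size in csv.reader(text.splitlines()):
        rows.append({
            "path": path,
            "digest": digest or None,       # RECORD's own row has no digest
            "size": int(size) if size else None,
        })
    return rows

sample = "meguru_tokenizer/__init__.py,sha256=ipoZWIqt,132\n" \
         "meguru_tokenizer-0.3.1.dist-info/RECORD,,"
parse_record(sample)
# [{'path': 'meguru_tokenizer/__init__.py', 'digest': 'sha256=ipoZWIqt', 'size': 132},
#  {'path': 'meguru_tokenizer-0.3.1.dist-info/RECORD', 'digest': None, 'size': None}]
```

Installers use these rows both to verify files after unpacking and to know what to delete on uninstall.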

top_level.txt

meguru_tokenizer