gpt3-tokenizer

View on PyPI · Reverse Dependencies (0)

0.1.5 gpt3_tokenizer-0.1.5-py2.py3-none-any.whl

Wheel Details

Project: gpt3-tokenizer
Version: 0.1.5
Filename: gpt3_tokenizer-0.1.5-py2.py3-none-any.whl
Download: [link]
Size: 567843 bytes
MD5: be14a8564759115669651d1c681dd8db
SHA256: 2d0ed9c7efa907d45ce3c338ffe2ee3bc9124ee1236248989bd883fd4eb0e5b6
Uploaded: 2024-04-26 18:07:31 +0000

dist-info

METADATA

Metadata-Version: 2.1
Name: gpt3_tokenizer
Version: 0.1.5
Summary: Encoder/Decoder and tokens counter for GPT3
Author: Alison Ferrenha
Home-Page: https://github.com/alisonjf/gpt3-tokenizer
Project-Url: Repository, https://github.com/alisonjf/gpt3-tokenizer
License: MIT
Keywords: openai,gpt,gpt-3,gpt3,gpt4,gpt-4,tokenizer
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*
Requires-Dist: future (<0.19.0,>=0.18.3)
Requires-Dist: regex (==2021.11.10); python_version < "3"
Requires-Dist: regex; python_version >= "3"
Requires-Dist: six (<2.0.0,>=1.16.0)
Description-Content-Type: text/x-rst
[Description omitted; length: 968 characters]
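
The Summary and dependency fields above describe an encoder/decoder and token counter for GPT-3. Below is a minimal usage sketch based on the top-level helpers the project documents (encode, decode, count_tokens); treat the exact names as assumptions if they differ in this release.

    # pip install gpt3-tokenizer
    import gpt3_tokenizer  # assumed top-level helpers per the project README

    text = "A sample string to tokenize"
    token_ids = gpt3_tokenizer.encode(text)        # list of BPE token ids
    round_trip = gpt3_tokenizer.decode(token_ids)  # back to the original string
    n_tokens = gpt3_tokenizer.count_tokens(text)   # token count without keeping the ids

The future and six requirements are Python 2/3 compatibility shims, consistent with the universal py2.py3-none-any tag in the filename, and the environment markers pin regex to 2021.11.10 only under Python 2.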

WHEEL

Wheel-Version: 1.0
Generator: poetry-core 1.9.0
Root-Is-Purelib: true
Tag: py2.py3-none-any

RECORD

Path Digest Size
LICENSE sha256=8cCJ9b-IqQRfUNnMbOMsMOAIhXNYp6OYQnh_kZIAVuA 1082
gpt3_tokenizer/__init__.py sha256=iiObuCOyM7w2Vaj-UzB6B13ddmGtfLI7k6lq1hvHA6c 125
gpt3_tokenizer/_entry.py sha256=mPO44CeVt5jgMUodR8A6hkBZKXniQyTbxFromVtLGsA 1869
gpt3_tokenizer/_functions.py sha256=VioV0Q1gryb9fZ_U_ALIpuOYdT03N6NxQKJtEofpfhw 3279
gpt3_tokenizer/data/encoder.json sha256=GWE5ZovmPztdZXRCcxeugvYSqXxdHNrzbtIlbb9jZ4M 1042301
gpt3_tokenizer/data/vocab.bpe sha256=HOFmR3PFDz4MyIQmGak-3EYkUltyixiKngvjO3cmrcU 456318
gpt3_tokenizer-0.1.5.dist-info/LICENSE sha256=8cCJ9b-IqQRfUNnMbOMsMOAIhXNYp6OYQnh_kZIAVuA 1082
gpt3_tokenizer-0.1.5.dist-info/METADATA sha256=qhX8y9TGHtGQK-IubG59njMDmiWO1drkjvUtspSl9Is 2301
gpt3_tokenizer-0.1.5.dist-info/WHEEL sha256=IrRNNNJ-uuL1ggO5qMvT1GGhQVdQU54d6ZpYqEZfEWo 92
gpt3_tokenizer-0.1.5.dist-info/RECORD
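
The RECORD shows the wheel ships the BPE vocabulary as data files (encoder.json, about 1 MB, and vocab.bpe, about 446 KB). As an illustration only, and not necessarily this package's internal code, GPT-2/GPT-3 style BPE data in this layout is typically loaded like this:

    import json

    # Hypothetical loader sketch for the bundled data files listed in RECORD.
    with open("gpt3_tokenizer/data/encoder.json", encoding="utf-8") as f:
        encoder = json.load(f)                     # token string -> integer id

    with open("gpt3_tokenizer/data/vocab.bpe", encoding="utf-8") as f:
        bpe_lines = f.read().split("\n")
    # First line is a version header; the rest are space-separated merge pairs.
    merges = [tuple(line.split()) for line in bpe_lines[1:] if line]
    bpe_ranks = dict(zip(merges, range(len(merges))))  # merge pair -> priority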