spacy-html-tokenizer

View on PyPI · Reverse Dependencies (0)

0.1.3 spacy_html_tokenizer-0.1.3-py3-none-any.whl

Wheel Details

Project: spacy-html-tokenizer
Version: 0.1.3
Filename: spacy_html_tokenizer-0.1.3-py3-none-any.whl
Download: [link]
Size: 5452 bytes
MD5: e0d505891e011bc9bb6d348ee6f350ed
SHA256: f15cd3b1967949d0730eb41701afc5eebe672224ff82af3d54317063caeba3ac
Uploaded: 2022-03-17 13:47:30 +0000
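The digests above can be used to check the integrity of a downloaded copy of the wheel. A minimal sketch using only the standard library (the file path in the comment is hypothetical; substitute wherever you saved the wheel):

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks
    so large files are not loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest (path is hypothetical):
# expected = "f15cd3b1967949d0730eb41701afc5eebe672224ff82af3d54317063caeba3ac"
# assert sha256_hex("spacy_html_tokenizer-0.1.3-py3-none-any.whl") == expected
```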

dist-info

METADATA

Metadata-Version: 2.1
Name: spacy-html-tokenizer
Version: 0.1.3
Summary: An HTML-friendly spaCy tokenizer
Author: Peter Baumgartner
Author-Email: 5107405+pmbaumgartner@users.noreply.github.com
Home-Page: https://github.com/pmbaumgartner/spacy-html-tokenizer
Project-Url: Repository, https://github.com/pmbaumgartner/spacy-html-tokenizer
License: MIT
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Requires-Python: >=3.7,<4.0
Requires-Dist: selectolax (<0.4.0,>=0.3.6)
Requires-Dist: spacy (<4.0.0,>=3.2.2)
Description-Content-Type: text/markdown
[Description omitted; length: 5680 characters]

WHEEL

Wheel-Version: 1.0
Generator: poetry 1.0.7
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
spacy_html_tokenizer/__init__.py sha256=gSITR5t7uMvfYSy9Je9Si6szdTE3VM-Fojx9PlHadso 73
spacy_html_tokenizer/html_tokenizer.py sha256=b8AzCYsYwMx0zoUOiddx8zDmi0kkXXOINz-tSlaaxRA 2293
spacy_html_tokenizer-0.1.3.dist-info/entry_points.txt sha256=ykwAs6e0EC3wglpFCnp34Qa4afrPAkSm46ddKKZMHNg 93
spacy_html_tokenizer-0.1.3.dist-info/LICENSE sha256=PNtBVnrxtP_bqLJjbiRvk7WQzprgEeNxz832MfKpsk0 1056
spacy_html_tokenizer-0.1.3.dist-info/WHEEL sha256=y3eDiaFVSNTPbgzfNn0nYn5tEn1cX6WrdetDlQM4xWw 83
spacy_html_tokenizer-0.1.3.dist-info/METADATA sha256=DQst927-cktbxYUdbLt97PRNbHtioK9cEuEyLyAiAAM 6494
spacy_html_tokenizer-0.1.3.dist-info/RECORD
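The digest strings in RECORD are not hex: per the wheel specification, each is the file's SHA256 hash encoded as URL-safe base64 with the trailing `=` padding stripped (the RECORD entry itself carries no digest, since the file cannot hash itself). A short sketch of how such a digest is produced:

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """Encode a file's SHA256 hash the way RECORD does:
    URL-safe base64 with '=' padding removed."""
    digest = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# A 32-byte SHA256 hash always encodes to 43 characters,
# matching the digest strings listed above.
```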

entry_points.txt

html_tokenizer = spacy_html_tokenizer.html_tokenizer:create_html_tokenizer
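This entry point registers `create_html_tokenizer` with spaCy under the name `html_tokenizer`, so the tokenizer can be selected by name in a pipeline config. A sketch of such a config fragment, assuming the entry point belongs to spaCy's `spacy_tokenizers` group (the group header is not shown above) and using spaCy's standard `[nlp.tokenizer]` block:

```ini
# config.cfg fragment (sketch): select the HTML tokenizer by its registered name
[nlp.tokenizer]
@tokenizers = "html_tokenizer"
```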