Attention-and-Transformers


Wheel Details

Project: Attention-and-Transformers
Version: 0.0.15
Filename: Attention_and_Transformers-0.0.15-py3-none-any.whl
Download: [link]
Size: 24772 bytes
MD5: e91cb98da61973197058849f34b4c2c8
SHA256: a32c67a0fcb200627baad4f66e7bcec4edc96771f1faf67d7af1c669ce139ae3
Uploaded: 2022-12-17 19:13:10 +0000
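After downloading, the wheel can be checked against the SHA256 digest above before installation. A minimal sketch (the local filename is assumed to match the published one):

```python
import hashlib

def file_sha256(path, chunk_size=8192):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest listed above:
# file_sha256("Attention_and_Transformers-0.0.15-py3-none-any.whl") \
#     == "a32c67a0fcb200627baad4f66e7bcec4edc96771f1faf67d7af1c669ce139ae3"
```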

dist-info

METADATA

Metadata-Version: 2.1
Name: Attention-and-Transformers
Version: 0.0.15
Summary: Building attention mechanisms and Transformer models from scratch. Alias ATF.
Author: Vaibhav Singh
Author-Email: vaibhav.singh.3001[at]gmail.com
Home-Page: https://github.com/veb-101/Attention-and-Transformers
License: Apache 2.0
Keywords: tensorflow keras attention transformers
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Topic :: Scientific/Engineering
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.7,<3.11.*
Requires-Dist: tensorflow-datasets
Requires-Dist: livelossplot
Requires-Dist: Pillow
Requires-Dist: opencv-contrib-python
Requires-Dist: pandas
Requires-Dist: scikit-learn
Requires-Dist: matplotlib
Requires-Dist: scikit-image
Requires-Dist: tensorflow-addons; platform_machine != "aarch64" and platform_machine != "aarch32"
Requires-Dist: tensorflow (>=2.10.0); platform_system != "Darwin"
Requires-Dist: tensorflow-macos; platform_system == "Darwin"
Description-Content-Type: text/markdown
License-File: LICENSE
[Description omitted; length: 4163 characters]
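The last three Requires-Dist entries use PEP 508 environment markers, so the TensorFlow dependency resolved at install time varies by platform. A minimal sketch of that selection logic using only the standard library (the function name is illustrative, not part of the package):

```python
import platform

def pick_tensorflow_dists():
    """Mimic the environment markers above: tensorflow-macos on Darwin,
    plain tensorflow (>=2.10.0) elsewhere; tensorflow-addons is skipped
    on aarch64/aarch32 machines."""
    deps = []
    if platform.system() == "Darwin":
        deps.append("tensorflow-macos")
    else:
        deps.append("tensorflow>=2.10.0")
    if platform.machine() not in ("aarch64", "aarch32"):
        deps.append("tensorflow-addons")
    return deps
```

Pip evaluates the real markers itself; this sketch only shows which branch each marker takes on a given machine.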

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.37.1)
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
ATF/__init__.py sha256=jUOrjx_PMktRCIZ1Tz9FdYM2T6U3VhfiQWDyUNF8wew 40
Attention_and_Transformers/__init__.py sha256=qPU_uia_tudbpjaccLl37iR-QixdzAVSdrsgCwVea6Q 148
Attention_and_Transformers/MobileViT_v1/BaseLayers.py sha256=MAf7RWs2RdQipMFP4-VkYDK-kbH3GirwZABjGnbwEU8 2802
Attention_and_Transformers/MobileViT_v1/__init__.py sha256=CP5nSs_c1rU8q8433vkQUELTO1j6J7NN9DfiqTKIcoc 521
Attention_and_Transformers/MobileViT_v1/mobile_vit_v1.py sha256=Tks4G2xMxauUM4Odi0etQEfPJ9U31UsuIP4ZasUQLeo 6825
Attention_and_Transformers/MobileViT_v1/mobile_vit_v1_block.py sha256=JSO-SdWodNWY6JJGA3KqbX95Gbc0XBqhnwFXuDULlXk 7676
Attention_and_Transformers/MobileViT_v1/multihead_self_attention_2D.py sha256=RrFumLwAR9uTU9MApjuOnhSsfqJRb_OVw5ayzARA1cM 4776
Attention_and_Transformers/MobileViT_v1/utils.py sha256=V6H_Jb5XojlE2Mo2YpkMYfwjkJWkYX4Zi3GdnJsgQWs 523
Attention_and_Transformers/MobileViT_v2/BaseLayers.py sha256=4HvVP4rMe28IVJii823HNY1RoPTbGr3X5g-Sob6zuww 2802
Attention_and_Transformers/MobileViT_v2/__init__.py sha256=gx3xZRsw5zcOVjAFh_576Mz_ToT1k9kEkz4t773ot7s 393
Attention_and_Transformers/MobileViT_v2/linear_attention.py sha256=UUghD5ojBr8ALuvyPOqVwHbJQ2Uh6xL9247ZLnAGgnQ 4400
Attention_and_Transformers/MobileViT_v2/mobile_vit_v2.py sha256=iJbf5sV3A8BAkASelr1Pr7m4s-zumFP9DSysIvEUo6E 8149
Attention_and_Transformers/MobileViT_v2/mobile_vit_v2_block.py sha256=fCPogz690CkQJ4ZZJbQ7tgW2txeFPrZXsmpB_bioi3s 7270
Attention_and_Transformers/MobileViT_v2/utils.py sha256=p7vYAqIRceS9BF4FHbXMONp3u92yDoV3OtMceI0k4Bk 674
Attention_and_Transformers/ViT/__init__.py sha256=AECP3aidwlKyhH1RONmrWDeQfvWJKYszOEwnj-zUUb4 202
Attention_and_Transformers/ViT/multihead_self_attention.py sha256=_hSH2XKX9bsvDeJsxp4WxD_GkRMQbFoK25fXP0ehJzA 9935
Attention_and_Transformers/ViT/vision_transformer.py sha256=SaXBFhjfHOKZVjdpUPRdcuCvjaQZ0FVVrFXpy5RLUag 8331
Attention_and_Transformers-0.0.15.dist-info/LICENSE sha256=b6jvk18LSLxX4oPofpvj81rit61Xcf5RgVakBFnW0YY 1091
Attention_and_Transformers-0.0.15.dist-info/METADATA sha256=vbjaO94Uzheq3QkGPpwYm1snXfgNqgcaoj4C6zSEZjw 5756
Attention_and_Transformers-0.0.15.dist-info/WHEEL sha256=G16H4A3IeoQmnOrYV4ueZGKSjhipXx8zc8nu9FGlvMA 92
Attention_and_Transformers-0.0.15.dist-info/top_level.txt sha256=tjSxV0sZhx6aN9oRDRHdFtvUrXuzao4rxxBkIymd_a8 31
Attention_and_Transformers-0.0.15.dist-info/RECORD
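The Digest column above is not a plain hex digest: per the wheel specification, it is the URL-safe base64 encoding of the raw SHA-256 digest with trailing `=` padding stripped. A minimal sketch of how such an entry is produced:

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """Return a digest in wheel RECORD format: 'sha256=' plus the
    URL-safe base64 of the raw SHA-256 digest, padding stripped."""
    raw = hashlib.sha256(data).digest()
    b64 = base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")
    return "sha256=" + b64
```

To verify an installed file, recompute this over the file's bytes and compare it with the RECORD entry (the RECORD file itself carries no digest for itself).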

top_level.txt

ATF
Attention_and_Transformers