memory-efficient-attention-pytorch


Wheel Details

Project: memory-efficient-attention-pytorch
Version: 0.1.6
Filename: memory_efficient_attention_pytorch-0.1.6-py3-none-any.whl
Download: [link]
Size: 14978 bytes
MD5: 53b9aec23552dce379a643ef4f60fd5d
SHA256: efbb2676f8695b21a29d96d83f84818be257a35ac4c89f94d7d93f59819d38ed
Uploaded: 2023-07-18 02:42:55 +0000
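The MD5 and SHA256 values above can be checked against a downloaded copy of the wheel. A minimal sketch using only the standard library (the expected digest is copied from this listing; the local file path is a hypothetical example):

```python
import hashlib

# SHA256 from the Wheel Details listing above.
EXPECTED_SHA256 = "efbb2676f8695b21a29d96d83f84818be257a35ac4c89f94d7d93f59819d38ed"

def file_sha256(path: str) -> str:
    """Hash the file in chunks so a large wheel is never loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical local path to the downloaded wheel:
# ok = file_sha256("memory_efficient_attention_pytorch-0.1.6-py3-none-any.whl") == EXPECTED_SHA256
```

The same helper works for the MD5 value by swapping in `hashlib.md5()`.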

dist-info

METADATA

Metadata-Version: 2.1
Name: memory-efficient-attention-pytorch
Version: 0.1.6
Summary: Memory Efficient Attention - Pytorch
Author: Phil Wang
Author-Email: lucidrains[at]gmail.com
Home-Page: https://github.com/lucidrains/memory-efficient-attention-pytorch
License: MIT
Keywords: artificial intelligence,deep learning,attention-mechanism
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.8
Requires-Dist: einops (>=0.4.1)
Requires-Dist: torch (>=1.6)
Description-Content-Type: text/markdown
License-File: LICENSE
[No description]

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.40.0)
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
memory_efficient_attention_pytorch/__init__.py sha256=dC2mA1-wE9ta_mC_8JssBFbF6XjoCinNatiNHdEM2Pg 343
memory_efficient_attention_pytorch/autoregressive_wrapper.py sha256=vSzRXgAGWffRULbr_s8o6yinsDgDyaiuZIS1QNAiGXI 2059
memory_efficient_attention_pytorch/cosine_sim_flash_attention.py sha256=bzddblTy7QeveCHdQ99GsgwjJ8hkVujvCwllMETydW4 6976
memory_efficient_attention_pytorch/flash_attention.py sha256=zw8_W5ln_2VH4vOBswAWfGJ__A3ycUEzJ5ugqQkNV94 7754
memory_efficient_attention_pytorch/memory_efficient_attention.py sha256=uFL6F3m5pd7wdJ6e2bhWNdzKlgqqF--gAyR9KtJTPAY 7158
memory_efficient_attention_pytorch/memory_efficient_cosine_sim_attention.py sha256=XxFkGZcGrJo-8Kms5tCVffYOV_ISOZlpIpk-FdiGAIE 6356
memory_efficient_attention_pytorch/reversible.py sha256=RS-L3RnKKxV-pvoEDkrXaD8NP8aBp4HmfY5-BewDb5Y 4725
memory_efficient_attention_pytorch/transformer.py sha256=9lttnDZrnpjz8jpw7d5lOwSfdKJtBRJNSb8_f-kpRf4 2580
memory_efficient_attention_pytorch-0.1.6.dist-info/LICENSE sha256=xZDkKtpHE2TPCAeqKe1fjdpKernl1YW-d01j_1ltkAU 1066
memory_efficient_attention_pytorch-0.1.6.dist-info/METADATA sha256=egDXs9nt4_VRvDsL8VRKEYyrHgLD3QIBJ8mf0V_AcOc 717
memory_efficient_attention_pytorch-0.1.6.dist-info/WHEEL sha256=pkctZYzUS4AYVn6dJ-7367OJZivF2e8RA9b_ZBjif18 92
memory_efficient_attention_pytorch-0.1.6.dist-info/top_level.txt sha256=qsHVCeWEsnNIL78HjNj6X8Iyb3hDNtiMLkDvrmd-ofU 35
memory_efficient_attention_pytorch-0.1.6.dist-info/RECORD
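The RECORD digests above are not hex: per the wheel format, each one is the urlsafe base64 encoding of the file's SHA-256 hash with trailing `=` padding stripped, and RECORD itself is a CSV of `path,digest,size` rows (RECORD's own row carries no digest, as seen above). A small sketch that computes a RECORD-style digest and parses a hypothetical one-line RECORD (the `pkg/__init__.py` entry is made up for illustration):

```python
import base64
import csv
import hashlib
import io

def record_digest(data: bytes) -> str:
    """RECORD-style digest: urlsafe base64 of SHA-256, with '=' padding stripped."""
    return base64.urlsafe_b64encode(hashlib.sha256(data).digest()).rstrip(b"=").decode()

# Hypothetical RECORD excerpt; real rows are listed above.
sample = b"x = 1\n"
record_text = "pkg/__init__.py,sha256={},{}\r\n".format(record_digest(sample), len(sample))

for path, digest, size in csv.reader(io.StringIO(record_text)):
    algo, _, value = digest.partition("=")  # split "sha256=<b64>" at the first '='
    print(path, size, algo, value == record_digest(sample))
```

Because the padding is stripped, every SHA-256 digest in RECORD is exactly 43 characters long (32 bytes encode to 44 base64 characters, one of which is padding).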

top_level.txt

memory_efficient_attention_pytorch