pyllama

View on PyPI · Reverse Dependencies (4)

0.0.9 pyllama-0.0.9-py3-none-any.whl

Wheel Details

Project: pyllama
Version: 0.0.9
Filename: pyllama-0.0.9-py3-none-any.whl
Download: [link]
Size: 51510 bytes
MD5: e81fcc8e3d410ea017e570c3850d113d
SHA256: c7b008f0d3a819cc90de58e533f8a961dc149ab1ae14b140f56a5cb320b92601
Uploaded: 2023-03-23 07:01:37 +0000
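A quick way to confirm that a downloaded copy matches the listed checksums and size, assuming the wheel was saved to the current directory under its original filename:

```python
import hashlib

# Hypothetical local path; adjust to wherever the wheel was downloaded.
wheel_path = "pyllama-0.0.9-py3-none-any.whl"

with open(wheel_path, "rb") as f:
    data = f.read()

print(hashlib.md5(data).hexdigest())     # expected: e81fcc8e3d410ea017e570c3850d113d
print(hashlib.sha256(data).hexdigest())  # expected: c7b008f0d3a819cc90de58e533f8a961dc149ab1ae14b140f56a5cb320b92601
print(len(data))                         # expected: 51510 (bytes)
```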

dist-info

METADATA

Metadata-Version: 2.1
Name: pyllama
Version: 0.0.9
Summary: 🦙 LLaMA: Open and Efficient Foundation Language Models in A Single GPU
Author: Juncong Moo;Meta AI
Author-Email: JuncongMoo[at]gmail.com
Home-Page: https://github.com/juncongmoo/pyllama
Keywords: LLaMA
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Requires-Dist: torch (>=1.12.0)
Requires-Dist: fairscale (>=0.4.13)
Requires-Dist: fire (~=0.5.0)
Requires-Dist: hiq-python (>=1.1.9)
Requires-Dist: sentencepiece (==0.1.97)
Requires-Dist: transformers (>=4.26.0); extra == "full"
Requires-Dist: gptq (>=0.0.2); extra == "full"
Requires-Dist: sentencepiece (>=0.1.97); extra == "full"
Requires-Dist: torch (>=1.12.0); extra == "full"
Requires-Dist: fairscale (>=0.4.13); extra == "full"
Requires-Dist: fire (~=0.5.0); extra == "full"
Requires-Dist: hiq-python (>=1.1.9); extra == "full"
Requires-Dist: sentencepiece (==0.1.97); extra == "full"
Requires-Dist: transformers (>=4.26.0); extra == "quant"
Requires-Dist: gptq (>=0.0.2); extra == "quant"
Requires-Dist: sentencepiece (>=0.1.97); extra == "quant"
Provides-Extra: full
Provides-Extra: quant
Description-Content-Type: text/markdown
License-File: LICENSE
[Description omitted; length: 10092 characters]
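The metadata declares two optional dependency groups, full and quant, which can be requested at install time (e.g. pip install "pyllama[full]"). After installation, the same Requires-Dist and Provides-Extra fields can be read back programmatically; a minimal sketch using the standard-library importlib.metadata (Python 3.8+):

```python
from importlib.metadata import metadata, requires

md = metadata("pyllama")
print(md["Name"], md["Version"])        # pyllama 0.0.9
print(md.get_all("Provides-Extra"))     # ['full', 'quant']

# All requirements, including those gated behind an extra marker.
for req in requires("pyllama") or []:
    print(req)                          # e.g. 'torch (>=1.12.0)', 'gptq (>=0.0.2) ; extra == "full"'
```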

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.37.1)
Root-Is-Purelib: true
Tag: py3-none-any
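The py3-none-any tag marks this as a pure-Python wheel with no ABI or platform constraint, consistent with Root-Is-Purelib: true. As a sketch, the third-party packaging library can expand such a tag and check it against the running interpreter:

```python
from packaging.tags import parse_tag, sys_tags

# parse_tag expands a compressed tag string into individual Tag objects.
wheel_tags = set(parse_tag("py3-none-any"))

# The wheel is installable if any of its tags is supported by this interpreter;
# py3-none-any matches any CPython 3.x environment.
print(any(t in wheel_tags for t in sys_tags()))  # True
```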

RECORD

Path Digest Size
llama/__init__.py sha256=z_X6VtDIpQ6Q_tNwV3_FpOoZYcs5zIL7X1vs2lHWN4w 484
llama/convert_llama.py sha256=cSTMVr4aQ2ot6g_BGhW2hJ_9ZKfMGQfXhxD3EKckMuQ 13542
llama/download.py sha256=OG_WMjD3M1VYe8LbBrW60QtvaGMUEIgBLU57zsnnyR8 1073
llama/download_community.sh sha256=aj5bQOmyG5ode__pPxuJ2xi7jQJohxkMZ5c3cPbFY8I 2042
llama/generation.py sha256=Eg5w4TyvuF-w6E-G-QX1W77129W8jQIFLjJ-8Uc_8Ts 4735
llama/llama_infer.py sha256=W3gR-Nc5P9If9Dsw_IfYsMRCsAzp6v6YbOTaY8Psgmw 2654
llama/llama_multigpu.py sha256=RT2hW1ACtq0iCaIBTUYdKn9E49RX1BN9xl4WA6eF6uU 2582
llama/llama_quant.py sha256=ZlpBlXR8HskUPhuuernYdCTomQE9Iyx5EM3cpWWukeI 15584
llama/model_parallel.py sha256=RIHGkjMfHRFWc69MqJ2NzbH6xpOmZF9fW_TjHqONrTY 8415
llama/model_single.py sha256=DvKBdEbg87rO15p63sNtrQ2AH9ZTnYq05YGWNzvvyVA 7502
llama/tokenizer.py sha256=Eo7U7YwVeOngLHQ7FCWcamCNAZV_3QWCtd9CfBxxLlI 1749
llama/version.py sha256=46Yjk3fz9o8aTN8E95McnzpJcjGzVJmHmQqUZ5mXzfc 22
llama/hf/__init__.py sha256=CNcKm9aQhlV4uGb0KqHijB9lFkG1NUMGPiGTxmelRSY 2183
llama/hf/configuration_llama.py sha256=82vLlsD2MkSyJNY4o8ll_3oym2Ugb7BhyJt-o2FRsCI 4739
llama/hf/modeling_llama.py sha256=G5An4LkMVlqZ8Lnc0kqgDtIX4UHaXDO6QBI1_V6cH8g 38496
llama/hf/tokenization_llama.py sha256=mxgXFU7g_BIHKbeH_MG1Pm4rspf1Q59P_nLnOMDt63E 7987
llama/hf/utils.py sha256=mJvgicKNblW3na7LpASmYsGS6M4dIWTWQ-XloEq53d4 428
pyllama-0.0.9.dist-info/LICENSE sha256=hGDSSng8ArXjH6Oz2u5DyV8sSaStQrrZKNzgH7Bri0g 35149
pyllama-0.0.9.dist-info/METADATA sha256=1jpRhZkk0aiEZ_a7ybf4FhovEc5LAcjnOedKJX7_2r0 11690
pyllama-0.0.9.dist-info/WHEEL sha256=G16H4A3IeoQmnOrYV4ueZGKSjhipXx8zc8nu9FGlvMA 92
pyllama-0.0.9.dist-info/top_level.txt sha256=vXIfhTMeFFl71QrVVrWCJGS0tzlBB5vynlR-iGsCqZ4 6
pyllama-0.0.9.dist-info/RECORD
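Each RECORD entry pairs a path with an urlsafe-base64-encoded SHA-256 digest (with padding stripped) and a size in bytes, following the wheel specification. A sketch of recomputing the digest and size columns for one extracted file, assuming the wheel's contents have been unpacked locally:

```python
import base64
import hashlib

def record_digest(path: str):
    """Return the digest string and byte size for a file, matching the columns listed above."""
    data = open(path, "rb").read()
    b64 = base64.urlsafe_b64encode(hashlib.sha256(data).digest()).rstrip(b"=")
    return "sha256=" + b64.decode("ascii"), len(data)

# e.g. record_digest("llama/version.py") should give
# ('sha256=46Yjk3fz9o8aTN8E95McnzpJcjGzVJmHmQqUZ5mXzfc', 22)
```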

top_level.txt

llama