pyspark-ai

View on PyPI · Reverse Dependencies (1)

0.1.21 pyspark_ai-0.1.21-py3-none-any.whl

Wheel Details

Project: pyspark-ai
Version: 0.1.21
Filename: pyspark_ai-0.1.21-py3-none-any.whl
Download: [link]
Size: 35431 bytes
MD5: ec5df2a3d6a5b08d4e316b53a75db15f
SHA256: da68be0e7209c4561931f67d539095980a4b70162159bc9044788ddc2e465d53
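The digests above can be used to verify an intact download. A minimal sketch, assuming the wheel has already been saved locally under its listed filename (the path is illustrative):

```python
import hashlib

# Hypothetical local path; assumes the wheel was downloaded first.
WHEEL_PATH = "pyspark_ai-0.1.21-py3-none-any.whl"
EXPECTED_SHA256 = "da68be0e7209c4561931f67d539095980a4b70162159bc9044788ddc2e465d53"

def sha256_of(path: str) -> str:
    """Stream the file in chunks so large wheels need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest; prefer SHA256 over the weaker MD5.
# sha256_of(WHEEL_PATH) == EXPECTED_SHA256  -> True for an intact download
```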
Uploaded: 2024-03-11 22:13:51 +0000

dist-info

METADATA

Metadata-Version: 2.1
Name: pyspark-ai
Version: 0.1.21
Summary: English SDK for Apache Spark
Author: Gengliang Wang
Author-Email: gengliang@apache.org
Home-Page: https://github.com/databrickslabs/pyspark-ai
License: Apache-2.0
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Requires-Python: >=3.9,<4.0
Requires-Dist: beautifulsoup4 (<5.0,>=4.12); extra == "ingestion" or extra == "all"
Requires-Dist: faiss-cpu (<2.0,>=1.7); extra == "vector-search" or extra == "all"
Requires-Dist: google-api-python-client (<3.0,>=2.90); extra == "ingestion" or extra == "all"
Requires-Dist: grpcio (>=1.56.0); extra == "spark-connect" or extra == "all"
Requires-Dist: grpcio-status (>=1.56.0); extra == "spark-connect" or extra == "all"
Requires-Dist: langchain (<0.2,>=0.1)
Requires-Dist: langchain-community (<0.1,>=0.0)
Requires-Dist: openai (<2.0,>=1.0)
Requires-Dist: pandas (>=1.0.5); extra == "plot" or extra == "all"
Requires-Dist: plotly (<6.0,>=5.15); extra == "plot" or extra == "all"
Requires-Dist: pyarrow (>=4.0.0); extra == "plot" or extra == "all"
Requires-Dist: pydantic (<2.0,>=1.10)
Requires-Dist: pygments (<3.0,>=2.15)
Requires-Dist: requests (<3.0,>=2.31); extra == "ingestion" or extra == "all"
Requires-Dist: sentence-transformers (<3.0,>=2.2); extra == "vector-search" or extra == "all"
Requires-Dist: tiktoken (<0.5,>=0.4); extra == "ingestion" or extra == "all"
Requires-Dist: torch (!=2.0.1,!=2.1.0,>=2.0.0); extra == "vector-search" or extra == "all"
Provides-Extra: all
Provides-Extra: ingestion
Provides-Extra: plot
Provides-Extra: spark-connect
Provides-Extra: vector-search
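METADATA uses the RFC 822-style `Key: Value` format defined by the core metadata specification, so repeated fields such as `Provides-Extra` can be read with the stdlib email parser. A small sketch over a trimmed copy of the fields above (the real file carries many more):

```python
from email.parser import Parser

# Trimmed excerpt of the METADATA shown above.
META = """\
Metadata-Version: 2.1
Name: pyspark-ai
Version: 0.1.21
Requires-Python: >=3.9,<4.0
Provides-Extra: all
Provides-Extra: ingestion
Provides-Extra: plot
Provides-Extra: spark-connect
Provides-Extra: vector-search
"""

msg = Parser().parsestr(META)
print(msg["Name"], msg["Version"])    # pyspark-ai 0.1.21
print(msg.get_all("Provides-Extra"))  # all five extras, in file order
```

At install time an extra is requested with the usual bracket syntax, e.g. `pip install "pyspark-ai[plot]"`, which activates the `extra == "plot"` markers in the `Requires-Dist` lines above.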
Description-Content-Type: text/markdown
[Description omitted; length: 8211 characters]

WHEEL

Wheel-Version: 1.0
Generator: poetry-core 1.6.1
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
pyspark_ai/__init__.py sha256=FYZ73NEFT8nxpBSs0IFTc9EyC1o2jJJvsWOsJF6AUg0 101
pyspark_ai/ai_utils.py sha256=V9Cp82qZesYZUMExHKEv8xWpEaBPCOZtI6tHGN-HYOA 5006
pyspark_ai/cache.py sha256=-vZNe-OMWjd40CjxeEHECjJeyLsbqcsdN0i68KYCGKs 2740
pyspark_ai/code_logger.py sha256=b6oGsNt7-3Jt1jTT8r0WSUU69qkUIKxXba7hj9xI0yQ 2280
pyspark_ai/file_cache.py sha256=cnGCdQLgFWT3Z9D9pXJRxJ93uVBEEZsQ9C_kUjnYn2c 5240
pyspark_ai/llm_chain_with_cache.py sha256=LPIezVsRZ2qmXZd6aJPLthIgIQZmukfZmTSca9QlUD8 980
pyspark_ai/prompt.py sha256=bVwFY_o0grsT0UcftKG4JiypolZlpYWiowtCx5QaHtA 17668
pyspark_ai/pyspark_ai.py sha256=LleZ6b9hST-CfzLzqadBeol9NajEP0jbo-XlPjF8Y9c 26887
pyspark_ai/python_executor.py sha256=DuLnRV0ycI31ceYYniHGBqCtcRyNguCbq8XpxsTaAt8 3417
pyspark_ai/react_spark_sql_agent.py sha256=LcrHM4-Ad9hzzlmiX-bV-cLi9TxJSRuft1R1dHyTbis 1669
pyspark_ai/search_tool_with_cache.py sha256=yQrsVL7bV5TkxkQ4KR2MFro3VDIlxt5sfVCK6UezKGA 713
pyspark_ai/spark_sql_chain.py sha256=IbrBSDw7YquVslirrp1bNUKLbUxlt2pQAlFCV0b8TAg 2466
pyspark_ai/spark_utils.py sha256=iZExR8eztpH2TYFdlYjfnlQb-3nYqj7Qf5Ru1BRVDBk 327
pyspark_ai/temp_view_utils.py sha256=xGU0qe-V_jPdRRCXepcUPIC_i9IMeEnjMQ1gE_K5ZSg 1048
pyspark_ai/tool.py sha256=u7g56NDcuP7nVuLB8V_ZEUPpUvnYLIkrQb870ECiy1g 9947
pyspark_ai-0.1.21.dist-info/LICENSE sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ 11357
pyspark_ai-0.1.21.dist-info/METADATA sha256=iSM1dmFhFDA8TL-KaUtuTX9-gNgow8uCUEyGSD0gT_Y 10211
pyspark_ai-0.1.21.dist-info/WHEEL sha256=Zb28QaM1gQi8f4VCBhsUklF61CTlNYfs9YAZn-TOGFk 88
pyspark_ai-0.1.21.dist-info/RECORD
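Each digest in RECORD is the SHA-256 of the file, encoded as urlsafe base64 with the `=` padding stripped, per the wheel binary-distribution format; the final RECORD entry carries no digest because the file cannot hash itself. A sketch of computing one such digest (the path and bytes below are illustrative, not taken from the wheel):

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """RECORD-style digest: urlsafe base64 of the SHA-256, '=' padding stripped."""
    raw = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

def record_row(path: str, data: bytes) -> str:
    """One row in the Path/Digest/Size layout shown above."""
    return f"{path} sha256={record_digest(data)} {len(data)}"

# Hypothetical contents standing in for a real file in the wheel:
print(record_row("pyspark_ai/example.py", b"print('hi')\n"))
```

A 32-byte SHA-256 value always encodes to 43 base64 characters once the single padding `=` is removed, which matches the digest lengths listed above.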