Reverse Dependencies of google-cloud-aiplatform
The following projects have a declared dependency on google-cloud-aiplatform:
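A "declared dependency" means the distribution lists google-cloud-aiplatform in its `Requires-Dist` core-metadata fields. As a minimal sketch of how such a declaration can be checked, the snippet below parses an illustrative METADATA fragment (the sample text and the `declares_dependency` helper are assumptions for illustration, not from any real package):

```python
from email.parser import Parser

# Illustrative METADATA fragment (hypothetical, not from a real package).
SAMPLE_METADATA = """\
Metadata-Version: 2.1
Name: example-project
Requires-Dist: google-cloud-aiplatform>=1.38
Requires-Dist: requests
"""

def declares_dependency(metadata_text: str, dist_name: str) -> bool:
    """Return True if the metadata declares a dependency on dist_name."""
    msg = Parser().parsestr(metadata_text)
    for req in msg.get_all("Requires-Dist") or []:
        # The distribution name is the token before any extras,
        # version specifier, or environment marker.
        name = req.split(";")[0].split("[")[0]
        for sep in (">", "<", "=", "!", "~"):
            name = name.split(sep)[0]
        if name.strip().lower() == dist_name.lower():
            return True
    return False

print(declares_dependency(SAMPLE_METADATA, "google-cloud-aiplatform"))
```

For an installed distribution, the same check can be run against `importlib.metadata.requires(name)`, which returns the raw `Requires-Dist` strings.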
- agent-llm — An Artificial Intelligence Automation Platform. AI instruction management from various providers, with adaptive memory and a versatile plugin system offering many commands, including web browsing. Supports many AI providers and models, with growing support every day.
- ai-workbench — GCP Vertex AI high level SDK
- algorin-cli — Access to GPT-3 and document processing from the command line.
- allms — no summary
- apache-airflow — Programmatically author, schedule and monitor data pipelines
- apache-airflow-providers-google — Provider package apache-airflow-providers-google for Apache Airflow
- apache-beam — Apache Beam SDK for Python
- arize-phoenix — AI Observability and Evaluation
- auto-tensorflow — Build low-code, automated TensorFlow What-If explainable models in just 3 lines of code. Makes deep learning on TensorFlow easy for the masses with its low-code framework, and increases trust in ML models through What-If model explainability.
- autodistill-gemini — Model for use with Autodistill
- autovf — autovf: tuning xgboost with optuna
- axlearn — AXLearn
- biblesearch — Search Bible AI - Integrate Unique Bible App resources with AI tools
- biblesearchai — Search Bible AI - Integrate Unique Bible App resources with AI tools
- block-cascade — Library for model training in multi-cloud environment.
- captur-ml — The internal Captur Machine Learning SDK
- chope-data-haidilao — Featurestore wrapper
- cloud-accelerator-diagnostics — Monitor, debug and profile the jobs running on Cloud accelerators like TPUs and GPUs.
- crfm-helm — Benchmark for language models
- custom-workflow-solutions — Programmatically author, schedule and monitor data pipelines
- cxr-foundation — CXR Foundation: chest x-ray embeddings generation.
- danoliterate — Benchmark of Generative Large Language Models in Danish
- devai-cli — no summary
- distilabel — AI Feedback (AIF) framework
- dlrag-dev — test
- ds-planner — A package for zap platform
- dspy-ai — DSPy
- embedchain — Simplest open source retrieval (RAG) framework
- exponential — A Python package with a built-in web application
- fedml-gcp — A python library for building machine learning models on Google Cloud Platform using a federated data source
- fondant — Fondant - Large-scale data processing made easy and reusable
- gdm-concordia — A library for building a generative model of social interactions.
- genai-apis — GenAI APIs provides unified API callers for the Gemini API, OpenAI API, and Anthropic API.
- gigachain-google-vertexai — An integration package connecting Google VertexAI and LangChain
- gocodeo — A package to generate unit tests for a file
- google-cloud-mlflow — MLflow Google Cloud Vertex AI integration package
- google-cloud-pipeline-components — This SDK enables a set of First Party (Google owned) pipeline components that allow users to take their experience from Vertex AI SDK and other Google Cloud services and create a corresponding pipeline using KFP or Managed Pipelines.
- google-vertex-haystack — no summary
- googleaistudio — Gemini Pro & PaLM 2. A collection of AI tools built on Google Vertex AI APIs, part of the integrated tools developed in the LetMeDoIt AI project.
- grazier — A tool for calling (and calling out to) large language models.
- green-agent — no summary
- hoodat-vertex-components — Re-usable kfp components for hoodat
- idoctorai — Idoctor AI is a Python library that integrates generative artificial intelligence capabilities into Pandas.
- inspect-ai — Framework for large language model evaluations
- kfp-toolbox — The toolbox for kfp (Kubeflow Pipelines SDK)
- langchain-google-vertexai — An integration package connecting Google VertexAI and LangChain
- languru — The general purpose LLM app stacks.
- llama-index-embeddings-vertex — llama-index embeddings vertex integration
- llama-index-llms-vertex — llama-index llms vertex integration
- llama-index-vector-stores-vertexaivectorsearch — llama-index vector_stores Vertex AI Vector Search integration
- llm-vertex — Plugin for LLM adding support for Google Cloud Vertex AI
- llmware — An enterprise-grade LLM-based development framework, tools, and fine-tuned models
- log10-io — Unified LLM data management
- mayan-document-classifier — Document classifier
- milocode — The all-in-one voice SDK
- modelsmith — Get Pydantic models and Python types as LLM responses from Google Vertex AI models.
- nebula-gcp — Nebula tasks and subflows for interacting with Google Cloud Platform.
- openplugin — no summary
- pandasai — Chat with your database (SQL, CSV, pandas, polars, mongodb, noSQL, etc). PandasAI makes data analysis conversational using LLMs (GPT 3.5 / 4, Anthropic, VertexAI) and RAG.
- pano-airflow — Programmatically author, schedule and monitor data pipelines
- pascalnobereit-langchain-google-vertexai — An integration package connecting Google VertexAI and LangChain
- persona_ai — Persona is a distributed AI agent system utilizing Google Vertex AI's large language models such as Gemini-pro, Text-Bison, and Code-Bison. Developed by Applica Software Guru, Persona is engineered for scalable, high-performance, and intelligent operations across various data sets and applications, harnessing Google Vertex AI to deliver insights and automation for AI-driven analytics and decision-making.
- phasellm — Wrappers for common large language models (LLMs) with support for evaluation.
- pillar1 — Official package for Pillar1 company
- pr-agent — CodiumAI PR-Agent aims to help review and handle pull requests efficiently by providing AI feedback and suggestions.
- prefect-gcp — Prefect integrations for interacting with Google Cloud Platform.
- PromptMeteo — Enable the use of LLMs as a conventional ML model
- proxyllm — LLM Proxy to reduce cost and complexity of using multiple LLMs
- prr — prr - command-line LLM prompt runner
- pyllms — Minimal Python library to connect to LLMs (OpenAI, Anthropic, Google Palm2/Vertex, Mistral, Ollama, AI21, Cohere, Aleph-Alpha, HuggingfaceHub), with a built-in model performance benchmark.
- pyrovelocity — A multivariate RNA Velocity model to estimate future cell states with uncertainty using probabilistic modeling with pyro.
- python-jsonllm — LLM please cast to JSON
- redisvl — Python client library and CLI for using Redis as a vector database
- refuel-autolabel — Label, clean and enrich text datasets with LLMs
- scikit-llm — Scikit-LLM: Seamlessly integrate powerful language models like ChatGPT into scikit-learn for enhanced text analysis tasks.
- searchbible — Search Bible AI - Integrate Unique Bible App resources with AI tools
- searchbibleai — Search Bible AI - Integrate Unique Bible App resources with AI tools
- semantic-chunkers — Super advanced chunking methods for AI
- semantic-router — Super fast semantic router for AI decision making
- symposium — Interaction of multiple language models
- text-machina — Text Machina: Seamless Generation of Machine-Generated Text Datasets
- tfx — TensorFlow Extended (TFX) is a TensorFlow-based general-purpose machine learning platform implemented at Google.
- tfx-helper — A helper library for TFX
- the-real-genotools — A collection of tools for genotype quality control and analysis
- unipipe — project_description
- vanna — Generate SQL queries from natural language
- vdf-io — This library uses a universal format for vector datasets to easily export and import data from all vector databases.
- vertex_ai_huggingface_inference_toolkit — 🤗 Hugging Face Inference Toolkit for Google Cloud Vertex AI (similar to SageMaker's Inference Toolkit, but unofficial)
- vertex-deployer — Check, compile, upload, run, and schedule Kubeflow Pipelines on GCP Vertex AI in a standardized manner.
- vertexai — Please run pip install vertexai to use the Vertex SDK.
- vesslflow — VESSLFlow
- vocode — The all-in-one voice SDK
- voice-stream — A streaming library for creating voice bots using LLMs. Connects LLMs to speech recognition and speech synthesis APIs.
- wandb — A CLI and library for interacting with the Weights & Biases API.
- wanna-ml — CLI tool for managing ML projects on Vertex AI
- xmanager — A framework for managing machine learning experiments