Reverse Dependencies of databricks-sdk
The following projects have a declared dependency on databricks-sdk:
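A "declared dependency" here means the project lists databricks-sdk in its package metadata. As a minimal sketch (the project name and version specifier are hypothetical), such a declaration in a `pyproject.toml` looks like:

```toml
[project]
name = "my-databricks-tool"  # hypothetical project name
version = "0.1.0"
dependencies = [
    "databricks-sdk",  # listing this requirement is what places a project on this page
]
```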
- acryl-datahub — A CLI to work with DataHub metadata
- airflow-tools — no summary
- apache-airflow-providers-databricks — Provider package apache-airflow-providers-databricks for Apache Airflow
- brickflows — Deploy scalable workflows to Databricks using Python
- cdpdev-datahub — A CLI to work with DataHub metadata
- composer — Composer is a PyTorch library that enables you to train neural networks faster, at lower cost, and to higher accuracy.
- dagster-databricks — Package for Databricks-specific Dagster framework op and resource components.
- databricks-azure-ad-sync-provider — Default template for PDM package
- databricks-connect — Databricks Connect Client
- databricks-genai — Interact with the Databricks Generative AI APIs in python
- databricks-genai-inference — Interact with the Databricks Foundation Model API from python
- databricks-labs-blueprint — Common libraries for Databricks Labs
- databricks-labs-lsql — Lightweight stateless SQL execution for Databricks with minimal dependencies
- databricks-labs-pylint — Plugin for PyLint to support Databricks specific code patterns and best practices.
- databricks-labs-remorph — SQL code converter and data reconciliation tool for accelerating data onboarding to Databricks from EDW, CDW and other ETL sources.
- databricks-labs-ucx — UCX - Unity Catalog Migration Toolkit
- databricks-rag-studio — Databricks RAG Studio Library
- databricks-sqlalchemy-oauth — SQLAlchemy OAuth connector to Databricks
- db-az-sync-provider — A PDM package to sync Azure users, roles and service principals to Databricks
- dbt-databricks — The Databricks adapter plugin for dbt
- dbtunnel — Run an app and get a cluster proxy URL for it on Databricks clusters
- dlt-meta — DLT-META Framework
- featurebyte — Python Library for FeatureOps
- gradiently — no summary
- iMapHub — Library created to map two Datasets
- integral_deid — PHI tagging and redaction
- laktory — A DataOps framework for building a lakehouse
- mapGlobaltoLocal — Library created to map two Datasets
- metaphor-connectors — A collection of Python-based 'connectors' that extract metadata from various sources to ingest into the Metaphor app.
- mlrpc — Deploy FastAPI applications on MLFlow
- mlrun — Tracking and config of machine learning runs
- mosaicml — Composer is a PyTorch library that enables you to train neural networks faster, at lower cost, and to higher accuracy.
- mosaicml-streaming — Streaming lets users create PyTorch compatible datasets that can be streamed from cloud-based object stores
- mymaplib-123 — Library created to map two Datasets
- openmetadata-ingestion — Ingestion Framework for OpenMetadata
- pfore-cloud-utilities — Provides utility functions for cloud-based workflows.
- pyjaws — no summary
- quollio-core — Quollio Core
- rtdip-sdk — no summary
- serra — Simplified Data Pipelines
- shipyard-databricks — A local client for connecting and working with Databricks
- shipyard-databricks-sql — A local client for connecting and working with Databricks SQL Warehouses
- spetlr — A python ETL libRary (SPETLR) for Databricks powered by Apache SPark.
- tecton-parallel-retrieval — [private preview] Parallel feature retrieval on Databricks for Tecton
- testlib123 — Library created to map two Datasets
- unstructured — A library that prepares raw documents for downstream ML tasks.