Reverse Dependencies of botocore
The following projects have a declared dependency on botocore:
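A "declared dependency" here means the project's packaging metadata names botocore as a requirement. As a minimal sketch (assuming Python 3.9+ and only the standard library's `importlib.metadata`), the same relationship can be checked against locally installed distributions:

```python
# Minimal sketch: list installed distributions whose metadata declares a
# dependency on botocore, mirroring what this reverse-dependency index shows.
# Assumes Python 3.9+; uses only the standard library.
import re
from importlib import metadata

def reverse_dependencies(target: str = "botocore") -> list[str]:
    """Return names of installed distributions that require `target`."""
    dependents = set()
    for dist in metadata.distributions():
        for req in dist.requires or []:
            # Requirement strings look like "botocore>=1.29.0" or
            # "botocore (>=1.29.0); extra == 's3'"; match the leading name.
            m = re.match(r"[A-Za-z0-9._-]+", req)
            if m and m.group(0).lower() == target.lower():
                dependents.add(dist.metadata["Name"])
                break
    return sorted(dependents)

print(reverse_dependencies("botocore"))
```

If boto3 or s3transfer is installed in the environment, they will appear in the output, since both declare botocore as a requirement.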
- qai-hub-models — Models optimized for export to run on device.
- qary — A chatbot that assists rather than manipulates.
- qbiz-airflow-presto — A containerized Presto cluster for AWS.
- qcloud-user — Q-Cloud CLI for users
- qiskit-aws-braket-provider — A provider for qiskit to access quantum devices through AWS Braket
- qontract-reconcile — Collection of tools to reconcile services with their desired state as defined in the app-interface DB.
- querent — The Asynchronous Data Dynamo and Graph Neural Network Catalyst
- question-creation-app-realpython — Question creator
- r2connect — A Python module for integrating AWS S3 with Cloudflare's R2 service, offering a simple interface to create, manage, and synchronize buckets, objects, and data.
- rag-doc-search — This package offers a lightweight and straightforward solution for implementing Retrieval-augmented generation (RAG) functionality with large language models (LLMs).
- ragna-aws — AWS extensions for Ragna
- ralph-malph — Ralph, the ultimate Learning Record Store (and more!) for your learning analytics.
- ras-stac — Create SpatioTemporal Asset Catalog (STAC) objects from HEC-RAS model data.
- rate-limiter-py — Rate-limiter module which leverages DynamoDB to enforce resource limits.
- rcd-dev-kit — Interact with OIP ecosystem.
- rcd-pyutils — A package contains simple utils for accessing database/cloud storage (GCP, S3, AWS redshift, Elasticsearch, Snowflake, MySQL), pandas dataframe quality check, data type and length detector, debug decorator, etc...
- reagan — Package for streamlining credentials, connections, and data flow
- rectvision — A low-code tool to help create your own AI
- redshift-connector — Redshift interface library
- refinery-python-sdk — Official Python SDK for Kern AI refinery.
- refresh-my-ip — Refresh your 'my IP' in AWS security groups
- RegScale-CLI — Command Line Interface (CLI) for bulk processing/loading data into RegScale
- reminderlib — Sample reminder library for AWS dynamodb
- requests-aws-iam-auth — An AWS IAM authentication package for Requests. Supported services: API Gateway v1
- requests-cache — A persistent cache for python requests
- resoto-plugin-aws — Runs collector plugins and sends the result to resotocore.
- resoto-plugin-digitalocean — Resoto DigitalOcean Collector Plugin
- resotodata — Miscellaneous Resoto data.
- retake-pgsync — Postgres to Elasticsearch/OpenSearch sync
- retakesearch-py — Python client for OpenSearch
- ReviewBoardPowerPack — Enhances Review Board with PDF review and diffing, reports and analytics, new source code management services, and more.
- rh-test-grnotebook — Notebook for interacting with models in AIConfig
- rikai — no summary
- roboto — Tools for interacting with roboto.ai
- robusta-cli — no summary
- rohmu — Rohmu is a Python library providing an interface to various cloud storage providers.
- runhouse — Runhouse CLI and Python Package
- s1-cns-cli — SentinelOne CNS CLI is an extension of our vision to shift-left security with SentinelOne CNS.
- s3-as-a-datastore — S3-as-a-datastore is a library that lives on top of botocore and boto3, as a way to use S3 as a key-value datastore instead of a real datastore
- s3-as-a-service — S3-as-a-datastore is a library that lives on top of botocore and boto3, as a way to use S3 as a key-value datastore instead of a real datastore
- s3-cas — S3 Content-Addressable Storage
- s3-client — Sample python script to work with Amazon S3.
- s3-dog-food — A small example package
- s3-folder-backup — Backup Folders to S3
- S3-Inspect — A package to inspect contents of S3 buckets and generate report
- s3-metadata-tagger — A package to add metadata tags to objects saved in s3
- s3-npcmr — A set of use cases of boto3 wrapped into a module
- s3-site-maker — Create a static website in an S3 Bucket
- s3aads — S3-as-a-datastore is a library that lives on top of botocore and boto3, as a way to use S3 as a key-value datastore instead of a real datastore
- S3Adapter — An AWS S3 Python adapter to read, write, and check existence of files in S3 buckets.
- s3bp — Read and write Python objects from/to S3.
- s3dol — s3 (through boto3) with a simple (dict-like or list-like) interface
- s3file — Python file-like proxy for opening s3 files
- s3namic — A Python package for managing AWS S3 buckets
- S3netCDF4 — A library to facilitate the storage of netCDF files on ObjectStores in an efficient manner.
- s3publish — Publish a folder of html to AWS S3
- s3tk — A security toolkit for Amazon S3
- s3transfer — An Amazon S3 Transfer Manager
- s3transfer-meiqia — An Amazon S3 Transfer Manager
- saas-co — no summary
- safe-backup — This program creates a secure backup of your files from a specified directory or an object storage location.
- sagecreator — Package to orchestrate architecture in AWS
- sagemode — Deploy, scale, and monitor your ML models all with one click. Native to AWS.
- sageworks — SageWorks: A Python WorkBench for creating and deploying AWS SageMaker Models
- sahara — Sahara project
- sahara-tests — Sahara tests
- salesdredge — The API for Salesdredge.
- samgenericservices — Classes and functions to use for development of integration software
- sample-helper-aws-appconfig — Sample helper library for AWS AppConfig
- sangreal-odo — Data migration utilities
- satip — Satip provides the functionality necessary for
- sbcommons — Packages shared between several Data related systems in Haypp Group
- sciwing — Modern Scientific Document Processing Framework
- scoutr — Generic full access control API for talking to a NoSQL database
- ScoutSuite — Scout Suite, a multi-cloud security auditing tool
- scraper-util-avliu — no summary
- scrapy-feedstreaming — Based on scrapy.extensions.feedexport.FeedExporter to live stream data
- scrapy-kinesispipeline — Scrapy pipeline to store aggregated items into AWS Kinesis
- scrapy-logexport — Upload scrapy logs to cloud storage
- scrapy-omdena-latam — Web crawling application using Scrapy to extract official policies
- scrapy-s3logstorage — Upload scrapy logs to S3
- SCRIdb — A platform to handle sequencing data submission and project initiation, and to interact with the lab's database: insert metadata and interactively pull reports and views.
- scsims — Scalable, Interpretable Deep Learning for Single-Cell RNA-seq Classification
- sdgym — Benchmark tabular synthetic data generators using a variety of datasets
- sdutilities — This package is intended to implement uniformity across SD Data Science projects.
- sdv — Generate synthetic data for single table, multi table and sequential data
- SecretManagerCredentials — Convenient wrapper for code running on AWS to connect to and fetch credentials from a vault
- secretsmanager-illumidesk — IllumiDesk secretsmanager package
- securesystemslib — A library that provides cryptographic and general-purpose routines for Secure Systems Lab projects at NYU
- seda — A Python toolkit to build Serverless Event-Driven Applications on AWS.
- seer-pas-sdk — SDK for Seer Proteograph Analysis Suite (PAS)
- sentinelhub — Python API for Sentinel Hub
- sherlockml — Python library for interacting with SherlockML.
- shopify-prefect-tasks — no summary
- sidewinder-db — A Python-based Distributed Database
- simple-AWS — Simplified AWS Functions
- simple-queue — A lightweight, ready-to-use SQS worker implementation.
- SimpleDocumentStore — A library to store documents in the local file system and AWS S3
- simpleiot-cli — SimpleIOT command line interface
- simunetcore — Co-simulation package