Reverse Dependencies of pytorch-pretrained-bert
The following projects have a declared dependency on pytorch-pretrained-bert:
- capreolus — A toolkit for end-to-end neural ad hoc retrieval
- distil-primitives — Distil primitives as a single library
- finbert-embedding — Embeddings from Financial BERT
- finntk — Finnish NLP toolkit
- fitbert — Use BERT to Fill in the Blanks
- imix — multimodal deep learning framework
- mask-predictor — A wrapper of BERT to predict the covered word in a sentence
- menli — MENLI metrics v1
- neuspell — NeuSpell: A Neural Spelling Correction Toolkit
- OpenNIR-XPM — OpenNIR: A Complete Neural Ad-Hoc Ranking Pipeline (Experimaestro version)
- pre-ai-python — Microsoft AI Python Package
- pydata-wrangler — Wrangle messy data into pandas DataFrames, with a special focus on text data and natural language processing
- pytext-nlp — pytorch modeling framework and model zoo for text models
- ravestate — Ravestate is a reactive library for real-time natural language dialog systems.
- sciwing — Modern Scientific Document Processing Framework
- sikufenci — NLP tool for Ancient Chinese word segmentation.
- summ-eval — Toolkit for summarization evaluation
- textcl — Text preprocessing package for use in NLP tasks
- TUPA — Transition-based UCCA Parser
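Each project above advertises the dependency in its packaging metadata, which is what a reverse-dependency index like this one scans. A minimal, hypothetical sketch of such a declaration in a `setup.py` (the project name and version pin below are illustrative, not taken from any package in the list):

```python
# Hypothetical setup.py for a project that depends on pytorch-pretrained-bert.
from setuptools import setup, find_packages

setup(
    name="my-bert-tool",  # illustrative project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        # This line is what makes the project appear in a
        # reverse-dependency listing for pytorch-pretrained-bert.
        "pytorch-pretrained-bert",
    ],
)
```

The same declaration can equally live in a `requirements.txt` line or a `pyproject.toml` `dependencies` entry; the index only cares that the distribution's metadata names `pytorch-pretrained-bert` as a requirement.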