Hugging Face Transformers is an open-source library that aims to open up recent advances in machine learning and natural language processing driven by Transformer architectures and model pretraining. Transformer architectures have made it feasible to build higher-capacity models, and pretraining has made it possible to effectively exploit this capacity across a wide variety of tasks.
The library consists of carefully engineered, state-of-the-art Transformer architectures under a unified API. Backing the library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments.
Source: 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.