Transformers were introduced in this 2017 paper as a tool for sequence transduction—converting one sequence of symbols to another. The best-known example is translation, as from English to German. They have also been adapted to perform sequence completion—given a starting prompt, carry on in the same vein and style. They have quickly become an indispensable tool for research and product development in natural language processing.
The Python Library
Transformers provides APIs and tools to easily download and train state-of-the-art pretrained models. Using pretrained models can save you the time and resources required to train a model from scratch. These models support common tasks in different modalities, such as:
- NLP - text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.
- Computer vision - image classification, object detection, and segmentation
- Audio - automatic speech recognition and audio classification
- Multimodal - table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.
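As a minimal sketch of how the tasks above are exposed, the library's high-level `pipeline` API downloads a default pretrained checkpoint for a task and runs inference in a couple of lines. This assumes `transformers` and a backend such as PyTorch are installed; the first call downloads model weights.

```python
# Sketch: run a pretrained text-classification (sentiment) model via the
# pipeline API. The default checkpoint is chosen by the library.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers make NLP development much easier.")[0]

# `result` is a dict with a predicted label and a confidence score.
print(result["label"], round(result["score"], 3))
```

The same pattern applies to the other modalities, e.g. `pipeline("image-classification")` or `pipeline("automatic-speech-recognition")`.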
Transformers supports interoperability between PyTorch, TensorFlow, and JAX. This makes it possible to use a different framework at each stage of a model’s lifecycle: training, loading, and inference. Models can also be exported to formats like ONNX and TorchScript for deployment in production environments.
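To make the export step concrete, here is a hedged sketch of tracing a pretrained model to TorchScript; the checkpoint name and example input are illustrative choices, not the only way to do this.

```python
# Sketch: export a pretrained classifier to TorchScript via tracing.
# Assumes `torch` and `transformers` are installed; weights are downloaded
# on first use.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, torchscript=True)
model.eval()

# Trace the model with a sample input to record the computation graph.
inputs = tokenizer("An example sentence.", return_tensors="pt")
traced = torch.jit.trace(model, (inputs["input_ids"], inputs["attention_mask"]))
torch.jit.save(traced, "model_traced.pt")

# The saved artifact can be loaded and run without Python-side model code.
loaded = torch.jit.load("model_traced.pt")
logits = loaded(inputs["input_ids"], inputs["attention_mask"])[0]
print(logits.shape)  # one row of class logits per input sentence
```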
Read the documentation here: https://huggingface.co/docs/transformers/index
Sentence Transformers (SBERT)
Sentence Transformers is a Python module for accessing, using, and training state-of-the-art text and image embedding models. It can be used to compute embeddings using Sentence Transformer models (quickstart) or to calculate similarity scores using Cross-Encoder models (quickstart). This unlocks a wide range of applications, including semantic search, semantic textual similarity, and paraphrase mining.