🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Updated
Dec 5, 2024 - Python
A flexible package for multimodal deep learning that combines tabular data with text and images using Wide and Deep models in PyTorch.
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
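The core of the original transformer (Vaswani et al., 2017) is scaled dot-product attention. A minimal NumPy sketch of that building block — illustrative only, not code from the repository above:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                            # weighted sum of value vectors

# Tiny example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

A real implementation adds learned projections, multiple heads, and masking on top of this kernel.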
💁 Awesome Treasure of Transformers: models for Natural Language Processing, with papers, videos, blogs, and official repos, along with Colab notebooks. 🛫☑️
Minimalist neural machine translation (NMT) for educational purposes.
Automatically split your PyTorch models on multiple GPUs for training & inference
Based on the PyTorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
An automated question-answering system over a medical knowledge graph, covering knowledge-graph construction, pipeline-based QA on the graph, and a front end. Includes entity recognition (dictionary + BERT-CRF), entity linking (matching with Sentence-BERT), and intent recognition (question words + a domain-word dictionary).
Minimal implementation of Decision Transformer: Reinforcement Learning via Sequence Modeling, in PyTorch, for MuJoCo control tasks in OpenAI Gym.
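Decision Transformer conditions action prediction on returns-to-go: the sum of rewards remaining from each timestep to the end of the episode. A sketch of that preprocessing step, assuming an undiscounted episodic setting (not code from the repository above):

```python
def returns_to_go(rewards):
    """Compute R_t = sum of rewards[t:] for each timestep of one episode."""
    rtg = [0.0] * len(rewards)
    running = 0.0
    # Walk backwards so each position accumulates all future rewards.
    for t in reversed(range(len(rewards))):
        running += rewards[t]
        rtg[t] = running
    return rtg

print(returns_to_go([1.0, 0.0, 2.0]))  # [3.0, 2.0, 2.0]
```

At training time these targets are tokenized alongside states and actions; at evaluation time the initial return-to-go is set to the desired episode return and decremented by observed rewards.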
HugsVision is an easy-to-use HuggingFace wrapper for state-of-the-art computer vision.
Label data using HuggingFace's transformers and automatically get a prediction service
Implementation of the paper Video Action Transformer Network
This shows how to fine-tune the BERT language model and use PyTorch-Transformers for text classification.
State-of-the-art NLP through transformer models in a modular design and consistent APIs.
A little Python application to auto tag your photos with the power of machine learning.
A better PyTorch data loader capable of custom image operations and image subsets
Instructions for converting a BERT TensorFlow model to work with HuggingFace's pytorch-transformers and spaCy. This walk-through uses DeepPavlov's RuBERT as an example.
Generative Pretrained Transformer 2 (GPT-2) for Language Modeling using the PyTorch-Transformers library.
Utilizing web scraping and state-of-the-art NLP to generate TV show episode summaries.
Determine the polarity of Amazon Fine Food Reviews using ULMFiT, BERT, XLNet, and RoBERTa.