Kashgari is a production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification, including Word2Vec, BERT, and GPT-2 language embeddings.
Entity and relation extraction based on TensorFlow and BERT. A pipeline-style entity and relation extraction system, built as a solution to the information-extraction task of the 2019 Language and Intelligence Challenge. Schema-based Knowledge Extraction (SKE 2019).
Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built using ⚡ Pytorch Lightning and 🤗 Transformers. For access to our API, please email us at [email protected].
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Portuguese pre-trained BERT models
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
A Model for Natural Language Attack on Text Classification and Inference
Abstractive summarisation using BERT as the encoder and a Transformer decoder
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Multiple-Relations-Extraction-Only-Look-Once. Just look at the sentence once and extract the multiple pairs of entities and their corresponding relations. An end-to-end joint multi-relation extraction model, applicable to the information-extraction task at http://lic2019.ccf.org.cn/kg.
A general stock prediction model based on neural networks
Phoneme-Level BERT for Enhanced Prosody of Text-to-Speech with Grapheme Predictions
Text classification with BERT models, designed for industrial use
An easy-to-use interface to fine-tuned BERT models for computing semantic similarity in clinical and web text. That's it.
Can we use explanations to improve hate speech models? Our paper, accepted at AAAI 2021, explores that question.
Universal User Representation Pre-training for Cross-domain Recommendation and User Profiling
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.