PromptKG

A collection of research model implementations for prompt learning

  • RetroPrompt: retrieval-augmented prompt learning to decouple knowledge from memorization
  • Demo-Tuning: contrastive demonstration tuning for natural language processing
  • RetrievalRE: retrieval-enhanced prompt tuning for relation extraction
  • GenKGC: link prediction as sequence-to-sequence generation for fast inference
  • PromptKGC: data-efficient prompt learning-based knowledge graph completion

News

  • [Model Release] September, 2022: RetroPrompt - Retrieval-augmented prompt learning that adds a retrieval mechanism during input, training, and inference, equipping the model to retrieve related contexts from the training corpus as cues for enhancement (see the sketch after this list).
  • [Model Release] April, 2022: Demo-Tuning - A pluggable, extensible, and efficient approach named contrastive demonstration tuning, which is free of demonstration sampling.
  • [Model Release] April, 2022: RetrievalRE - A retrieval-enhanced prompt tuning method for relation extraction which empowers the model to reference similar instances from the training corpus as cues for inference.
  • [Model Release] January, 2022: GenKGC - A sequence-to-sequence approach for knowledge graph completion.
  • [Model Release] January, 2022: PromptKGC - A prompt learning-based approach for few-shot knowledge graph completion.

Release

***** September, 2022: RetroPrompt release *****

***** April, 2022: Demo-Tuning | RetrievalRE release *****

***** January, 2022: GenKGC | PromptKGC release *****

  • GenKGC (January 31, 2022): GenKGC converts knowledge graph completion into sequence-to-sequence generation with a pre-trained language model, using relation-guided demonstrations and entity-aware hierarchical decoding. It obtains performance better than or comparable to baselines, and achieves faster inference than previous methods based on pre-trained language models. See "From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer" (WWW 2022), and the sketch after this list.
  • PromptKGC (January 31, 2022): A prompt-tuning approach (knowledge collaborative fine-tuning) for low-resource knowledge graph completion, which leverages structured knowledge to construct the initial prompt template and learns the optimal templates, labels, and model parameters through a collaborative fine-tuning algorithm. It obtains state-of-the-art few-shot performance on FB15K-237, WN18RR, and UMLS.
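
As a rough illustration of the GenKGC formulation, the sketch below verbalizes a link-prediction query (head, relation, ?) as a sequence-to-sequence input with a relation-guided demonstration and lets a pre-trained encoder-decoder generate the tail entity. This is a minimal sketch under stated assumptions, not the repository's implementation: the model (facebook/bart-base), the "head | relation | tail" verbalization, and plain beam search (standing in for entity-aware hierarchical decoding) are all illustrative choices.

```python
# A minimal sketch of link prediction as seq2seq generation, in the
# spirit of GenKGC. Illustrative only: model, verbalization, and
# decoding strategy are assumptions, not this repository's code.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Relation-guided demonstration: a known triple with the same relation
# is prepended to the query so the decoder sees an in-context example.
demonstration = "Steve Jobs | founder of | Apple Inc."
query = "Bill Gates | founder of |"
source = f"{demonstration} </s> {query}"

inputs = tokenizer(source, return_tensors="pt")
# Plain beam search here; GenKGC instead uses entity-aware hierarchical
# decoding, constraining beams to valid entity names for faster inference.
outputs = model.generate(**inputs, num_beams=5, max_length=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A fine-tuned model would rank generated candidates against the entity vocabulary; with an off-the-shelf checkpoint, the snippet only demonstrates the input/output format.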

Contact Information

For help or issues using the models, please submit a GitHub issue.