
TransformerCPI: Improving compound–protein interaction prediction by sequence-based deep learning with self-attention mechanism and label reversal experiments

This repository contains the source code and data for TransformerCPI.

Setup and dependencies

Dependencies:

  • Python 3.6
  • PyTorch >= 1.2.0
  • NumPy
  • RDKit == 2019.03.3.0
  • pandas
  • gensim >= 3.4.0
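
The environment can be checked with a few version prints (a minimal sketch; it only assumes the packages listed above are importable):

```python
# Print the installed versions of the dependencies listed above.
import sys

import gensim
import numpy
import pandas
import torch
from rdkit import rdBase

print("python :", sys.version.split()[0])   # expected 3.6.x
print("pytorch:", torch.__version__)        # expected >= 1.2.0
print("rdkit  :", rdBase.rdkitVersion)      # expected 2019.03.3
print("gensim :", gensim.__version__)       # expected >= 3.4.0
print("numpy  :", numpy.__version__)
print("pandas :", pandas.__version__)
```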

Data sets

The data sets with train/test splits are provided as a .7z archive in the 'data' directory.
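
The archive can be unpacked with any 7-Zip tool; the sketch below uses the third-party py7zr package (not part of the dependency list above), and the archive name is a placeholder for the file actually shipped in 'data':

```python
# Unpack a provided .7z archive into the data/ directory.
# Requires: pip install py7zr  (the archive name below is a placeholder).
import py7zr

with py7zr.SevenZipFile("data/dataset.7z", mode="r") as archive:
    archive.extractall(path="data")
```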

The test set is constructed specifically for the label reversal experiments: each ligand that appears in the test set occurs in the training set only with the opposite interaction label, so a model cannot score well by memorizing ligand-level label patterns and must also exploit the protein sequence.
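
For intuition only, the sketch below shows one way such a split could be built from a table of (compound, protein, label) triples; the column names and the alternation rule are assumptions for illustration, not the repository's actual preprocessing:

```python
# Illustrative label reversal split: every compound kept in the test set
# appears in the training set only with the opposite interaction label.
import pandas as pd

def label_reversal_split(df):
    """df has columns 'compound', 'protein', 'label' (0 or 1)."""
    train_parts, test_parts = [], []
    for i, (_, group) in enumerate(df.groupby("compound")):
        if set(group["label"]) == {0, 1}:
            # Alternate which label is held out so the test set ends up
            # with both positive and negative examples overall.
            test_label = i % 2
            test_parts.append(group[group["label"] == test_label])
            train_parts.append(group[group["label"] != test_label])
        else:
            # Compounds seen with only one label stay in the training set.
            train_parts.append(group)
    train = pd.concat(train_parts, ignore_index=True)
    test = (pd.concat(test_parts, ignore_index=True)
            if test_parts else pd.DataFrame(columns=df.columns))
    return train, test
```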


Usage

1. Run mol_featurizer.py to generate the inputs for the TransformerCPI model (a rough featurization sketch follows this list).

2. Run main.py to train the TransformerCPI model.
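
As a rough sketch of the kind of inputs the featurization step produces, the snippet below builds RDKit atom features plus an adjacency matrix for the compound, and overlapping 3-gram word2vec embeddings (via gensim) for the protein; the function names, the toy atom feature, and the word2vec file name are assumptions for illustration, not the repository's actual API:

```python
# Sketch of featurization inputs (names and features are illustrative only;
# the real definitions live in mol_featurizer.py).
import numpy as np
from rdkit import Chem
from gensim.models import Word2Vec

def compound_graph(smiles):
    """Compound branch: per-atom features (here just atomic numbers as a
    stand-in) plus the molecular adjacency matrix from RDKit."""
    mol = Chem.MolFromSmiles(smiles)
    atom_features = np.array([[a.GetAtomicNum()] for a in mol.GetAtoms()],
                             dtype=np.float32)
    adjacency = Chem.GetAdjacencyMatrix(mol).astype(np.float32)
    return atom_features, adjacency

def protein_embedding(sequence, w2v_model, n=3):
    """Protein branch: overlapping n-gram 'words' looked up in a pretrained
    gensim word2vec model trained on protein sequences."""
    words = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    return np.array([w2v_model.wv[w] for w in words if w in w2v_model.wv],
                    dtype=np.float32)

# Example usage (placeholder inputs; "word2vec_30.model" is an assumed name):
# w2v = Word2Vec.load("word2vec_30.model")
# atoms, adj = compound_graph("CCO")
# prot = protein_embedding("MTEYKLVVVGAGGVGKSALT", w2v)
```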


Authors

Lifan Chen ([email protected])

Mingyue Zheng ([email protected])

Citation

Lifan Chen, Xiaoqin Tan, Dingyan Wang, Feisheng Zhong, Xiaohong Liu, Tianbiao Yang, Xiaomin Luo, Kaixian Chen, Hualiang Jiang, Mingyue Zheng, TransformerCPI: Improving compound–protein interaction prediction by sequence-based deep learning with self-attention mechanism and label reversal experiments, Bioinformatics, Volume 36, Issue 16, August 2020, Pages 4406–4414, https://doi.org/10.1093/bioinformatics/btaa524
