<html><head><style>body {
color: black;
}
</style></head><body><h1 id="multi-label-classification-of-pubmed-articles">Multi-Label-Classification-of-Pubmed-Articles</h1>
<p>Live on Hugging Face Spaces <a href="https://huggingface.co/spaces/owaiskha9654/Multi-Label-Classification-of-Pubmed-Articles">here</a>.</p>
<p>Traditional machine learning models struggle when there is not enough labeled data for the specific task or domain we care about to train a reliable model. Transfer learning deals with this by leveraging existing labeled data from a related task or domain: the knowledge gained while solving the source task is stored and then applied to the problem of interest. In this work, I use transfer learning to fine-tune the BioBERT model on the PubMed multi-label classification dataset.</p>
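<p>As a rough sketch of that setup (the checkpoint name and the label count below are assumptions for illustration, not pinned to this repo's exact configuration), loading BioBERT with a multi-label head via Hugging Face <code>transformers</code> looks like this:</p>
<pre><code>from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "dmis-lab/biobert-base-cased-v1.1"  # a public BioBERT checkpoint
NUM_LABELS = 14  # assumption: one output per top-level MeSH label in the dataset

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # per-label sigmoid + BCEWithLogitsLoss
)

# One independent logit per label, so several labels can fire at once
inputs = tokenizer("Example PubMed abstract text ...", truncation=True, return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, NUM_LABELS)
</code></pre>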
<p>I also fine-tuned <strong>RobertaForSequenceClassification</strong> and <strong>XLNetForSequenceClassification</strong> on the PubMed multi-label dataset for comparison.</p>
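<p>Swapping the backbone is just a change of checkpoint name; this illustrative loop reuses the same multi-label configuration (<code>NUM_LABELS</code> as above):</p>
<pre><code># AutoModelForSequenceClassification resolves the matching class for each
# checkpoint (RobertaForSequenceClassification, XLNetForSequenceClassification).
for name in ["roberta-base", "xlnet-base-cased"]:
    model = AutoModelForSequenceClassification.from_pretrained(
        name,
        num_labels=NUM_LABELS,
        problem_type="multi_label_classification",
    )
</code></pre>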
<p><img src="https://raw.githubusercontent.com/Owaiskhan9654/DigiGene/main/Paper%20Night%20Design.gif"></p>
<p><img src="https://camo.githubusercontent.com/dd842f7b0be57140e68b2ab9cb007992acd131c48284eaf6b1aca758bfea358b/68747470733a2f2f692e696d6775722e636f6d2f52557469567a482e706e67"></p>
<blockquote>
<p>I have integrated Weights &amp; Biases for visualization, artifact logging, and comparison of the different models!</p>
<p>Training logs for the different models are on Weights &amp; Biases: <a href="https://wandb.ai/owaiskhan9515/Multi%20Label%20Classification%20of%20PubMed%20Articles%20(Paper%20Night%20Presentation)">Multi Label Classification of PubMed Articles (Paper Night Presentation)</a></p>
<ul>
<li>To get an API key, create an account on the <a href="https://wandb.ai/site">Weights &amp; Biases website</a>.</li>
<li>Inside Kaggle, store the API key with Kaggle Secrets instead of hard-coding it (see the sketch just after this note).</li>
</ul>
</blockquote>
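<p>A minimal sketch of that Weights &amp; Biases setup inside a Kaggle notebook, assuming the key was saved under the secret name <code>wandb_api_key</code> (the name is illustrative):</p>
<pre><code>import wandb
from kaggle_secrets import UserSecretsClient  # available inside Kaggle notebooks

# Read the key from Kaggle's Add-ons > Secrets instead of hard-coding it
api_key = UserSecretsClient().get_secret("wandb_api_key")
wandb.login(key=api_key)

run = wandb.init(project="Multi Label Classification of PubMed Articles")
# ... training code logs metrics with wandb.log({...}) ...
run.finish()
</code></pre>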
<p><img src="https://raw.githubusercontent.com/Owaiskhan9654/Gene-Sequence-Primer-/main/BioAsq.JPG"></p>
<p>For more information on the dataset attributes, see the Kaggle dataset description <a href="https://www.kaggle.com/datasets/owaiskhan9654/pubmed-multilabel-text-classification">here</a>.</p>
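<p>Since each article can belong to several MeSH root categories at once, the label columns have to be turned into multi-hot float vectors, which is the target format <code>torch.nn.BCEWithLogitsLoss</code> (reference 5) expects. A sketch, assuming hypothetical binary label columns in the Kaggle CSV:</p>
<pre><code>import torch

def to_multi_hot(row, label_cols):
    """Turn one dataset row's binary label columns into a float target vector."""
    return torch.tensor([float(row[c]) for c in label_cols])

loss_fn = torch.nn.BCEWithLogitsLoss()  # independent sigmoid per label, not softmax
# loss = loss_fn(logits, targets) with both tensors shaped (batch, num_labels)
</code></pre>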
<h4 id="kaggle-notebook-and-dataset">To get a full grasp of the steps for working with this dataset, have a look at the dataset and the information presented in the Kaggle notebook (<a href="https://www.kaggle.com/code/owaiskhan9654/multi-label-classification-of-pubmed-articles">link</a>) and the Kaggle dataset (<a href="https://www.kaggle.com/datasets/owaiskhan9654/pubmed-multilabel-text-classification">link</a>).</h4>
<h2 id="references"><p style="background-color:#1a0a36;font-family:newtimeroman;color:#FFF9ED;font-size:150%;text-align:center;border-radius:10px 10px;"> References</p></h2>
<ol>
<li><a href="https://arxiv.org/abs/1706.03762">Attention Is All You Need</a></li>
<li><a href="https://arxiv.org/abs/1810.04805">BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding</a></li>
<li>Google Research BERT: <a href="https://github.com/google-research/bert">github.com/google-research/bert</a></li>
<li>Hugging Face Transformers: <a href="https://github.com/huggingface/transformers">github.com/huggingface/transformers</a></li>
<li><a href="https://pytorch.org/docs/stable/generated/torch.nn.BCEWithLogitsLoss.html#torch.nn.BCEWithLogitsLoss">BCEWithLogitsLoss (PyTorch documentation)</a></li>
<li><a href="https://towardsdatascience.com/transformers-for-multilabel-classification-71a1a0daf5e1">Transformers for Multi-Label Classification made simple, by Ronak Patel</a></li>
</ol>
</body></html>