Implementation of the Neuromodulated Transformer (NeMoT), a new transformer architecture that entwines neuromodulation with the transformer architecture.

Strong-AI-Lab/Neuromodulated-Transformer

The Neuromodulated Transformer (NeMoT)

This repository contains code for an extension of the Transformer that integrates neuromodulation. The goal is to explore neuromodulation and its potential impact on generalisation in question answering (QA). The architecture is implemented in TensorFlow and can be found in the models folder.
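The repository's actual layer definitions live in the models folder; as a rough conceptual sketch only (the class and function names below are hypothetical, not this repo's API), neuromodulation can be pictured as a small side network that produces per-dimension gates scaling the activations of the main transformer pathway. A minimal numpy illustration, assuming sigmoid gating:

```python
import numpy as np

class NeuromodulationGate:
    """Hypothetical sketch: a small gating network whose sigmoid output
    modulates (element-wise scales) activations from the main pathway."""

    def __init__(self, d_model, seed=0):
        rng = np.random.default_rng(seed)
        # Learned parameters of the gating network (randomly initialised here).
        self.w = rng.normal(0.0, 0.02, size=(d_model, d_model))
        self.b = np.zeros(d_model)

    def __call__(self, h):
        # h: (seq_len, d_model) activations from a transformer sub-layer.
        gate = 1.0 / (1.0 + np.exp(-(h @ self.w + self.b)))  # values in (0, 1)
        return h * gate  # element-wise neuromodulatory scaling

h = np.random.default_rng(1).normal(size=(4, 8))
out = NeuromodulationGate(8)(h)
print(out.shape)  # (4, 8)
```

Because the gate lies in (0, 1), the modulated activations never exceed the originals in magnitude; the real model learns the gating parameters jointly with the rest of the network.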

Datasets

Code to load all datasets is found in the load_datasets folder. We use the Colossal Clean Crawled Corpus (C4) to pre-train our model; LAMBADA, WikiText, and PTB to test the pre-trained model's language-modelling capabilities; and ARC, BoolQ, CommonsenseQA, DROP, MCTest, NarrativeQA, OBQA, PIQA, Quoref, ROPES, SIQA, WG, RACE, and SQuADv2 to measure QA performance.
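The real loaders in load_datasets handle each dataset's own format; purely as an illustration of the kind of preprocessing involved (the packing scheme, special tokens, and function names below are hypothetical, not taken from this repo), a QA example is typically flattened into a single token sequence before being fed to the model:

```python
def whitespace_tokenizer(text):
    # Stand-in tokenizer for illustration; real loaders would use a
    # subword vocabulary rather than whitespace splitting.
    return text.split()

def format_qa_example(question, context, tokenizer, cls="<cls>", sep="<sep>"):
    # Hypothetical packing scheme: <cls> question <sep> context
    return tokenizer(f"{cls} {question} {sep} {context}")

tokens = format_qa_example(
    "Is the sky blue?",
    "The sky appears blue due to Rayleigh scattering.",
    whitespace_tokenizer,
)
print(tokens)
```

The same flattened-sequence idea applies across the QA datasets listed above, with dataset-specific handling for answer options and spans.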

Training

All training files can be found in the training folder.
