PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)
An unofficial PyTorch implementation of "Efficient Infinite Context Transformers with Infini-attention".
A really simple, unofficial PyTorch implementation of the Infini-attention paper (https://arxiv.org/pdf/2404.07143).
An unofficial PyTorch implementation of the Infini-attention mechanism introduced in the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention". Note that the official code for the paper has not been released yet. If you run into issues, please open a PR with an explanation of the changes made and the reasoning behind them.
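For readers comparing these implementations: the core recurrence they all target is the same. Each input segment is processed with ordinary causal dot-product attention, a compressive memory matrix is queried for long-range context, the memory is updated with the segment's keys and values, and a learned gate mixes the two attention outputs. Below is a minimal single-head sketch of that recurrence, assuming plain PyTorch; the module name `InfiniAttentionHead` and all tensor and parameter names are illustrative assumptions, not code from any of the repositories above.

```python
# Minimal single-head sketch of the Infini-attention recurrence
# (https://arxiv.org/abs/2404.07143). Illustrative only.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class InfiniAttentionHead(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.dim = dim
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.k_proj = nn.Linear(dim, dim, bias=False)
        self.v_proj = nn.Linear(dim, dim, bias=False)
        # Learned scalar gate (beta in the paper) mixing memory vs. local attention.
        self.beta = nn.Parameter(torch.zeros(1))

    @staticmethod
    def _sigma(x: torch.Tensor) -> torch.Tensor:
        # ELU + 1 keeps the kernel feature map positive, as in the paper.
        return F.elu(x) + 1.0

    def forward(self, segments):
        """segments: list of (batch, seg_len, dim) tensors, processed in order."""
        batch = segments[0].shape[0]
        # Compressive memory M and its normalizer z, carried across segments.
        memory = segments[0].new_zeros(batch, self.dim, self.dim)
        z = segments[0].new_zeros(batch, self.dim, 1)
        outputs = []
        for x in segments:
            q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
            # 1) Retrieve from memory: A_mem = sigma(Q) M / (sigma(Q) z).
            sq = self._sigma(q)
            a_mem = (sq @ memory) / (sq @ z + 1e-6)
            # 2) Local causal dot-product attention within the segment.
            scores = q @ k.transpose(-2, -1) / math.sqrt(self.dim)
            mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
            a_dot = torch.softmax(scores.masked_fill(mask, float("-inf")), dim=-1) @ v
            # 3) Write the current segment into memory (linear update rule).
            sk = self._sigma(k)
            memory = memory + sk.transpose(-2, -1) @ v
            z = z + sk.sum(dim=1, keepdim=True).transpose(-2, -1)
            # 4) Gate between long-term (memory) and local attention outputs.
            gate = torch.sigmoid(self.beta)
            outputs.append(gate * a_mem + (1.0 - gate) * a_dot)
        return torch.cat(outputs, dim=1)

# Usage: four 16-token segments attended with memory carried across them.
head = InfiniAttentionHead(dim=64)
segs = [torch.randn(2, 16, 64) for _ in range(4)]
out = head(segs)  # shape (2, 64, 64)
```

The paper also describes a delta-rule variant of the memory update, which subtracts the currently retrievable value before writing; the linear update above omits that for brevity, and the listed repositories may differ in which variant they implement.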