Showing 8 changed files with 19,701 additions and 19 deletions.
⚡🧠⚡ Welcome to Spyx! ⚡🧠⚡
==========================

![README Art](spyx.png "Title")

Spyx is a compact spiking neural network library built on top of DeepMind's Haiku library.
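At its core, a spiking network layer repeats a simple state update over time. A minimal leaky integrate-and-fire (LIF) step in plain JAX might look like the following (an illustrative sketch of the general technique, not Spyx's actual API):

```python
import jax
import jax.numpy as jnp

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One leaky integrate-and-fire step: decay the membrane potential,
    add the input current, emit a spike wherever the threshold is
    crossed, then soft-reset the neurons that spiked."""
    v = beta * v + x                               # leaky integration
    spikes = (v >= threshold).astype(jnp.float32)  # hard threshold
    v = v - spikes * threshold                     # soft reset
    return v, spikes

# Run a 3-neuron layer for three timesteps with jax.lax.scan.
inputs = jnp.array([[0.6, 1.2, 0.0],
                    [0.6, 0.0, 0.0],
                    [0.6, 0.0, 0.0]])
v0 = jnp.zeros(3)
_, spike_train = jax.lax.scan(lambda v, x: lif_step(v, x), v0, inputs)
print(spike_train)
```

Because the hard threshold is non-differentiable, SNN libraries train such models with surrogate gradients that smooth the spike function during the backward pass.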

The goal of Spyx is to provide similar capabilities to SNNTorch for the JAX ecosystem, opening up the possibility of incorporating SNNs into a number of GPU-accelerated reinforcement learning environments. Additionally, JAX has become home to several libraries for neuroevolution, and the aim is for Spyx to provide a common framework for comparing modern neuroevolution algorithms with surrogate gradient and ANN2SNN conversion techniques.

The future aim for Spyx is to include tools for building and training spiking phasor networks, as well as an interface for exporting models to the emerging Neuromorphic Intermediate Representation for deployment on efficient hardware.

Installation
============

As with other libraries built on top of JAX, you need to install JAX with GPU support if you want to get the full benefit of this library. Directions for installing JAX with GPU support can be found at the following: https://github.com/google/jax#installation

Additionally, data loading depends on Tonic, a library for neuromorphic datasets. You will have to install it separately to avoid dependency headaches between JAX and PyTorch.

https://tonic.readthedocs.io/en/latest/getting_started/install.html
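A typical install sequence might look like the following (the exact JAX extra depends on your CUDA setup, so treat these commands as an assumption and check the linked pages for the ones matching your system):

```shell
# Install JAX with NVIDIA GPU support; the right extra for your CUDA
# version is listed at https://github.com/google/jax#installation
pip install -U "jax[cuda12]"

# Install Tonic separately for neuromorphic dataset loading
pip install tonic

# Then install Spyx itself
pip install spyx
```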

Hardware Requirements:
======================

Spyx achieves extremely high performance by keeping the entire dataset in the GPU's vRAM; as such, a decent amount of memory on both the CPU and GPU is needed to handle dataset loading and training. For smaller networks of only several hundred thousand parameters, training can be comfortably executed even on laptop GPUs with only 6GB of vRAM. For large SNNs or for neuroevolution, a higher-memory card is recommended.

Since Spyx is developed on top of the current JAX version, it does not work on Google Colab's TPUs, which use an older version. Cloud TPU support will be tested in the near future. Support for GraphCore's IPUs could be possible based on their fork of JAX but has not been explored.
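The resident-dataset strategy described above can be sketched with plain JAX: move the arrays onto the device once, then take batches as cheap on-device slices (the dataset below is a made-up toy array, not a real benchmark):

```python
import numpy as np
import jax

# A toy "dataset": 1024 samples of 32-step spike trains over 128 channels.
host_data = (np.random.rand(1024, 32, 128) < 0.1).astype(np.float32)

# Move the whole dataset onto the default device (the GPU when present)
# once, so every training step reads from device memory instead of
# paying a host-to-device transfer per batch.
device_data = jax.device_put(host_data)

# Batches are then just on-device slices:
batch = jax.lax.dynamic_slice_in_dim(device_data, 0, 256, axis=0)
print(batch.shape)
```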

Why use Spyx?
=============

Other frameworks such as SNNTorch and Norse offer a nice range of features, such as training with adjoint gradients or support for IPUs, in addition to their wonderful tutorials. Spyx is designed to maximize performance by achieving maximum GPU utilization, allowing networks to be trained for hundreds of epochs at incredible speed.
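Much of that speed comes from JAX's just-in-time compilation rather than anything Spyx-specific: an entire update can be traced once and executed as fused device code. A generic sketch of a jit-compiled training step (a toy loss for illustration, not Spyx's actual training loop):

```python
import jax
import jax.numpy as jnp

def loss_fn(w, x, y):
    # Toy loss: mean squared error of a linear readout.
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

@jax.jit  # the whole gradient step compiles to fused device code
def train_step(w, x, y, lr=0.05):
    grads = jax.grad(loss_fn)(w, x, y)
    return w - lr * grads

w = jnp.zeros((16, 10))
x = jnp.ones((32, 16))   # stand-in for a batch of spike-count features
y = jnp.ones((32, 10))
for _ in range(5):
    w = train_step(w, x, y)
print(float(loss_fn(w, x, y)))
```

After the first call traces and compiles `train_step`, subsequent calls reuse the compiled program, which is what keeps the GPU busy across hundreds of epochs.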
Introduction
============

Intro coming soon!

In the meantime, check out the examples folder on the GitHub page to see an example workflow.
Quickstart
==========

Check out the examples folder on GitHub for now; a quickstart is coming soon.