Question: How can this be applied to a basic GPT transformer text generator? #11
Comments
FF is very limiting right now, but I think it is something one can work on.
It would be especially beneficial for fine-tuning edge-device models like those in the STM32 model zoo, i.e., TensorFlow Lite and ONNX models for microcontrollers. Last year I was interested in this and wrote to [email protected], a postgrad who made a few sample repos implementing Forward-Forward from the examples in Geoffrey Hinton's paper. His MNIST code worked, but Hadia's work was a bit too complex for me, I think, and I could not get a functioning Forward-Forward example in a GPT-type NN. This would be a great subject of research, though: it could allow for super-low-resource ML and open up analog computation not just for inference but maybe even for training. I see it as a kind of digital/analog biology way to train machines. Meh, I digress.
Agreed, the potential of FF is immense, spanning from edge devices to controllers. Currently, my team and I are researching FF, focusing on computer vision. Could you please direct me to Hadia's work so that I can take a look?
Yes, sorry, here is the repo I experimented with: https://github.com/ghadialhajj/FF_unsupervised
Do you know how to apply this to a GPT-like model? I assumed a from-scratch PyTorch model.
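
For reference, here is a minimal sketch of one way it might look: each transformer block is trained locally with the Forward-Forward goodness objective from Hinton's paper, with no end-to-end backprop across blocks. Everything here (the `FFBlock` class, the `threshold` value, the use of LayerNorm in place of the paper's length normalization) is an illustrative assumption in plain PyTorch, not code from the repo above.

```python
# A hedged sketch of Forward-Forward applied to a transformer block.
# Assumptions: goodness = mean squared activation (as in Hinton's paper),
# each block has its own optimizer, and inputs are detached between blocks
# so learning stays local.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFBlock(nn.Module):
    """One self-attention block trained locally with the FF goodness loss."""
    def __init__(self, d_model=64, n_heads=4, threshold=2.0, lr=1e-3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        # LayerNorm stands in for the length normalization in Hinton's paper,
        # so goodness magnitude does not simply leak into the next block.
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)  # local optimizer

    def forward(self, x):
        h = self.norm1(x + self.attn(x, x, x, need_weights=False)[0])
        return self.norm2(h + self.mlp(h))

    def goodness(self, h):
        # Goodness = mean squared activation per sample; shape (batch,).
        return h.pow(2).mean(dim=(1, 2))

    def train_step(self, x_pos, x_neg):
        # Push goodness above the threshold for positive data and below it
        # for negative data. softplus(t) = log(1 + exp(t)) gives the FF loss.
        g_pos = self.goodness(self.forward(x_pos))
        g_neg = self.goodness(self.forward(x_neg))
        loss = (F.softplus(self.threshold - g_pos) +
                F.softplus(g_neg - self.threshold)).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach outputs so the next block learns without backprop through this one.
        with torch.no_grad():
            return self.forward(x_pos), self.forward(x_neg), loss.item()

# Usage: stack blocks and train them greedily (a real run would train each
# block for many steps before moving on; one step here for brevity).
blocks = [FFBlock() for _ in range(2)]
x_pos = torch.randn(8, 16, 64)   # real token embeddings would go here
x_neg = torch.randn(8, 16, 64)   # "negative" sequences, e.g. corrupted text
for block in blocks:
    x_pos, x_neg, loss = block.train_step(x_pos, x_neg)
    print(f"local FF loss: {loss:.4f}")
```

The open question for a GPT-style generator is what counts as negative data for text; sequences with shuffled or wrong-continuation tokens are one option people have suggested, but as far as I know there is no settled recipe.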