


ahsanmemon/Gradient-Tape


Gradient-Tape

Just started out with this repo. I will try to keep things standalone so you can download the notebooks and run them without any additional dependencies. But that will come later :)

Info

Creating custom training routines in TensorFlow has never been easy, mostly because of the complexity involved in writing the code. With TensorFlow 2.0's eager execution and GradientTape, I find it relatively easy to write models while avoiding the confusing sub-classing APIs that TensorFlow provides. This repository contains some of the code that I wrote to understand and implement models using GradientTape instead of the classical model.compile/model.fit workflow that Keras and TensorFlow 2 provide.
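To illustrate the idea, here is a minimal sketch of a custom training step written with `tf.GradientTape` in place of `model.fit`. The tiny linear model and the random regression data are placeholders chosen for this example, not code from the notebooks in this repo:

```python
import tensorflow as tf

tf.random.set_seed(0)

# Toy model and data (assumptions for illustration only).
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.MeanSquaredError()

x = tf.random.normal((32, 4))
y = tf.random.normal((32, 1))

def train_step(x, y):
    with tf.GradientTape() as tape:
        preds = model(x, training=True)   # forward pass, recorded on the tape
        loss = loss_fn(y, preds)
    # Reverse-mode autodiff through everything recorded above.
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

before = train_step(x, y)
after = train_step(x, y)
```

The point of this style is that the forward pass, loss, and update are ordinary Python you can step through and modify, rather than logic hidden inside `model.fit`.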

Items

- MNIST classification with GradientTape
- Triplet loss and its optimization using GradientTape
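As a sketch of the second item, the standard triplet loss (anchor pulled toward a positive, pushed away from a negative by a margin) can be written as a plain function and optimized through GradientTape. The tiny `Dense` embedding network and the synthetic triplets below are assumptions for illustration, not the repo's actual notebook code:

```python
import tensorflow as tf

tf.random.set_seed(0)

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Squared Euclidean distances between embeddings.
    pos_dist = tf.reduce_sum(tf.square(anchor - positive), axis=-1)
    neg_dist = tf.reduce_sum(tf.square(anchor - negative), axis=-1)
    # Hinge: penalize triplets where the negative is not at least
    # `margin` farther from the anchor than the positive.
    return tf.reduce_mean(tf.maximum(pos_dist - neg_dist + margin, 0.0))

embed = tf.keras.layers.Dense(8, use_bias=False)  # toy embedding network (assumption)
optimizer = tf.keras.optimizers.Adam(1e-2)

a = tf.random.normal((16, 4))
p = a + tf.random.normal((16, 4), stddev=0.1)  # positives near their anchors
n = tf.random.normal((16, 4))                  # negatives drawn independently

with tf.GradientTape() as tape:
    loss = triplet_loss(embed(a), embed(p), embed(n))
grads = tape.gradient(loss, embed.trainable_variables)
optimizer.apply_gradients(zip(grads, embed.trainable_variables))
```

Because the loss is just a function of tensors, GradientTape differentiates it with no custom layer or compiled model required, which is exactly what makes this style convenient for metric-learning losses.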
