Repurposing InceptionV3, VGG16, and ResNet50 using bottlenecking

galenballew/transfer-learning

Using Transfer Learning to Capitalize on State of the Art Networks

Repurposing InceptionV3, VGG16, and ResNet50

Read my full write-up with visualizations on my website galenballew.github.io

Or check out the article on Medium.

The Challenge: Many of the most advanced convolutional neural networks are available with pretrained weights. This head start can provide enormous leverage for your next image classification task. This project shows how the bottlenecking technique can be used to dramatically speed up training of the repurposed network.
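The bottlenecking idea can be sketched in a few lines of Keras: run every image through the frozen convolutional base once, cache the resulting "bottleneck features," then train only a small classifier head on those cached features. The sketch below assumes TensorFlow 2.x with `tf.keras`; it uses VGG16 with `weights=None` and random stand-in data purely so it runs offline — in practice you would pass `weights="imagenet"` and your own dataset.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Phase 1: run images through the frozen convolutional base ONCE and
# cache the outputs ("bottleneck features").
# NOTE: weights=None here only to avoid downloading pretrained weights in
# this sketch; for real transfer learning use weights="imagenet".
base = VGG16(weights=None, include_top=False, input_shape=(96, 96, 3))
base.trainable = False

images = np.random.rand(8, 96, 96, 3).astype("float32")  # stand-in data
labels = np.random.randint(0, 2, size=(8,))              # stand-in labels

bottleneck_features = base.predict(images, verbose=0)

# Phase 2: train only a small classifier head on the cached features.
# The expensive convolutional forward pass happened once above, so each
# training epoch now touches only these few dense layers.
top = models.Sequential([
    tf.keras.Input(shape=bottleneck_features.shape[1:]),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
top.compile(optimizer="adam", loss="binary_crossentropy",
            metrics=["accuracy"])
top.fit(bottleneck_features, labels, epochs=1, verbose=0)
```

Swapping in InceptionV3 or ResNet50 only changes the `base` constructor (and the minimum input size); the two-phase extract-then-train pipeline stays the same.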

The Toolkit:

  • TensorFlow
  • scikit-learn
  • Keras
  • numpy
  • D3.js

The Results: Check out my website for the D3 visualization of training accuracy/loss for the different networks.
