Using VGG nets #20
Thanks, I forgot to make that one public. Should be good now. There's an example of preprocessing (for VGG_S) here, but your preprocess function looks correct to me. Are you having problems? Adding something like that to the modelzoo files seems like a good idea to me.
I haven't gotten that far yet. I'll try it out, and if it works I'll add a preprocessing function for VGG. Any clue why they use BGR? The model setup is nice and so far very easy to use :)
Apparently it's because Caffe uses OpenCV, which uses BGR. We could swap it in the weights, but I think that could confuse people who looked at the Caffe Model Zoo page.
I have a follow-up question then. Have all of the models in the model zoo here been trained on BGR, or just VGG? Do you think this will remain a standard?
I ran into an issue when trying to unpickle VGG19 normalized https://s3.amazonaws.com/lasagne/recipes/pretrained/imagenet/vgg19_normalized.pkl in Python 3; no problems in Python 2. Is that a compatibility problem with pickle itself?
What issue? What's the output you're getting? Probably best to add a separate ticket, too.
In the MNIST example there is some trickery to make the dataset (pickle file) load properly in Python 3. I guess it's because Python 3 assumes utf-8 encoding unless otherwise instructed: https://github.com/Lasagne/Lasagne/blob/master/examples/mnist.py#L34-L45
Here were my attempts, before I decided to use numpy's save and load in order to get the data into Python 3.

First attempt:

```python
values = pickle.load(open('vgg19_normalized.pkl'))['param values']
```

Output:

```
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 0: invalid start byte
```

Using the MNIST trickery:

```python
values = pickle.load(open('vgg19_normalized.pkl', encoding='latin-1'))['param values']
```

Output:

```
TypeError: 'str' does not support the buffer interface
```

Reading as bytes:

```python
values = pickle.load(open('vgg19_normalized.pkl', 'rb'))['param values']
```

Output:

```
UnicodeDecodeError: 'ascii' codec can't decode byte 0xbc in position 1: ordinal not in range(128)
```

Am I doing something wrong?
Yes, you didn't correctly copy the MNIST trickery. It should be

```python
with open('vgg19_normalized.pkl', 'rb') as f:
    values = pickle.load(f, encoding='latin-1')['param values']
```

i.e., pass the `encoding` argument to `pickle.load`, not to `open`. In the discussion of #21, we've pondered converting the models to numpy's format.
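For what it's worth, that conversion is only a few lines. A minimal sketch (the function names and file paths here are hypothetical, not from the actual repo):

```python
import pickle

import numpy as np


def pkl_to_npz(pkl_path, npz_path):
    """Convert a Python-2-era pickle of 'param values' to numpy's .npz format."""
    with open(pkl_path, 'rb') as f:
        # latin-1 decodes any byte, so Python-2 str objects unpickle cleanly.
        values = pickle.load(f, encoding='latin-1')['param values']
    # savez stores positional arrays under the keys arr_0, arr_1, ...
    np.savez(npz_path, *values)


def load_npz(npz_path):
    """Load the parameter arrays back, in order, under Python 2 or 3."""
    with np.load(npz_path) as data:
        return [data['arr_%d' % i] for i in range(len(data.files))]
```

The `.npz` container sidesteps the str/bytes pickle incompatibility entirely, since the arrays are stored in numpy's own binary format.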
Could/should we provide a layer for converting from RGB (bhwc) to BGR (bchw), similar to https://github.com/nicholas-leonard/dpnn/blob/master/Convert.lua
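The transform such a layer would compute is simple. A NumPy-only sketch (an actual Lasagne layer would express the same indexing in Theano ops inside `get_output_for`):

```python
import numpy as np


def rgb_bhwc_to_bgr_bchw(images):
    """Convert a batch from b x h x w x c RGB order to b x c x h x w BGR order."""
    # Reverse the channel axis (RGB -> BGR), then move channels before h, w.
    return images[..., ::-1].transpose(0, 3, 1, 2)


batch = np.zeros((2, 224, 224, 3), dtype=np.float32)
batch[..., 0] = 1.0  # fill the red channel
out = rgb_bhwc_to_bgr_bchw(batch)
assert out.shape == (2, 3, 224, 224)
assert np.all(out[:, 2] == 1.0)  # red ends up last in BGR order
```

Since both operations are just views in NumPy (and cheap dimshuffles in Theano), the layer would add essentially no overhead.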
@ebenolson any ideas on how we could use a pre-trained ImageNet model to fine-tune a dataset of images with 4 channels? (RGB + NIR) Would using PCA to transform the 4 channels into 3 work? Are there any other ideas?
I doubt it. The PCA components would probably be too different from RGB.
You could try extending the first-layer filter tensor to have 4 input channels, initialize the additional filters randomly with a carefully-chosen scale and then slowly train. If you fear this would destroy existing weights or not give the extra information enough consideration, you could try to add a separate branch and fuse the additional infrared information later. Good luck!
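A rough sketch of the first option (the shapes assume a VGG-style first conv layer, and the scale heuristic for the new slice is just a guess to keep the pretrained responses barely perturbed at first):

```python
import numpy as np


def extend_filters_to_4ch(W, scale=None):
    """Extend conv filters of shape (n_out, 3, h, w) to 4 input channels.

    The pretrained RGB weights are kept as-is; the NIR slice is drawn from
    a small Gaussian so initial activations stay close to the 3-channel net.
    """
    n_out, n_in, h, w = W.shape
    assert n_in == 3
    if scale is None:
        scale = 0.1 * W.std()  # heuristic: much smaller than existing weights
    nir = np.random.normal(0.0, scale, size=(n_out, 1, h, w)).astype(W.dtype)
    return np.concatenate([W, nir], axis=1)


W = np.random.normal(0, 0.01, size=(64, 3, 3, 3)).astype('float32')
W4 = extend_filters_to_4ch(W)
assert W4.shape == (64, 4, 3, 3)
assert np.array_equal(W4[:, :3], W)  # pretrained RGB weights untouched
```

The extended tensor can then be used to initialize a 4-channel first layer before fine-tuning with a small learning rate.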
Hi!
This happens both with and without the
for py3, you need to use:

```python
with open('vgg19_normalized.pkl', 'rb') as f:
    values = pickle.load(f, encoding='latin-1')['param values']
```

just in case, note that `encoding` goes to `pickle.load`, not to `open`.
Thank you, Eben! That solved it.
https://s3.amazonaws.com/lasagne/recipes/pretrained/imagenet/vgg19.pkl
Sorry to ask again, but how do I log in and download the vgg16.pkl and vgg19.pkl models?
They seem to be gone; I also need them to run a third-party network. Does anybody have a copy?
Hi!
Yes, unfortunately it was getting too costly; see #115. The post before links to academictorrents.com; does that work for you? We're happy to have them hosted somewhere else and update the URLs if you know a good solution. (Zenodo?)
Hi!
@priyanshuvarsh Did you find a fix for your problem?
@davidtellez Using that academic torrent link gives an error, as pointed out by @priyanshuvarsh.
Please follow the instructions in the readme. All files are still accessible; you will need an AWS account and a requester-pays client.
I'm trying to use the VGG-16 net with pretrained weights.
The link https://s3.amazonaws.com/lasagne/recipes/pretrained/imagenet/vgg16.pkl does not seem to be public?
@ebenolson: I can download the file if I log in with the information you gave me.
I'm not sure how I should pre-process my data to make the model work. I looked at the preprocessing description in the repo:
I guess I should do something like (not tested):
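A minimal sketch of the usual Caffe-style VGG preprocessing, assuming an h x w x 3 RGB input and the standard ImageNet mean pixel published with the Caffe VGG models (the values below are those published means, not something from this thread):

```python
import numpy as np

# Standard ImageNet mean pixel in BGR order, from the Caffe VGG models.
MEAN_BGR = np.array([103.939, 116.779, 123.68], dtype=np.float32)


def preprocess(im):
    """im: h x w x 3 RGB image -> 1 x 3 x h x w BGR batch, mean-subtracted."""
    im = im.astype(np.float32)[:, :, ::-1]  # RGB -> BGR
    im -= MEAN_BGR                          # subtract mean pixel per channel
    im = im.transpose(2, 0, 1)              # hwc -> chw
    return im[np.newaxis]                   # add batch axis


img = np.zeros((224, 224, 3), dtype=np.uint8)
batch = preprocess(img)
assert batch.shape == (1, 3, 224, 224)
```

Resizing/cropping to 224 x 224 would happen before this step; the function only handles the channel swap, mean subtraction, and layout change.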
Maybe we should add preprocess functions to the modelzoo?