
efficientnet #946

Open · wants to merge 4 commits into master
Conversation

@sufeidechabei (Contributor)

  1. Performance can't match the original paper (I don't have enough GPUs to train on ImageNet).
  2. It doesn't support hybridization now because MXNet doesn't have 'same' padding.

@zhanghang1989 (Contributor) left a comment

Please add a unit test for the model forward pass.

@sufeidechabei (Contributor, Author)

How do I add it? Can you give me an example, @zhanghang1989?
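For illustration, a forward-pass test in this style might look like the sketch below; the constructor name `EfficientNet` and the output shape are assumptions, not the PR's actual API:

```python
import mxnet as mx

def test_efficientnet_forward():
    # Hypothetical constructor; substitute the model factory this PR defines.
    net = EfficientNet()
    net.initialize()
    x = mx.nd.random.uniform(shape=(2, 3, 224, 224))  # dummy input batch
    y = net(x)
    assert y.shape == (2, 1000)  # 1000 ImageNet classes assumed
```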

@chinakook (Member)

@sufeidechabei You can use the symbol.Pooling op to emulate 'same' padding, as it has the parameter pooling_convention={'full', 'same', 'valid'}.
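For reference, a minimal sketch of the suggested op (shapes chosen only for illustration):

```python
import mxnet as mx

x = mx.sym.Variable('data')
# pooling_convention='same' makes the output spatial size ceil(input / stride),
# mirroring TensorFlow-style 'same' padding for pooling layers.
y = mx.sym.Pooling(x, kernel=(3, 3), stride=(2, 2), pool_type='max',
                   pooling_convention='same')
print(y.infer_shape(data=(1, 32, 112, 112))[1])  # output shapes: [(1, 32, 56, 56)]
```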

@sufeidechabei (Contributor, Author)

It only supports Symbol instances.

@sufeidechabei (Contributor, Author)

@chinakook

@chinakook (Member) commented Sep 22, 2019

@sufeidechabei You can wrap it in a HybridBlock, or you can use a HybridLambda.
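A minimal sketch of such a wrapper (class and parameter names assumed):

```python
import mxnet as mx
from mxnet.gluon import HybridBlock

class SamePool(HybridBlock):
    """Max pooling with TensorFlow-style 'same' output sizes."""
    def __init__(self, kernel=(3, 3), stride=(2, 2), **kwargs):
        super(SamePool, self).__init__(**kwargs)
        self._kernel = kernel
        self._stride = stride

    def hybrid_forward(self, F, x):
        # F is mx.nd before hybridize() and mx.sym after, so no .shape is needed.
        return F.Pooling(x, kernel=self._kernel, stride=self._stride,
                         pool_type='max', pooling_convention='same')

pool = SamePool()
pool.hybridize()
out = pool(mx.nd.ones((1, 32, 112, 112)))  # -> (1, 32, 56, 56)
```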

@sufeidechabei (Contributor, Author)

I used a HybridBlock to wrap it, as you suggested, @chinakook.

@zhreshold (Member)

Like @chinakook said, please try to remove the usage of .shape in the network.

@zhanghang1989 (Contributor)

> Like @chinakook said, please try to remove the usage of .shape in the network.

@sufeidechabei I think you could allow an argument in the init function to fix the input shape.
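A sketch of that idea (all names assumed): fixing the input size at construction time lets the 'same' padding be precomputed, so no .shape call is needed in the forward pass:

```python
import math
from mxnet.gluon import HybridBlock, nn

class SameConv(HybridBlock):
    """Conv2D with TensorFlow-style 'same' padding for a fixed input size."""
    def __init__(self, channels, kernel, stride, input_size, **kwargs):
        super(SameConv, self).__init__(**kwargs)
        # Total padding so that output size == ceil(input_size / stride).
        out_size = math.ceil(input_size / stride)
        pad_total = max((out_size - 1) * stride + kernel - input_size, 0)
        self._pad = (pad_total // 2, pad_total - pad_total // 2)
        with self.name_scope():
            self.conv = nn.Conv2D(channels, kernel, strides=stride, padding=0)

    def hybrid_forward(self, F, x):
        # Asymmetric padding via F.pad keeps the block hybridizable.
        b, e = self._pad
        x = F.pad(x, mode='constant', constant_value=0,
                  pad_width=(0, 0, 0, 0, b, e, b, e))
        return self.conv(x)
```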

```python
input_filters=int(options['i']),
output_filters=int(options['o']),
expand_ratio=int(options['e']),
id_skip=('noskip' not in block_string),
```
Contributor

id_skip is not used?
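For context, in the reference EfficientNet implementations this flag typically gates the residual connection at the end of an MBConv block, roughly (names assumed):

```python
def _apply_skip(self, x, identity):
    # Add the identity shortcut only when the flag is set and shapes match.
    if self.id_skip and self.stride == 1 and self.input_filters == self.output_filters:
        x = x + identity
    return x
```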

@sufeidechabei (Contributor, Author)

Okay, I will fix that.


```python
_add_conv(
    self._se_reduce,
    num_squeezed_channels,
    active=False,
```
Contributor

I believe there should be an activation function in the SE module.

@sufeidechabei (Contributor, Author)

I followed the PyTorch EfficientNet implementation, and it doesn't have an activation function in the SE block.
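For comparison, the standard squeeze-and-excitation design (Hu et al., 2018) places a nonlinearity between the reduce and expand convolutions and a sigmoid gate at the end; the TensorFlow EfficientNet uses swish there. A sketch, with names assumed:

```python
import mxnet as mx
from mxnet.gluon import HybridBlock, nn

class SEBlock(HybridBlock):
    def __init__(self, channels, squeezed_channels, **kwargs):
        super(SEBlock, self).__init__(**kwargs)
        with self.name_scope():
            self.reduce = nn.Conv2D(squeezed_channels, kernel_size=1)
            self.expand = nn.Conv2D(channels, kernel_size=1)

    def hybrid_forward(self, F, x):
        w = F.Pooling(x, kernel=(1, 1), global_pool=True, pool_type='avg')  # squeeze
        w = self.reduce(w)
        w = w * F.sigmoid(w)                      # swish between reduce and expand
        w = self.expand(w)
        return F.broadcast_mul(x, F.sigmoid(w))   # sigmoid-gated excite
```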
