Rename the argument of the dense encoder from `layers` to `fc_layers` #3930
base: master
Conversation
```diff
@@ -65,7 +65,7 @@ class DenseEncoder(Encoder):
     def __init__(
         self,
         input_size,
-        layers=None,
+        fc_layers=None,
```
@metamath1 Thank you for proposing this change. Do you think it would potentially cause backward compatibility issues if other users already passed `layers` to the constructor? Thank you. /cc @arnavgarg1
@alexsherstinsky Yes, that is right.
We have a file to add support for backward compatibility here: https://github.com/ludwig-ai/ludwig/blob/ea890d99e72fbc32fad8aae18070da9969ddb422/ludwig/utils/backward_compatibility.py
And an associated test file https://github.com/ludwig-ai/ludwig/blob/ea890d99e72fbc32fad8aae18070da9969ddb422/tests/ludwig/utils/test_backward_compatibility.py to add a test to make sure it works.
@metamath1 Are you able to add backward compatibility support and an associated test for it?
🙇
@metamath1 Actually, I am a bit confused about this PR. What is the intended scope and reason for the change?
I misunderstood the change: this may not require any backward compatibility changes, since you're modifying the argument in the encoder class rather than the schema directly.
Here's the setup for creating multiple Linear layers in a dense encoder:

```python
config = {
    'preprocessing': {
        'split': {
            'type': 'fixed',
            'column': 'split'
        }
    },
    'input_features': [
        {
            'name': 'image',
            'type': 'vector',
            'encoder': {
                'type': 'dense',
                # This part doesn't work: it is ignored, and a layer
                # with an output size of 256 is always created.
                'fc_layers': [
                    {'output_size': 64},
                    {'output_size': 32}
                ]
            }
        }
    ],
    'output_features': [
        {'name': 'label', 'type': 'category'}
    ],
    'trainer': {'epochs': 1, 'batch_size': 64, 'eval_batch_size': 128}
}
```
However, the `fc_layers` setting does not work. This is because the argument to the `DenseEncoder` is named `layers`, but the config dictionary stores the setting as `fc_layers`, so the user's `fc_layers` settings are never passed to the `DenseEncoder`.
Looking at the documentation, it seems that starting from version 0.7, `layers` has been renamed to `fc_layers`. Since the `layers` key is no longer created in the config dictionary, specifying `layers` in the config as in older versions instead of `fc_layers` won't work either.
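This kind of silent failure is easy to reproduce in miniature: when a constructor accepts `**kwargs`, a renamed keyword such as `fc_layers` is swallowed without error and the default is used instead. A hypothetical stand-in class (not Ludwig's actual code):

```python
class Encoder:
    def __init__(self, input_size, layers=None, **kwargs):
        # 'layers' keeps its default; 'fc_layers' lands unnoticed in kwargs.
        self.layers = layers if layers is not None else [{"output_size": 256}]


enc = Encoder(input_size=10, fc_layers=[{"output_size": 64}, {"output_size": 32}])
print(enc.layers)  # prints [{'output_size': 256}]: the user's setting was ignored
```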
This really needs to be fixed.
For retrocompatibility we can have a couple of lines of code that internally rename `layers` to `fc_layers` and warn the user of the change and that the old name is deprecated.
Also, if it was not working anyway, there's not a lot of damage in backward incompatibility.
Finally, I'm surprised that the wrong keyword was not caught by either the config validator or the dereferencing of the kwargs; there should have been a check somewhere.
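The rename-and-warn approach could look roughly like this (an illustrative stand-in class, not Ludwig's actual `DenseEncoder`; the strict unknown-keyword check at the end addresses the missing validation mentioned above):

```python
import warnings


class DenseEncoderShim:
    """Illustrative stand-in, not Ludwig's DenseEncoder."""

    def __init__(self, input_size, fc_layers=None, **kwargs):
        if "layers" in kwargs:
            # Accept the pre-0.7 keyword, but tell the user it is deprecated.
            warnings.warn(
                "'layers' is deprecated; use 'fc_layers' instead.",
                DeprecationWarning,
            )
            if fc_layers is None:
                fc_layers = kwargs.pop("layers")
        if kwargs:
            # Surface any remaining unknown keywords instead of dropping them.
            raise TypeError(f"unexpected keyword arguments: {sorted(kwargs)}")
        self.fc_layers = fc_layers or [{"output_size": 256}]
```

With this shim, `DenseEncoderShim(4, layers=[...])` still works (with a `DeprecationWarning`), while a genuinely unknown keyword raises a `TypeError` instead of being silently ignored.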
Unit Test Results: 4 files ±0, 4 suites ±0, 9m 6s ⏱️ (-29s). Results for commit 274a647. Comparison against base commit ea890d9. This pull request skips 2 tests.
Seems to be important, needs a merge.
Code Pull Requests
Please provide the following:
https://gist.github.com/metamath1/0dc3893e11a887cdd2544372d9e46852