
Argument error in attention layer initialization. #22

Open

fahad92virgo opened this issue Jun 17, 2016 · 1 comment

@fahad92virgo

I am running the provided plot notebook for the attention model. When I run these lines:

import experiments.attention_models.baseline_model
reload(experiments.attention_models.baseline_model)
from experiments.attention_models.baseline_model import get_network

model_path = os.path.join(ROOT, "experiments/attention_models/models/mnist_att_params2.gz")
network = get_network(model_path, disable_reinforce=True)

I get this error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-6-7d813e2daac2> in <module>()
      4 
      5 model_path = os.path.join(ROOT, "experiments/attention_models/models/mnist_att_params2.gz")
----> 6 network = get_network(model_path, disable_reinforce=True)

D:\Python Directory\winPython 2.7\deepy\experiments\attention_models\baseline_model.py in get_network(model, std, disable_reinforce, random_glimpse)
    211     """
    212     network = NeuralClassifier(input_dim=28 * 28)
--> 213     network.stack_layer(AttentionLayer(std=std, disable_reinforce=disable_reinforce, random_glimpse=random_glimpse))
    214     if model and os.path.exists(model):
    215         network.load_params(model)

D:\Python Directory\winPython 2.7\deepy\experiments\attention_models\baseline_model.py in __init__(self, activation, std, disable_reinforce, random_glimpse)
     23         self.gaussian_std = std
     24         #super(AttentionLayer, self).__init__(activation)
---> 25         super(AttentionLayer, self).__init__(10, activation)
     26 
     27     def initialize(self, config, vars, x, input_n, id="UNKNOWN"):

TypeError: __init__() takes at most 2 arguments (3 given)

The error occurs when the AttentionLayer class initializes its parent class NeuralLayer:

super(AttentionLayer, self).__init__(10, activation)

I looked at the NeuralLayer implementation and found that its __init__ indeed takes only one argument besides self:

class NeuralLayer(object):

    def __init__(self, name="unknown"):
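
For reference, here is an untested sketch of the kind of change that would make the call match that signature. The attribute names output_dim and activation, and the name "attention", are my guesses rather than the confirmed deepy 2.0 API, and NeuralLayer is assumed to already be imported as in baseline_model.py:

class AttentionLayer(NeuralLayer):

    def __init__(self, activation, std, disable_reinforce, random_glimpse):
        self.disable_reinforce = disable_reinforce
        self.random_glimpse = random_glimpse
        self.gaussian_std = std
        # The old call was super(AttentionLayer, self).__init__(10, activation);
        # the new base class only accepts a name, so pass just that and keep
        # the remaining settings as plain attributes.
        super(AttentionLayer, self).__init__(name="attention")
        self.output_dim = 10          # assumed attribute name
        self.activation = activation  # assumed attribute name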

Can you please look into this error?

@zomux
Owner

zomux commented Jun 17, 2016

For the attention mechanism, please see https://github.com/zomux/neuralmt.

Sorry, in version 2.0 many major APIs were redesigned, so some of the code for the LM does not work. I will update it later.
