
adding new layers and supporting multiple pre-configured networks in net config #62

Open
dlacombejr wants to merge 3 commits into master

Conversation

dlacombejr
Contributor

Major changes include:

  1. Support for additional layers such as Batch Normalization and ReLU (a layer-dispatch sketch follows the example below)
  2. Support for multiple pre-configured networks in net_config.json. For example, now net_config.json can look like the following:
{
  "networks": [
    {
      "copies": 1,
      "description": "default network with online and slow train",
      "config": {
        "layers":
        [
          {"filter_shape": [1, 2], "filter_number": 3, "type": "ConvLayer"},
          {"filter_number":10, "type": "EIIE_Dense", "regularizer": "L2", "weight_decay": 5e-9},
          {"type": "EIIE_Output_WithW","regularizer": "L2", "weight_decay": 5e-8}
        ],
        "training":{
          "steps":40000,
          "learning_rate":0.00028,
          "batch_size":109,
          "buffer_biased":5e-5,
          "snap_shot":false,
          "fast_train":false,
          "training_method":"Adam",
          "loss_function":"loss_function6"
        },

        "input":{
          "window_size":31,
          "coin_number":11,
          "global_period":1800,
          "feature_number":3,
          "test_portion":0.08,
          "online":true,
          "start_date":"2018/01/01",
          "end_date":"2018/02/18",
          "volume_average_days":30
        },

        "trading":{
          "trading_consumption":0.0025,
          "rolling_training_steps":85,
          "learning_rate":0.00028,
          "buffer_biased":5e-5
        }
      }
    },
    {
      <additional networks...>
    }
  ]
}

The original net_config.json is still supported.
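
To give an idea of how the new layer types plug in, here is a minimal sketch, assuming a tflearn-based layer dispatch similar to the existing ConvLayer/EIIE_Dense handling (function and structure are illustrative, not the exact PR code):

import tflearn

def apply_layer(network, layer):
    # Illustrative sketch only: dispatch on the "type" field of a layer entry.
    if layer["type"] == "BatchNormalization":
        # normalize the activations of the previous layer
        network = tflearn.layers.normalization.batch_normalization(network)
    elif layer["type"] == "ReLU":
        # stand-alone rectified-linear activation layer
        network = tflearn.activations.relu(network)
    return network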

@dexhunter
Collaborator

@dlacombejr Hi! Thanks for the PR. I don't quite understand why training, input, and trading are included in the network entries of the new config. I think only layers needs to change when testing which network architecture is better.

@dlacombejr
Contributor Author

@dexhunter I guess my intention was really to allow changes to any aspect of the configuration, not just the network architecture. This approach leads to lengthy configuration files, but it gives full access to every part of the configuration (configurations that are undesired can be commented out, because I load them using commentjson, which I've added to requirements.txt). I suppose the configuration could be set up to support iterating over lists of training, input, trading, and layers dictionaries as in a grid search, but this could blow up pretty quickly, and you may only want certain combinations. Even though the current approach is more error-prone (e.g., if I want to change only layers between two configs but training is accidentally different), I think it is still preferable because there is full control. If it makes more sense, we can change the networks key to configs or something.
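
Roughly, the loading works like this (a minimal sketch of the idea; the path, function, and key names are illustrative rather than the exact PR code):

import commentjson

def load_network_configs(path="net_config.json"):
    # commentjson strips comments while parsing, so unwanted entries can be
    # commented out instead of deleted
    with open(path) as f:
        config = commentjson.load(f)
    if "networks" in config:
        # new multi-network format: expand each entry by its "copies" count
        configs = []
        for net in config["networks"]:
            configs.extend([net["config"]] * net.get("copies", 1))
        return configs
    # the original single-network net_config.json is still supported
    return [config]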

@ZhengyaoJiang
Owner

Thanks for your contribution again!

> Adding support for additional layers such as Batch Normalization and ReLU

Yes, this makes sense.

> Support for multiple pre-configured networks in net_config.json. For example, now net_config.json can look like the following:

This implementation conflicts with our automatic hyper-parameter optimization architecture, which is not currently open-sourced but might be released in the future.
My suggestion is to configure the search space in a separate file, or maybe inside generate.py as a temporary grid-search solution.
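
For instance, a temporary grid search driven from generate.py could look roughly like this (an illustrative sketch; the actual generate.py interface and the chosen search space here are assumptions):

import copy
import itertools

# hypothetical search space: (section, key) -> candidate values
SEARCH_SPACE = {
    ("training", "learning_rate"): [0.00028, 0.001],
    ("input", "window_size"): [31, 50],
}

def expand_configs(base_config):
    # yield one full config per combination of search-space values
    keys, value_lists = zip(*SEARCH_SPACE.items())
    for values in itertools.product(*value_lists):
        config = copy.deepcopy(base_config)
        for (section, key), value in zip(keys, values):
            config[section][key] = value
        yield config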

And it would be nice if you could push to the new "dev" branch instead of the master branch.

@sam-moreton

@ZhengyaoJiang, will you be releasing a hyperparameter optimizer? I've seen Bayesian optimizers work better for this task than grid search or random search.
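
For what it's worth, a Bayesian search over a couple of the hyper-parameters above could be sketched with scikit-optimize like this (illustrative only; train_and_evaluate is a hypothetical stand-in for a training run plus a validation or backtest score):

from skopt import gp_minimize
from skopt.space import Integer, Real

def train_and_evaluate(learning_rate, window_size):
    # hypothetical stand-in: train with these settings and return a
    # validation/backtest score (higher is better)
    return 0.0

space = [
    Real(1e-5, 1e-2, prior="log-uniform", name="learning_rate"),
    Integer(20, 60, name="window_size"),
]

def objective(params):
    learning_rate, window_size = params
    return -train_and_evaluate(learning_rate, window_size)  # minimize the negative score

result = gp_minimize(objective, space, n_calls=30, random_state=0)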
