
add defaultSettings functions for models #33

Open · 2 of 3 tasks
egillax opened this issue Aug 29, 2022 · 3 comments
Labels
enhancement New feature or request

Comments

@egillax
Collaborator

egillax commented Aug 29, 2022

Since deep learning models can be very complicated, with many hyperparameters, the current settings functions might be a bit overwhelming to use.

I will implement defaultSettings functions with sane default hyperparameters for users who only want to fit one model. These hyperparameters will either be taken from the literature, if the model is implemented from a paper, or found through testing by the package authors.

  • setDefaultResnet
  • setDefaultMLP
  • setDefaultTransformer
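A minimal sketch of the idea behind these functions (in Python for illustration, since the package itself is R; the hyperparameter names and values below are made up for the sketch, not the actual defaults): a search grid like the current setResnet() spawns many combinations, whereas a setDefault* function pins each hyperparameter to a single sane value.

```python
# Hypothetical illustration: a tuning grid vs. a "default settings" function.
# The grid multiplies out into many combinations; the default picks one value
# per hyperparameter. All names and values here are invented for the sketch.
from itertools import product

# A search grid in the style of the current setResnet(): candidate lists.
grid = {
    "num_layers": [2, 4, 6],
    "hidden_dim": [128, 256],
    "dropout": [0.0, 0.2],
}
n_combinations = len(list(product(*grid.values())))  # 3 * 2 * 2 = 12


def set_default_resnet():
    """setDefaultResnet()-style function: one fixed value per hyperparameter."""
    return {"num_layers": 4, "hidden_dim": 256, "dropout": 0.2}


print(n_combinations)        # -> 12
print(set_default_resnet())
```

The point is that training cost scales with the number of grid combinations, while a defaults function always fits exactly one model.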
@egillax egillax added the enhancement New feature or request label Aug 29, 2022
@ChungsooKim
Collaborator

It was quite overwhelming to me as well, because the default setting of the setResnet() function contains a hundred combinations of parameters.
My test results were as follows.

Test server: Intel Xeon CPU E5-2620 v3 (2 CPUs, 6 cores, 24 threads) / RAM: 492 GB

Cohort size: 105580
Covariates: 7512
Outcomes: 1033
Data split: 75 (train) : 25 (test)
Folds: 3
Device: CPU
One epoch took 2–3 min.

The default setting was 100 combinations of parameters with 30 epochs each.
That means it took about 2 min × 30 epochs × 3 folds × 100 parameter sets + 2 min × 30 epochs × 3 (the final training set being about 3 times the size of each fold) + 2 min × 1 test ≈ 303+ hrs (of course, this can be reduced significantly with early stopping).
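The back-of-the-envelope estimate above can be reproduced in a few lines (the ~2 min/epoch figure and the ×3 factor for the final full-data training are the figures from this comment, not measured values):

```python
# Rough runtime estimate for the default setResnet() hyperparameter search,
# using the figures from the comment above (~2 min per epoch on CPU).
min_per_epoch = 2
epochs = 30
folds = 3
param_sets = 100

cv_minutes = min_per_epoch * epochs * folds * param_sets  # cross-validation
final_minutes = min_per_epoch * epochs * 3                # final fit, ~3x a fold
test_minutes = min_per_epoch * 1                          # one test pass

total_hours = (cv_minutes + final_minutes + test_minutes) / 60
print(round(total_hours))  # -> 303
```

Almost all of the cost comes from the cross-validation term, which is why reducing the number of parameter combinations (or stopping early) dominates any other saving.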

If we had appropriate default settings for each algorithm, it would be great for researchers who just want to test the feasibility of DNN algorithms.

@egillax
Collaborator Author

egillax commented Sep 7, 2022

Could you share which hyperparameters were chosen in the end? I could compare them with mine and with the paper I got the model from. If they are all very similar, we could use those as defaults. Some of them could depend on the dataset, though, such as its size or the number of covariates.

@ChungsooKim
Collaborator

Sure! I'm fitting the ResNet model with 50 random parameter sets now; after it is done, I'll share the results here.
