Edit hyperparameter tuning vignette #230

Closed
NLesniak opened this issue Nov 16, 2020 · 3 comments · Fixed by #232
Labels: documentation (Improvements or additions to documentation)

Comments

@NLesniak (Collaborator) commented Nov 16, 2020

  • Add a definition of "hyperparameter" to the intro paragraph (lines 18-19)
  • When describing that alpha can range between 0 and 1, it's unclear what the following "inclusive" means (line 74)
  • It would be helpful to explicitly state how the model picks the final hyperparameter value, and to use that to explain how we know which value is best across multiple splits (line 114); see the sketch after this list
  • In the SVM hyperparameters, could you explain more fully what sigma is? "defines how far the influence of a single training example reaches" is not clear (line 208)
  • In xgboost, the hyperparameters are not defined (line 219), and the wording "need to be appropriate in relation" is confusing (line 220)
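
For the third point, a minimal sketch of how the final value is chosen, assuming the vignette tunes its models through caret (as mikropml does); the dataset and tuning grid below are illustrative placeholders, not taken from the vignette. For each candidate hyperparameter combination, caret averages the performance metric across the cross-validation folds and keeps the combination with the best mean:

```r
# Hedged sketch: hyperparameter selection via caret's cross-validation.
# The data and grid values are illustrative, not from the vignette.
library(caret)

data(iris)
df <- iris[iris$Species != "setosa", ]  # reduce to a binary outcome
df$Species <- droplevels(df$Species)

# Candidate hyperparameters; alpha ranges over [0, 1] *inclusive*
# (alpha = 0 is ridge, alpha = 1 is lasso, endpoints allowed).
grid <- expand.grid(alpha = c(0, 0.5, 1),
                    lambda = 10^seq(-3, 0, length.out = 4))

set.seed(2020)
fit <- train(Species ~ ., data = df,
             method = "glmnet",
             trControl = trainControl(method = "cv", number = 5),
             tuneGrid = grid)

fit$results   # mean performance (Accuracy) across folds per combination
fit$bestTune  # the combination with the best mean, i.e. the "final" values
```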
@kelly-sovacool added the documentation label Nov 16, 2020
@BTopcuoglu (Collaborator)

I'll edit it to have one-sentence explanations for each hyperparameter, but then refer people to ML courses/papers for more detail, like @kelly-sovacool suggested. Do you think that's reasonable, @zenalapp?

@zenalapp (Collaborator)

Sounds good to me!

@BTopcuoglu (Collaborator)

I'm done with my edits. We can close this once we merge #232.
