[WIP] Add GLM example with the Negative Binomial distribution. #392
base: master
Conversation
Nice! Can you explain somewhere why the negative binomial is appropriate for this data?
Sure. I'll expand the description of the example.
I think this is clearer now.
Excellent! I am going to try the code out later this afternoon :-)
Co-authored-by: Mainak Jas <[email protected]>
@geektoni I tried it. Something is off. When I plot the convergence with >>> glm_neg_bino.plot_convergence(), the curve is not monotonically decreasing, but it should be. This could explain why we don't see comparable results.
I saw the same issue in the convergence plots when I tried to reproduce the R example. I had to increase the learning rate to get it to go down monotonically.
Showing Poisson vs. negative binomial fits in our example could be useful. I also like this example: https://data.library.virginia.edu/getting-started-with-negative-binomial-regression-modeling/ — it discusses where the Poisson assumptions are lacking and how the negative binomial can be a better choice.
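To make the "where Poisson assumptions are lacking" point concrete, here is a minimal sketch (not part of the PR; the negative binomial parameters are arbitrary illustration values): Poisson forces the variance to equal the mean, while count data drawn from a negative binomial are overdispersed.

```python
import numpy as np
from scipy import stats

# Draw overdispersed counts from a negative binomial.
# n=5, p=0.3 are arbitrary illustration values.
y = stats.nbinom.rvs(5, 0.3, size=10_000, random_state=0)

# A Poisson model would assume Var(y) == E[y]; here the sample
# variance is several times the sample mean (theoretically ~39 vs ~11.7),
# which is exactly the regime where the negative binomial fits better.
print(y.mean())
print(y.var())
```

If the sample variance of your counts greatly exceeds the sample mean, that is the usual signal to prefer the negative binomial over Poisson.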
That's a bit odd. I can imagine non-monotonic behavior with larger learning rates, but I don't understand why it would happen with smaller learning rates.
@geektoni I pushed a bunch of fixes. However, I noticed that the link function they use is log, which is different from the one we use, so we can't really expect equivalent solutions. I'm wondering if we should wait for #390 to be merged and then try with a log link function (it should be a couple of lines of code). I think it would be a good validation exercise.
This is a great idea too!
* Add convergence graph;
* Add better information to show the program types;
* Add Poisson regression to compare the results;
I guess we could add another example (besides this one) showing how to tweak an existing model to use a different link function. However, I remember that, at first, we tried coding the negative binomial with a [...] I think it is fine if pyglmnet offers a model with a different link function than other libraries. We might not be able to compare everything one-to-one, but as long as the other tests/examples pass I think we can be fairly sure they do the same thing.
I've added the Poisson regression to the example. As usual, it plots the learned betas and the convergence plot.
@geektoni it might be really helpful to reproduce exactly an example from another source (i.e., not pyglmnet). I tried with the log link by rebasing this branch against the [...]
* Make the example less verbose.
Force-pushed from 7169506 to 6274c06.
@jasmainak @pavanramkumar I have added another example taken from the link above. This time it seems to me that we are getting the same results!
Sounds promising, I'm going to look more closely tomorrow :) |
@geektoni sorry for taking so long, it has been a busy week. I have a couple of questions. If needed, you and I can do a quick Skype call in the coming week to push this forward. I would really love to see this in the repo!
########################################################
# Plot convergence information for both negative binomial and Poisson
glm_nb.plot_convergence()
The learning rate seems too high ... this plot really should be monotonically decreasing and reach tol, otherwise you don't get the correct result.
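The reviewer's point can be seen on a toy problem (this is a generic gradient-descent sketch, not pyglmnet's solver): with a small enough step size the loss decreases monotonically toward the tolerance, while a too-large step size overshoots the minimum and the loss grows or oscillates.

```python
import numpy as np

def loss_trace(lr, n_iter=50):
    """Gradient descent on f(w) = w**2, starting from w = 5."""
    w, trace = 5.0, []
    for _ in range(n_iter):
        w -= lr * 2 * w          # gradient of w**2 is 2*w
        trace.append(w ** 2)
    return np.array(trace)

small = loss_trace(0.1)    # update factor 0.8: monotone decrease
large = loss_trace(1.05)   # update factor -1.1: the loss grows each step
print(small[-1] < small[0], large[-1] > large[0])  # True True
```

The same diagnostic applies here: if plot_convergence() is not monotone, lower the learning rate (or raise max_iter) until the loss curve decreases smoothly to tol.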
max_iter=5000,
learning_rate=1e-1)
glm_poisson.fit(Xdsgn, y)
print(glm_poisson.beta0_, glm_poisson.beta_)
So the coefficient seems to match for Poisson (1.73) but not for NB, and the intercept is different. Am I missing something?
pred_nb = np.array(glm_nb.predict([[0], [1]]))
pred_poisson = np.array(glm_poisson.predict([[0], [1]]))
print("")
print("NB Predicted means+std (black/white): {}, {}".format(pred_nb, np.sqrt(pred_nb + (pred_nb ** 2) / 0.20)))
Not sure I follow where this expression comes from, but I guess we can clarify this later. The first step is to make sure we get the same results.
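For what it's worth, the printed expression looks like the NB2 standard deviation, sqrt(mu + mu**2 / theta), with dispersion theta = 0.20 taken from the snippet above (I am assuming that is where the 1/0.20 factor comes from). A minimal sketch with hypothetical predicted means:

```python
import numpy as np

# NB2 parameterization: Var(y) = mu + mu**2 / theta, where theta is
# the dispersion parameter. theta = 0.20 matches the 1/0.20 factor in
# the snippet above; mu values here are hypothetical predicted means.
theta = 0.20
mu = np.array([2.0, 10.0])

var = mu + mu ** 2 / theta   # a Poisson model would give var == mu
print(np.sqrt(var))          # the NB2 standard deviation
```

Under this reading, the extra mu**2 / theta term is exactly the overdispersion the negative binomial adds on top of the Poisson variance.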
This example is related to issue #386.