Is your feature request related to a problem? Please describe.
For some model configurations (Poisson GLM with a soft-plus link and Ridge regularization), an optimal stepsize and batch size can be calculated for SVRG according to the theory in Sebbouh et al. 2019.
However, for an unregularized model the strong-convexity hypothesis does not hold. What should we do in that case? And what about Lasso and GroupLasso?
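To make the dependence on strong convexity concrete, here is a minimal sketch. It is not the exact expressions from Sebbouh et al. 2019 (which also give an optimal mini-batch size); it only uses the classic SVRG scalings of stepsize $\sim 1/L$ and inner-loop length $\sim L/\mu$, where $\mu$ is the strong-convexity constant contributed by the Ridge penalty. The function name and constants are illustrative, not an existing API.

```python
import numpy as np

def svrg_hyperparams(l_smooth: float, strong_convexity: float):
    """Illustrative SVRG stepsize and inner-loop length for the smooth,
    strongly convex case (classic-analysis scalings; constants are indicative)."""
    if strong_convexity <= 0.0:
        # Unregularized / Lasso / GroupLasso: mu = 0, so the L/mu prescription is undefined.
        raise ValueError("strong-convexity constant must be positive for these defaults")
    stepsize = 1.0 / (10.0 * l_smooth)                                  # gamma ~ 1/L
    inner_loop_len = int(np.ceil(10.0 * l_smooth / strong_convexity))   # m ~ L/mu
    return stepsize, inner_loop_len

# A Ridge penalty alpha * ||w||^2 contributes 2 * alpha to the strong-convexity
# constant; with alpha = 0 there is nothing to divide by.
print(svrg_hyperparams(l_smooth=5.0, strong_convexity=2 * 0.1))
```

The point of the sketch is that every quantity the theory prescribes is scaled by $1/\mu$, which is exactly what vanishes without Ridge regularization.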
Describe the solution you'd like
Define what to do for problems that are not strongly convex and/or not $L$-smooth.
This relaxes the strong-convexity assumption, adjusts the stepsize, and does not require knowledge of the smoothness constant. It includes a projection of the parameters onto a compact set, but in our case the parameters can live in $\mathbb{R}^k$.
I am not sure whether removing the projection step could make the algorithm unstable.
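For reference, the projection step that such an analysis assumes would look something like the Euclidean projection onto an $\ell_2$ ball sketched below; the radius is a hypothetical bound on the region containing the optimum, not something specified here.

```python
import numpy as np

def project_l2_ball(params: np.ndarray, radius: float) -> np.ndarray:
    """Euclidean projection onto {w : ||w||_2 <= radius}.

    'radius' is a hypothetical bound on where the optimum lives; returning
    'params' unchanged is what removing the projection amounts to when the
    parameters are allowed to range over all of R^k."""
    norm = np.linalg.norm(params)
    if norm <= radius:
        return params
    return params * (radius / norm)

# Example: an iterate outside the ball is rescaled back onto its boundary.
print(project_l2_ball(np.array([3.0, 4.0]), radius=1.0))  # -> [0.6 0.8]
```

One possible middle ground, purely as a suggestion, is to keep the projection with a generously large radius so the compactness assumption formally holds while the step almost never activates in practice.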