
Continue training #22

Open
romanovzky opened this issue Oct 8, 2024 · 2 comments
Assignees: foolnotion
Labels: enhancement (New feature or request)

Comments

@romanovzky
Hi,

I was testing whether I could fit a SymbolicRegressor up to, say, 1000 generations, see the Pareto front, and then continue training for another 1000 generations. However, it seems that if I do

    # assuming pyoperon's scikit-learn-style estimator
    from pyoperon.sklearn import SymbolicRegressor

    reg = SymbolicRegressor()
    reg.fit(X_train, y_train)

play with reg and then

    reg.fit(X_train, y_train)

again, the reg object ends up the same as before the second fit call. Given that reg already has a Pareto front, shouldn't I be able to continue fitting, à la online learning / batch / partial_fit? I'm trying to brute-force a workaround for the lack of callbacks (see #18).
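
For concreteness, the usage I'm after would look something like this (the warm_start flag below is hypothetical, modeled on scikit-learn's convention; generations and pareto_front_ are just illustrative names):

    # Hypothetical usage, modeled on scikit-learn's warm_start convention;
    # none of this resuming behaviour exists in the current API.
    reg = SymbolicRegressor(generations=1000, warm_start=True)
    reg.fit(X_train, y_train)   # first 1000 generations
    front = reg.pareto_front_   # inspect the Pareto front, play with reg, ...
    reg.fit(X_train, y_train)   # resume for another 1000 generations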

Cheers

@foolnotion
Member

Hi, this looks like an easy improvement, but it will still require some changes to the C++ library. Currently, each call to fit initializes a new C++ algorithm object, runs it, and keeps some stats and results from it (like the Pareto front). Once fit returns, the C++ object no longer exists. However, it should be easy enough to implement a kind of warm-start mechanism.
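
In scikit-learn terms, the intended semantics might look like the sketch below; make_algorithm() and its initial_population parameter are placeholders for whatever the C++ bindings would need to expose, not existing API:

    # Minimal sketch of sklearn-style warm-start semantics, assuming a
    # hypothetical make_algorithm() helper that can seed the initial
    # population; the real change would live in the C++ library.
    class WarmStartableRegressor:
        def __init__(self, generations=1000, warm_start=False):
            self.generations = generations
            self.warm_start = warm_start
            self._population = None  # survives between fit() calls

        def fit(self, X, y):
            if self.warm_start and self._population is not None:
                # resume: seed the new algorithm with the previous population
                algo = make_algorithm(X, y, initial_population=self._population)
            else:
                # current behaviour: every fit() starts from scratch
                algo = make_algorithm(X, y)
            algo.run(self.generations)
            self._population = algo.population      # keep state for the next call
            self.pareto_front_ = algo.pareto_front  # stats survive as they do now
            return self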

@romanovzky
Author

Yes, a warm-start mechanism would be super useful! I'm already thinking about the possibility of using things like Hyperband, which ideally needs a warm-start mechanism; see the sketch below.
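
For instance, a successive-halving loop (the inner loop of Hyperband) is only affordable with warm starts, because survivors resume instead of re-running from generation zero. A rough sketch, reusing the hypothetical warm_start flag from above:

    # Successive-halving sketch (the core of Hyperband). Assumes the
    # hypothetical warm_start flag, so each fit() resumes and runs
    # `generations` more generations; score() is sklearn's default R^2.
    import numpy as np

    def successive_halving(configs, X, y, budget=250, eta=2):
        regs = [SymbolicRegressor(warm_start=True, **c) for c in configs]
        while len(regs) > 1:
            for reg in regs:
                reg.generations = budget  # spend `budget` more generations
                reg.fit(X, y)             # resumes thanks to warm_start
            scores = [reg.score(X, y) for reg in regs]
            keep = np.argsort(scores)[-max(1, len(regs) // eta):]
            regs = [regs[i] for i in keep]  # keep the best 1/eta fraction
            budget *= eta                   # survivors get a bigger budget
        return regs[0]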

@foolnotion foolnotion self-assigned this Oct 13, 2024
@foolnotion foolnotion added the enhancement New feature or request label Oct 13, 2024