Return tuple of loop state and transformed state #382

Closed · wants to merge 3 commits

Conversation


@Ian-GL Ian-GL commented Oct 12, 2022

If I understood #362 correctly, the new output of the transform should be {%Axon.Loop.State{...}, transformed_state}, so this PR implements that change and updates the tests. Should this also apply to the other default transforms? So far I've only changed the trainer, but I guess the evaluator should follow suit. Is that right, @seanmor5?

Closes #362
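
For illustration, here is roughly what the proposed return shape would mean for a caller (a hedged sketch: model and train_data are placeholder variables, and this reflects the behavior proposed in this PR rather than the released Axon API):

    # Hypothetical caller-side view of the proposed change: Axon.Loop.run/4
    # would return both the final %Axon.Loop.State{} and the transformed
    # output (for the default trainer, the trained model state).
    {%Axon.Loop.State{} = loop_state, model_state} =
      model
      |> Axon.Loop.trainer(:mean_squared_error, :adam)
      |> Axon.Loop.run(train_data, %{}, epochs: 5)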

lib/axon/loop.ex (outdated)
@@ -536,7 +536,7 @@ defmodule Axon.Loop do
     # Build loss now so we can use it as a metric
     loss_fn = build_loss_fn(loss)
     {init_fn, step_fn} = train_step(model, loss_fn, optimizer)
-    output_transform = fn state -> state.step_state[:model_state] end
+    output_transform = fn state -> {state, state.step_state[:model_state]} end

Contributor

What I have in mind is keeping this the same: fn state -> state.step_state[:model_state] end

And instead, in run, returning:

{state, output_transform.(state)}

Does that make sense?
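
For reference, a minimal sketch of what that could look like near the end of run (run_loop/4 and its arguments are placeholders for Axon's actual internals, not real functions):

    # Hypothetical sketch: output_transform stays as it is today, and run
    # returns both the final loop state and its transformed output.
    final_state = run_loop(step_fn, data, init_state, opts)  # run_loop/4 is a placeholder
    {final_state, output_transform.(final_state)}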

Contributor Author

Yes, that does make sense. I've made the changes in bd92764. Should specific tests be added? So far I've just updated the existing ones.

@Ian-GL Ian-GL requested a review from seanmor5 October 19, 2022 21:39
@seanmor5
Contributor

@Ian-GL I am still considering the implications of this... it may be that we end up essentially copying the model state twice, which is inefficient. I will revisit this after I finish a few other things for the Axon release.

@seanmor5
Contributor

@Ian-GL I have thought about this more and unfortunately I don't think this is the best route either. I am going to close this PR for now, but will ping you on #362 when I have a better solution!

@seanmor5 seanmor5 closed this Oct 27, 2022
Successfully merging this pull request may close these issues.

Default loop output transforms are too intrusive