
Make use of prefilled cache in beam_search when predict_batch_with_aux.prompt_with_targets is set to True #1388

Closed

Conversation

copybara-service[bot]


Without this change, targets that are actually prompts are still decoded autoregressively just to build the cache, and the resulting logits are discarded.

With this change, and with EncoderDecoder.predict_batch_with_aux.prompt_with_targets set to True, we instead skip autoregressive decoding of the given prompts altogether by prefilling the cache, speeding up the decode.
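To illustrate the idea (this is a hedged toy sketch, not the actual t5x beam_search code: the single-head attention, weight names `Wk`/`Wv`, and helper names are all hypothetical), the two strategies below build the same key/value cache for a prompt, but the autoregressive version takes one forward pass per prompt token while the prefill version does it in a single teacher-forced pass:

```python
# Toy illustration of cache prefilling vs. per-token AR cache building.
# All names here (Wk, Wv, ar_prefill, one_shot_prefill) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
D = 4                                   # toy model dimension
Wk = rng.normal(size=(D, D))            # toy key projection
Wv = rng.normal(size=(D, D))            # toy value projection

def ar_prefill(prompt_embs):
    """Build the KV cache one token at a time, as a decode loop would."""
    keys, values, steps = [], [], 0
    for x in prompt_embs:               # one forward pass per prompt token;
        keys.append(x @ Wk)             # each pass would also compute logits
        values.append(x @ Wv)           # that are simply thrown away
        steps += 1
    return np.stack(keys), np.stack(values), steps

def one_shot_prefill(prompt_embs):
    """Build the same KV cache in a single teacher-forced forward pass."""
    return prompt_embs @ Wk, prompt_embs @ Wv, 1

prompt = rng.normal(size=(6, D))        # 6 prompt-token embeddings
k1, v1, s1 = ar_prefill(prompt)
k2, v2, s2 = one_shot_prefill(prompt)

assert np.allclose(k1, k2) and np.allclose(v1, v2)  # identical cache state
print(s1, s2)                           # 6 passes vs. 1 pass
```

The cache contents are identical either way; the saving is purely in the number of decoder forward passes spent before real generation starts, which is what makes skipping the AR pass over the prompt a pure speedup.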

copybara-service bot force-pushed the test_559256044 branch 3 times, most recently from a710dea to 864e95e on October 3, 2023 16:34
copybara-service bot force-pushed the test_559256044 branch 2 times, most recently from 340b634 to de7577e on October 4, 2023 21:29
copybara-service bot closed this on October 4, 2023
copybara-service bot deleted the test_559256044 branch on October 4, 2023 21:43
0 participants