Fix closeout error in vllm
ProbablyFaiz committed Oct 2, 2024
1 parent a2713be commit e04f4b2
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion rl/llm/engines/local.py
@@ -392,7 +392,6 @@ def __exit__(self, exc_type, exc_value, traceback):
         del self.vllm
         gc.collect()
         torch.cuda.empty_cache()
-        torch.distributed.destroy_process_group()
         LOGGER.info("VLLM model unloaded.")
 
     def generate(self, prompt: InferenceInput) -> InferenceOutput:
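For context, the deleted line sits inside a context-manager `__exit__` that tears down the engine. Below is a minimal, runnable sketch of that unload pattern; `FakeEngine` and its attributes are hypothetical stand-ins (the real class in `rl/llm/engines/local.py` uses vLLM, `torch.cuda.empty_cache()`, and a `LOGGER`), shown only to illustrate where the removed `torch.distributed.destroy_process_group()` call used to run:

```python
import gc


class FakeEngine:
    """Hypothetical stand-in for the vLLM engine wrapper in local.py."""

    def __init__(self):
        self.model = object()  # placeholder for the loaded vLLM model
        self.unloaded = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Mirror the cleanup order in the diff: drop the model reference,
        # then collect garbage. The commit removed the
        # torch.distributed.destroy_process_group() call from this spot.
        del self.model
        gc.collect()
        self.unloaded = True  # stands in for LOGGER.info("VLLM model unloaded.")


with FakeEngine() as engine:
    pass  # use the engine; cleanup runs on exit

assert engine.unloaded
```

A plausible motivation, consistent with the commit title: calling `destroy_process_group()` unconditionally raises an error when no distributed process group was ever initialized, so dropping (or guarding) the call avoids a spurious failure at close-out.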
