don't check max_batch_size for cpu (#298)
yufenglee authored Apr 23, 2024
1 parent 352c1ec commit b1180a6
Showing 1 changed file with 2 additions and 4 deletions.
6 changes: 2 additions & 4 deletions src/models/model.cpp
@@ -559,10 +559,8 @@ void Model::GetMaxBatchSizeFromGeneratorParams(const GeneratorParams& params) {
     }
 
     use_cuda_graph_ = true;
-  } else {
-    if (is_cuda_graph_enabled || max_batch_size_ > 0) {
-      throw std::runtime_error("CUDA graph is not supported on this device");
-    }
+  } else if (is_cuda_graph_enabled) {
+    throw std::runtime_error("CUDA graph is not supported on this device");
   }
 }

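In effect, after this change a non-zero max_batch_size on a non-CUDA (e.g. CPU) device no longer raises an error; only an explicit CUDA graph request does. Below is a minimal standalone sketch of that branch logic under those assumptions; the enum and function names are illustrative, not part of the onnxruntime-genai API.

#include <stdexcept>

// Illustrative sketch only: DeviceType and ValidateCudaGraphRequest are
// hypothetical names used to show the branch logic, not the library's API.
enum class DeviceType { CPU, CUDA };

void ValidateCudaGraphRequest(DeviceType device, bool is_cuda_graph_enabled,
                              int max_batch_size) {
  if (device == DeviceType::CUDA) {
    // CUDA path is unchanged by this commit: max_batch_size is validated
    // and the CUDA graph path is enabled.
    (void)max_batch_size;
  } else if (is_cuda_graph_enabled) {
    // Only an explicit CUDA graph request is rejected on non-CUDA devices;
    // before this commit, max_batch_size > 0 alone also threw here.
    throw std::runtime_error("CUDA graph is not supported on this device");
  }
}

int main() {
  // CPU with a max batch size set but no CUDA graph requested: now accepted.
  ValidateCudaGraphRequest(DeviceType::CPU, /*is_cuda_graph_enabled=*/false,
                           /*max_batch_size=*/4);
  return 0;
}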
