Only store channel if use_cached_channel is true (#387)
The gRPC client exposes a parameter, use_cached_channel, that disables reusing a
cached channel. Setting it to false, however, does not stop the channel (and
mock) from being inserted into the internal channel cache. Once the channel
sits in that map, it is only destroyed when it is replaced by another channel.
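To make the lifetime issue concrete, here is a minimal, self-contained sketch
(not the library's actual code) of how a std::shared_ptr held in a map keeps
its pointee alive until the entry is overwritten:

#include <iostream>
#include <map>
#include <memory>
#include <string>

// Stand-in for grpc::Channel; prints when the connection would be torn down.
struct Channel {
  ~Channel() { std::cout << "channel destroyed\n"; }
};

std::map<std::string, std::shared_ptr<Channel>> channel_cache;

int main()
{
  {
    auto channel = std::make_shared<Channel>();
    channel_cache["localhost:8001"] = channel;  // cache now co-owns the channel
  }
  // The client-side reference is gone here, but nothing was destroyed:
  // the cache still holds a reference.
  std::cout << "replacing cached entry\n";
  channel_cache["localhost:8001"] = std::make_shared<Channel>();
  // Only now is the original channel destroyed.
  return 0;
}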

In our case we want the channel (and associated connection) to be closed as
soon as the client using it is destroyed, which means we don't want to reuse
the channel at all. I believe this is also consistent with the description of
use_cached_channel:

If false, a new channel is created for each new client instance. When true,
re-use old channels from cache for new client instances. The default value is
true.
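For illustration, the caller-side usage this enables might look as follows.
This is a sketch only: it assumes the InferenceServerGrpcClient::Create
overload that takes use_cached_channel as its trailing parameter, and the
other argument values shown are assumptions, not taken from this commit.

#include <memory>

#include "grpc_client.h"

namespace tc = triton::client;

int main()
{
  {
    std::unique_ptr<tc::InferenceServerGrpcClient> client;
    tc::Error err = tc::InferenceServerGrpcClient::Create(
        &client, "localhost:8001", /*verbose=*/false, /*use_ssl=*/false,
        tc::SslOptions(), tc::KeepAliveOptions(),
        /*use_cached_channel=*/false);
    if (!err.IsOk()) {
      return 1;
    }
    // ... issue requests ...
  }
  // With this fix, the channel (and its connection) is closed here, when
  // the client goes out of scope, because it was never cached.
  return 0;
}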
HennerM authored and mc-nv committed Apr 18, 2024
1 parent 169afc2 commit a771e2c
15 changes: 9 additions & 6 deletions src/c++/library/grpc_client.cc
@@ -135,12 +135,15 @@ GetStub(
       grpc::CreateCustomChannel(url, credentials, arguments);
   std::shared_ptr<inference::GRPCInferenceService::Stub> stub =
       inference::GRPCInferenceService::NewStub(channel);
-  // Replace if channel / stub have been in the map
-  if (channel_itr != grpc_channel_stub_map_.end()) {
-    channel_itr->second = std::make_tuple(1, channel, stub);
-  } else {
-    grpc_channel_stub_map_.insert(
-        std::make_pair(url, std::make_tuple(1, channel, stub)));
+
+  if (use_cached_channel) {
+    // Replace if channel / stub have been in the map
+    if (channel_itr != grpc_channel_stub_map_.end()) {
+      channel_itr->second = std::make_tuple(1, channel, stub);
+    } else {
+      grpc_channel_stub_map_.insert(
+          std::make_pair(url, std::make_tuple(1, channel, stub)));
+    }
   }
 
   return stub;
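Boiled down, the patched control flow is equivalent to the following
simplified, self-contained sketch (the real GetStub also tracks a use count
and returns a stub rather than the channel; the names here are illustrative):

#include <map>
#include <memory>
#include <string>

struct FakeChannel {};  // stand-in for grpc::Channel

std::map<std::string, std::shared_ptr<FakeChannel>> channel_cache;

std::shared_ptr<FakeChannel> GetChannel(
    const std::string& url, bool use_cached_channel)
{
  if (use_cached_channel) {
    auto it = channel_cache.find(url);
    if (it != channel_cache.end()) {
      return it->second;  // reuse the cached channel
    }
  }
  auto channel = std::make_shared<FakeChannel>();
  if (use_cached_channel) {
    channel_cache[url] = channel;  // store only when caching was requested
  }
  // With use_cached_channel=false the caller is the sole owner, so the
  // channel is destroyed as soon as the client holding it is destroyed.
  return channel;
}

int main()
{
  auto a = GetChannel("localhost:8001", /*use_cached_channel=*/false);
  auto b = GetChannel("localhost:8001", /*use_cached_channel=*/false);
  // a and b are distinct channels; neither lives in the cache.
  return 0;
}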
