Hello @wemoveon2, unfortunately we had to prioritize other urgent items over the last few releases. This feature is not currently planned for a release, but I hope to have an update for you soon! When I do, I'll link you to the corresponding PR.
Is your feature request related to a problem? Please describe.
Currently, when we need to pass a float, we have to either pass it as a string or rescale it into an integer and then scale it back into a float in the model files.
Describe the solution you'd like
We would like InferParameter to support double_param as a custom parameter: https://github.com/triton-inference-server/common/blob/468eb216b01b375b83590b1e206d1c8cf4160b38/protobuf/grpc_service.proto#L470-L476
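For reference, a minimal sketch of how custom request parameters are attached today through the Python gRPC client, assuming a recent tritonclient that exposes a parameters argument on infer(); the model, input, and parameter names here are hypothetical. The dict values map onto InferParameter's existing string_param / int64_param / bool_param fields, which is why a plain Python float has nowhere to go:

```python
# Minimal sketch, not a proposed implementation: attaching custom request
# parameters with the Python gRPC client. Model/parameter names are made up.
import numpy as np
import tritonclient.grpc as grpcclient

client = grpcclient.InferenceServerClient(url="localhost:8001")

inp = grpcclient.InferInput("INPUT0", [1, 4], "FP32")
inp.set_data_from_numpy(np.random.rand(1, 4).astype(np.float32))

# str, int, and bool values map onto InferParameter's string_param,
# int64_param, and bool_param fields.
response = client.infer(
    model_name="my_model",
    inputs=[inp],
    parameters={
        "run_id": "abc123",
        "top_k": 5,
        "verbose": True,
        # A float has no matching field today, so it is sent as a string and
        # parsed back in the model -- the workaround double_param would remove.
        "threshold": "0.75",
    },
)
```

With double_param supported, the same value could be passed directly as 0.75 instead of being stringified first.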
Describe alternatives you've considered
We currently convert the float to a string and convert it back to a float before it's used, which isn't ideal.
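For completeness, a minimal sketch of what that workaround looks like inside a Python-backend model.py, assuming a recent Python backend where request.parameters() returns the request parameters as a JSON string; the parameter name threshold and the output tensor are hypothetical, and config.pbtxt is not shown:

```python
# Minimal sketch of the model-side half of the string workaround in a
# Python-backend model.py (hypothetical parameter/tensor names).
import json

import numpy as np
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def execute(self, requests):
        responses = []
        for request in requests:
            # request.parameters() returns the request parameters as a JSON
            # string, so the float arrives as the string "0.75" and has to be
            # converted back by hand -- the step double_param would remove.
            params = json.loads(request.parameters())
            threshold = float(params.get("threshold", "0.5"))

            # ... use `threshold` in the actual model logic ...
            out = pb_utils.Tensor(
                "OUTPUT0", np.array([threshold], dtype=np.float32))
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[out]))
        return responses
```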