about inference time #86
Comments
0.15s

2022-04-29 13:20:12.044 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:112 - Start inference on 399 data samples
2022-04-29 13:20:41.836 | INFO | cvpods.engine.runner:test:401 - Evaluation results for coco_2017_val in csv format:

I found 'Total inference time: 0:00:29.028718 (0.073677 s / sample per device, on 1 devices)'

So that is 0.0736 seconds per sample per device.
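The per-sample figure in that log line can be reproduced from the total inference time. A minimal sketch, assuming that (as in detectron2-style evaluators) a few warm-up samples, here 5, are excluded from the timing; that assumption is what makes the arithmetic match the logged 0.073677 s:

```python
import datetime

# Total inference time reported in the log: 0:00:29.028718
total_time = datetime.timedelta(seconds=29, microseconds=28718).total_seconds()

num_samples = 399   # "Start inference on 399 data samples"
num_warmup = 5      # assumption: warm-up samples excluded from timing
num_devices = 1     # "on 1 devices"

# Average inference time per sample per device, excluding warm-up
per_sample = total_time / ((num_samples - num_warmup) * num_devices)
print(f"{per_sample:.6f} s / sample per device")  # -> 0.073677 s / sample per device
```

With the warm-up samples excluded, 29.028718 s / 394 samples gives exactly the 0.073677 s per sample that the evaluator prints.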
Yeah, I have only one GPU.

Where is the inference time in log.txt?
2022-04-29 13:20:41.838 | INFO | cvpods.utils.dump.events:write:253 - eta: 0:00:00 iter: 90000/90000 total_loss: 0.106 loss_cls: 0.001 loss_box_reg: 0.102 num_fg_per_gt: 1.000 time: 0.1519 data_time: 0.0010 lr: 0.000006 max_mem: 1487M
What does time: 0.1519 mean? Is that 0.1519 s, i.e. 151.9 ms?
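For reference, a quick conversion under the assumption that the time field in this training log is seconds per iteration (as in detectron2-style event writers; the thread itself does not confirm the unit):

```python
# Assumption: "time: 0.1519" is the average training time per iteration
# in seconds, as in detectron2-style loggers.
iter_time_s = 0.1519

# Milliseconds per iteration and throughput in iterations per second
print(f"{iter_time_s * 1000:.1f} ms / iter")   # -> 151.9 ms / iter
print(f"{1.0 / iter_time_s:.2f} iter / s")
```

Note this is the training iteration time, not the inference time; the inference time is the 'Total inference time' line printed by the evaluator.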