
ImportError onnxparser #4

Open
lawrencekiba opened this issue Jul 5, 2019 · 6 comments

Comments

@lawrencekiba

Have you encountered this issue? I hit it when running `python main.py`, though I can `import tensorrt` as usual.

```
from tensorrt.parsers import onnxparser
ImportError: cannot import name 'onnxparser'
```
@modricwang
Owner

Hi,
what are your versions of CUDA, cuDNN, and TensorRT? I will try to reproduce this environment issue.
My guess is that it is caused by a TensorRT interface update.

@gipb

gipb commented Aug 21, 2019

In my case, I just changed `from tensorrt.parsers` to `from tensorrt.legacy.parsers`. Then I was able to import the onnxparser.
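As a general pattern (a sketch, not code from this repo), the import move can be handled with a fallback so the script works on both TensorRT layouts; `import_first` below is a hypothetical helper, not part of the project:

```python
import importlib


def import_first(module_paths, name):
    """Return attribute `name` from the first module path that provides it."""
    for path in module_paths:
        try:
            return getattr(importlib.import_module(path), name)
        except (ImportError, AttributeError):
            continue
    raise ImportError("none of %r provide %r" % (module_paths, name))


# TensorRT 4.x exposed the parser as tensorrt.parsers.onnxparser;
# TensorRT 5.x moved the old API under tensorrt.legacy:
# onnxparser = import_first(
#     ["tensorrt.parsers", "tensorrt.legacy.parsers"], "onnxparser")
```

The helper swallows both `ImportError` (module missing) and `AttributeError` (module present but renamed contents), which covers exactly the two failure modes reported in this thread.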

However, after I imported the onnxparser, I got the following error.

```
AttributeError                            Traceback (most recent call last)
in
     10                     output_names=output_names)
     11
---> 12 onnx_infer()
     13
     14 if not os.path.exists(args.trt_model_name):

in onnx_infer()
     97
     98 def onnx_infer():
---> 99     apex = onnxparser.create_onnxconfig()
    100     apex.set_model_file_name(args.onnx_model_name)
    101     apex.set_model_dtype(trt.legacy.infer.DataType.FLOAT)

AttributeError: type object 'onnxparser' has no attribute 'create_onnxconfig'
```

What should I do?

I'm using CUDA 10.0, cuDNN 7.6.0 and TensorRT 5.1.5.0.

@zhouqzzw

@gipb Have you solved this problem? I got the same problem.

@Dawnlnz

Dawnlnz commented Dec 5, 2019

I have the same problem as @gipb; have you solved it?

@Tian14267

I get this error when I import calib:

```
Traceback (most recent call last):
  File "/home/fffan/fffan_files/Experiment/Pytorch-Model-to-TensorRT-master/main.py", line 16, in <module>
    import calib as calibrator
  File "/home/fffan/fffan_files/Experiment/Pytorch-Model-to-TensorRT-master/calib.py", line 13, in <module>
    class PythonEntropyCalibrator(trt.infer.EntropyCalibrator):
AttributeError: 'module' object has no attribute 'infer'
```

My TensorRT is 5.1. Does somebody know the new API in TensorRT 5.1?
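In TensorRT 5.x the `trt.infer` namespace is gone; INT8 calibrators subclass `trt.IInt8EntropyCalibrator2` directly. Below is a hedged sketch of what `calib.py` might look like under that API. It assumes TensorRT 5.x plus pycuda and numpy are available (imports are deferred inside the factory so the snippet loads without them); `make_entropy_calibrator` and the `batches` argument (a list of contiguous numpy arrays) are illustrative names, not from this repo:

```python
def make_entropy_calibrator(batches, cache_file="calib.cache"):
    """Sketch of a TensorRT 5.x INT8 calibrator (replaces trt.infer.EntropyCalibrator)."""
    import os
    import tensorrt as trt          # deferred: loads only when actually used
    import pycuda.autoinit          # noqa: F401  (creates a CUDA context)
    import pycuda.driver as cuda

    class PythonEntropyCalibrator(trt.IInt8EntropyCalibrator2):
        def __init__(self):
            trt.IInt8EntropyCalibrator2.__init__(self)
            self.batch_iter = iter(batches)
            first = batches[0]
            self.batch_size = first.shape[0]
            self.device_input = cuda.mem_alloc(first.nbytes)

        def get_batch_size(self):
            return self.batch_size

        def get_batch(self, names):
            try:
                batch = next(self.batch_iter)
            except StopIteration:
                return None  # None tells TensorRT calibration data is exhausted
            cuda.memcpy_htod(self.device_input, batch)
            return [int(self.device_input)]

        def read_calibration_cache(self):
            if os.path.exists(cache_file):
                with open(cache_file, "rb") as f:
                    return f.read()

        def write_calibration_cache(self, cache):
            with open(cache_file, "wb") as f:
                f.write(cache)

    return PythonEntropyCalibrator()
```

The resulting object would be assigned to `builder.int8_calibrator` (with `builder.int8_mode = True`) instead of being passed through the old `trt.infer` builder flow.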

@Tian14267

@modricwang @lawrencekiba @ZuoweiZhou @Dawnlnz
