
[ascend910b] does not support opType [RmsNorm] #312

Open
Jayc-Z opened this issue Oct 30, 2024 · 1 comment

Comments


Jayc-Z commented Oct 30, 2024

Environment

Hardware Environment (Ascend/GPU/CPU):

Ascend 910B

Software Environment:

  • MindSpore 2.4.0
  • mindpet 1.0.4
  • mindformers 1.3.0
  • Python 3.10
  • Ubuntu aarch64

Describe the current behavior

Deploying an LLM with mindformers/research/qwen1_5/run_qwen1_5_chat.py raises the following error:
(screenshot attached: `[ascend910b] does not support opType [RmsNorm]`)

Describe the expected behavior

The model is expected to deploy normally.

Steps to reproduce the issue

Related log / screenshot

Special notes for this issue

@longvoyage
Contributor

Please confirm that the environment is fully installed and that the required environment variables are set.
You can also run rms_norm on its own to check:

```python
import mindspore
import numpy as np
from mindspore import Tensor, ops

# Minimal standalone check of the RmsNorm operator on this device.
x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
gamma = Tensor(np.ones([3]), mindspore.float32)
# ops.rms_norm returns the normalized output and the reciprocal of the
# root-mean-square (rstd).
y, rstd = ops.rms_norm(x, gamma)
print(y)
print(rstd)
```
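For reference, RmsNorm computes `y = x * gamma / sqrt(mean(x**2, axis=-1) + eps)`. A minimal NumPy sketch (the helper name `rms_norm_ref` is ours, and `eps=1e-6` assumes the operator's documented default epsilon) can be used to cross-check the expected output of the snippet above if it does run:

```python
import numpy as np

def rms_norm_ref(x, gamma, eps=1e-6):
    # rstd: reciprocal of the root-mean-square over the last axis,
    # corresponding to the second output of ops.rms_norm.
    rstd = 1.0 / np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    return x * rstd * gamma, rstd

x = np.array([[1, 2, 3], [1, 2, 3]], dtype=np.float32)
gamma = np.ones(3, dtype=np.float32)
y, rstd = rms_norm_ref(x, gamma)
print(y)     # each row ≈ [0.4629, 0.9258, 1.3887]
print(rstd)  # each row ≈ [0.4629]
```

If the MindSpore call fails while this pure-NumPy version works, the problem is in the Ascend operator environment rather than the math.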
