
Make the shape of BERT layers explicit #492

Open
reporter-law opened this issue Sep 16, 2022 · 4 comments

Comments

@reporter-law

When asking a question, please provide as much of the following information as possible:

Basic information

  • Operating system: windows
  • Python version: python3.7
  • TensorFlow version: tensorflow 1.15.4+nv
  • Keras version: 2.2.4
  • bert4keras version: 0.11.3
  • Pure keras or tf.keras: keras
  • Pretrained model loaded: roformer_v2

Core code

 print(bert.output)

Output

Tensor("bertTransformer-5-FeedForward-Norm/truediv:0", shape=(?, ?, 384), dtype=float32)

What I tried

Requirement: I want the shape to be explicit, mainly the maxlen dimension, e.g. Tensor("bertTransformer-5-FeedForward-Norm/truediv:0", shape=(?, 300, 384), dtype=float32)
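One way to get a static sequence axis even before touching the model is to pad or truncate every input to the same maxlen yourself, so that shape-dependent ops such as concatenate(axis=1) see a fixed dimension. A minimal sketch with numpy; `pad_to_maxlen` is a hypothetical helper, and maxlen=300 is taken from the example shape above:

```python
import numpy as np

def pad_to_maxlen(token_ids, maxlen=300, pad_id=0):
    """Pad (or truncate) a list of token ids to a fixed maxlen."""
    token_ids = list(token_ids)[:maxlen]
    return token_ids + [pad_id] * (maxlen - len(token_ids))

# Two sequences of different lengths become one static-shape batch.
batch = [pad_to_maxlen(seq) for seq in [[101, 2769, 102], [101, 102]]]
x = np.array(batch)
print(x.shape)  # (2, 300) -- the sequence axis is now fixed
```

With inputs shaped this way, the model's sequence axis is the same for every batch, which is what ops along axis=1 require.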

@i4never
Contributor

i4never commented Sep 16, 2022

It is determined by the max length within each batch. Why does it need to be explicit?

@reporter-law
Author

Because I want to operate on the maxlen dimension, e.g. concatenate(axis=1).

@bojone
Owner

bojone commented Sep 19, 2022

If by "explicit" you mean fixed, then you can pass sequence_length=xxx when calling build_transformer_model.

@reporter-law
Author

> If by "explicit" you mean fixed, then you can pass sequence_length=xxx when calling build_transformer_model.

Yes, fixed is exactly what I meant. Thanks for the reply, 苏神!
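Putting the suggested fix into code. A hedged sketch only: the config/checkpoint paths are placeholders for your own roformer_v2 files, and it assumes bert4keras 0.11.x, where build_transformer_model forwards a sequence_length argument to the model:

```python
from bert4keras.models import build_transformer_model

# Placeholder paths -- substitute your own roformer_v2 files.
config_path = 'roformer_v2/bert_config.json'
checkpoint_path = 'roformer_v2/bert_model.ckpt'

bert = build_transformer_model(
    config_path=config_path,
    checkpoint_path=checkpoint_path,
    model='roformer_v2',
    sequence_length=300,  # fix maxlen so the sequence axis is static
)
print(bert.output)  # the sequence axis should now print as 300 instead of ?
```

Note that with a fixed sequence_length, every input batch must be padded or truncated to exactly that length before being fed to the model.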
