moe fp8 (#11687)
* moe fp8

Signed-off-by: Malay Nagda <[email protected]>

* Apply isort and black reformatting

Signed-off-by: malay-nagda <[email protected]>

---------

Signed-off-by: Malay Nagda <[email protected]>
Signed-off-by: malay-nagda <[email protected]>
Co-authored-by: malay-nagda <[email protected]>
malay-nagda authored Dec 23, 2024
1 parent cbb5ff4 commit 9a2f0bd
Showing 1 changed file with 4 additions and 1 deletion.
nemo/collections/llm/gpt/model/base.py (5 changes: 4 additions & 1 deletion)
@@ -116,7 +116,10 @@ def transformer_engine_layer_spec(config: "GPTConfig") -> ModuleSpec:
     from megatron.core.models.gpt import gpt_layer_specs
 
     return gpt_layer_specs.get_gpt_layer_with_transformer_engine_spec(
-        num_experts=config.num_moe_experts, moe_grouped_gemm=config.moe_grouped_gemm, qk_layernorm=config.qk_layernorm
+        num_experts=config.num_moe_experts,
+        moe_grouped_gemm=config.moe_grouped_gemm,
+        qk_layernorm=config.qk_layernorm,
+        fp8=bool(config.num_moe_experts and (config.fp8 is not None)),
     )
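
The new fp8 argument resolves to True only when the model is MoE (num_moe_experts is set) and an FP8 setting is present on the config. A minimal sketch of that condition, assuming a stand-in config exposing just the two GPTConfig fields the change reads; the StubConfig name and the "hybrid" recipe string below are illustrative, not part of the commit:

from dataclasses import dataclass
from typing import Optional


@dataclass
class StubConfig:
    # Illustrative stand-in for the two GPTConfig fields used by the change.
    num_moe_experts: Optional[int] = None  # None (or 0) means a dense, non-MoE model
    fp8: Optional[str] = None              # an FP8 recipe setting, or None when FP8 is off


def moe_fp8(config: StubConfig) -> bool:
    # Mirrors the new argument: True only when the model is MoE
    # AND an FP8 setting is configured.
    return bool(config.num_moe_experts and (config.fp8 is not None))


assert moe_fp8(StubConfig(num_moe_experts=8, fp8="hybrid")) is True
assert moe_fp8(StubConfig(num_moe_experts=8)) is False   # FP8 not configured
assert moe_fp8(StubConfig(fp8="hybrid")) is False        # dense model

Wrapping in bool() keeps the result a strict boolean: without it, a falsy num_moe_experts of None or 0 would leak through as the value of the and expression rather than False.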


