
[Main] fix cg missing wgrad hook#3074

Merged
jiemingz merged 9 commits into NVIDIA:main from Wohox:pingtian/fix_cg_missing_wgrad_hook_main
Feb 5, 2026

Conversation

@Wohox
Contributor

@Wohox Wohox commented Jan 26, 2026

What does this PR do ?

This PR fixes a missing backward post hook bug. When delay_wgrad_compute and cuda_graph are both enabled, parameters within the delayed wgrad computation scope skip the backward post hook.

The actual fix is in TE; this PR only adds a version guard and several flag assertions.
Related TE PR: NVIDIA/TransformerEngine#2614
PR for dev: #2999
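
The guard described above can be sketched as follows. This is an illustrative sketch only, not the actual Megatron-LM code: the function name, the tuple-based version argument, and the exact flag names passed in are assumptions; the real PR uses `is_te_min_version("2.8.0")` against the installed TransformerEngine.

```python
# Hypothetical sketch (names assumed) of the kind of validation this PR adds:
# reject unsupported flag combinations up front, so the code path that loses
# the backward post hook can never be reached on older TE versions.

def validate_wgrad_cuda_graph_flags(delay_wgrad_compute, cuda_graph_enabled,
                                    overlap_grad_reduce, te_version):
    """Raise if delayed wgrad compute is combined with CUDA graphs on a TE
    version that drops the backward post hook for graph-captured params."""
    if delay_wgrad_compute and cuda_graph_enabled:
        # Per the related TE fix, TE >= 2.8.0 restores the backward post hook
        # for parameters inside the delayed-wgrad scope; older versions
        # silently skip it, so fail fast instead of training with stale grads.
        if te_version < (2, 8, 0):
            raise AssertionError(
                "delay_wgrad_compute with cuda_graph requires TE >= 2.8.0"
            )
        if overlap_grad_reduce:
            assert te_version >= (2, 8, 0), (
                "overlap_grad_reduce is only supported with TE >= 2.8.0 "
                "when enabling delay_wgrad_compute"
            )
```

Failing fast at argument-validation time is cheaper than debugging silently wrong gradients after a long training run.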

Contribution process

```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code Typing guidelines
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

Feel free to message or tag @mcore-oncall to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review may be declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@copy-pr-bot

copy-pr-bot Bot commented Jan 26, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@Wohox Wohox mentioned this pull request Jan 26, 2026
@Wohox Wohox added the bug Something isn't working label Jan 26, 2026
@Wohox Wohox changed the title [Draft][Main] fix cg missing wgrad hook (Draft)[Main] fix cg missing wgrad hook Jan 26, 2026
@Wohox Wohox marked this pull request as ready for review January 26, 2026 07:39
@Wohox Wohox requested review from a team as code owners January 26, 2026 07:39
@ko3n1g ko3n1g requested a review from a team January 26, 2026 07:40
@Wohox Wohox changed the title (Draft)[Main] fix cg missing wgrad hook [Main] fix cg missing wgrad hook Jan 27, 2026
@Wohox
Contributor Author

Wohox commented Jan 30, 2026

@jiemingz Can you take a look at this PR, thanks~

@jiemingz jiemingz self-requested a review January 30, 2026 15:04
Review comment on the following excerpt (truncated as shown in the diff):

```python
assert is_te_min_version("2.8.0"), (
    "overlap_grad_reduce is only supported with TE >= 2.8.0 when enabling delay_wgrad_compute"
)
wgrad_in_graph_scope = CudaGraphScope.attn in args.cuda_graph_scope or (
```
Contributor


LGTM; could you move this to transformer_config? I think we're trying to keep the arguments.py minimal

Contributor Author


I am afraid not, since TransformerConfig cannot access overlap_grad_reduce, which lives in DDPConfig.

Contributor


[Plug/FYI] Megatron-Bridge has a validate() that checks all the configs together:
https://github.com/NVIDIA-NeMo/Megatron-Bridge/blob/main/src/megatron/bridge/training/config.py#L1399-L1661

We are trying to upstream this to Megatron-LM; it should help with cross-config validations like this one.
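
The cross-config point in this thread can be illustrated with a minimal sketch. All names here (`TransformerConfigStub`, `DDPConfigStub`, the `validate` function) are invented for illustration and are not the Megatron-Bridge API: the idea is that a check touching both the transformer config and the DDP config cannot live inside either config's own `__post_init__`, so it needs a validation step that sees both objects.

```python
# Illustrative stubs (names assumed, not real Megatron classes) showing why
# a joint validate() is needed: neither config alone sees the other's flags.
from dataclasses import dataclass


@dataclass
class TransformerConfigStub:
    delay_wgrad_compute: bool = False
    cuda_graph: bool = False


@dataclass
class DDPConfigStub:
    overlap_grad_reduce: bool = False


def validate(transformer_cfg: TransformerConfigStub, ddp_cfg: DDPConfigStub) -> None:
    # This check spans two config objects, so it must run in a place that
    # holds both of them -- e.g. a container-level validate().
    if (transformer_cfg.delay_wgrad_compute and transformer_cfg.cuda_graph
            and ddp_cfg.overlap_grad_reduce):
        raise ValueError(
            "delay_wgrad_compute + cuda_graph + overlap_grad_reduce requires "
            "a TE version that restores the wgrad backward post hook"
        )
```

This mirrors the constraint discussed above: the assertion had to go in arguments.py rather than TransformerConfig because overlap_grad_reduce belongs to the DDP config.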

@Phlip79 Phlip79 added Final Review PR is in the "final review" stage complexity: low labels Feb 4, 2026
@ericharper
Contributor

@gautham-kollu , will Bridge need the same fix?

@gautham-kollu
Contributor

@gautham-kollu , will Bridge need the same fix?

Yes. We need to upstream this to ConfigContainer.Validate(). Filed NVIDIA-NeMo/Megatron-Bridge#2216 to track this.

@gautham-kollu
Contributor

/ok to test 5973386

@Phlip79 Phlip79 added this pull request to the merge queue Feb 4, 2026
github-merge-queue Bot pushed a commit that referenced this pull request Feb 4, 2026
@github-merge-queue github-merge-queue Bot removed this pull request from the merge queue due to failed status checks Feb 4, 2026
@Phlip79 Phlip79 enabled auto-merge February 5, 2026 01:26
@Phlip79 Phlip79 disabled auto-merge February 5, 2026 01:36
@Phlip79 Phlip79 enabled auto-merge February 5, 2026 01:37
@Phlip79 Phlip79 disabled auto-merge February 5, 2026 01:40
@Phlip79
Member

Phlip79 commented Feb 5, 2026

/ok to test 8553313

@jiemingz jiemingz added this pull request to the merge queue Feb 5, 2026
Merged via the queue into NVIDIA:main with commit 1934391 Feb 5, 2026
45 checks passed
daiyaanarfeen pushed a commit to daiyaanarfeen/Megatron-LM that referenced this pull request Feb 23, 2026
Co-authored-by: Philip Petrakian <ppetrakian@nvidia.com>

Labels

  • bug — Something isn't working
  • complexity: low
  • Final Review — PR is in the "final review" stage
