
Fix mtp_use_repeated_layer behavior for GPT models#3965

Open
rkarimimahab wants to merge 12 commits into NVIDIA:main from rkarimimahab:rkarimimahab/repeated

Conversation

@rkarimimahab
Contributor

What does this PR do?

⚠️ For major changes (either in lines of code or in impact), please first share a design doc with the team. If you're unsure of the best way to do so, contact @mcore-oncall.

Contribution process

Pre-checks

  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code Typing guidelines
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

Feel free to message or mention @mcore-oncall to help accelerate your merge into main. The less complex your PR, the faster it will be approved and merged!

All PRs start as draft. If you open a non-draft PR, it will be automatically converted to draft.

Step 1: Mark PR as "Ready for Review"

  1. When your PR is ready, click Ready for Review.
  2. An oncall reviewer is auto-assigned and expert reviewers are notified based on your changes.
    • Some PRs may jump straight to step 2. This is determined by .github/CODEOWNERS.

⚠️ Only mark the PR as ready once merge conflicts are resolved and CI is passing.
Final Review may be declined if these requirements are not met.

Step 2: Final Review

For PRs that change megatron/core, once all expert reviewers have approved, the Final Review label is applied automatically and final reviewers are assigned.

For PRs outside megatron/core, this step is skipped.

Step 3: Approved

Once all required reviewers have approved, the Approved label is applied automatically.

Merge

Any member of mcore-engineers will be able to merge your PR.

For MRs into the `dev` branch: the proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

@rkarimimahab rkarimimahab requested review from a team as code owners March 20, 2026 15:27
@copy-pr-bot

copy-pr-bot Bot commented Mar 20, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@svcnvidia-nemo-ci svcnvidia-nemo-ci marked this pull request as draft March 20, 2026 15:27
@github-actions
Contributor

This PR has been automatically converted to draft because all PRs must start as drafts.

When you are ready for review, click Ready for Review to begin the review process. This will:

  1. Add the oncall reviewer (optional reviewer)
  2. Add required review teams based on your changes

See the contribution guide for more details.

@deepakn94 deepakn94 changed the title from "added bug correction for mtp repeated" to "Fix mtp_use_repeated_layer behavior for GPT models" Mar 20, 2026
@deepakn94 deepakn94 marked this pull request as ready for review March 20, 2026 16:22
@svcnvidia-nemo-ci svcnvidia-nemo-ci requested a review from a team March 20, 2026 16:23
Comment thread on megatron/core/models/gpt/gpt_layer_specs.py (outdated)
@chtruong814 chtruong814 added the needs-follow-up Issue needs follow-up label Mar 22, 2026
@svcnvidia-nemo-ci svcnvidia-nemo-ci added the Final Review PR is in the "final review" stage label Mar 24, 2026
@chtruong814 chtruong814 removed the needs-follow-up Issue needs follow-up label Mar 25, 2026
@deepakn94
Contributor

/ok to test bce7df2

@svcnvidia-nemo-ci svcnvidia-nemo-ci added this to the Core 0.16 milestone Mar 25, 2026
@svcnvidia-nemo-ci svcnvidia-nemo-ci added Approved All necessary approvals have been made and removed Final Review PR is in the "final review" stage labels Mar 25, 2026
@deepakn94
Contributor

/ok to test 7b55eb9

@chtruong814 chtruong814 added the needs-follow-up Issue needs follow-up label Mar 27, 2026
@Phlip79
Member

Phlip79 commented Apr 3, 2026

/ok to test 601030e

@Phlip79
Member

Phlip79 commented Apr 7, 2026

/ok to test 57e3e05

@gautham-kollu
Contributor

Tests seem to have been cancelled for some reason; rerunning them.

@Phlip79
Member

Phlip79 commented Apr 9, 2026

/ok to test e51d78c

@yaox12
Member

yaox12 commented Apr 10, 2026

/ok to test 06332d3

@Phlip79
Member

Phlip79 commented Apr 12, 2026

/ok to test 9331b76

@Phlip79
Member

Phlip79 commented Apr 13, 2026

/ok to test 8f41529

@yaox12
Member

yaox12 commented Apr 14, 2026

It seems that the unit tests under tests/unit_tests/pipeline_parallel/**/*.py always hang. This could be a bug in this PR.

@chtruong814 chtruong814 added waiting-for-customer Waiting for response from the original author and removed waiting-for-customer Waiting for response from the original author labels Apr 14, 2026
@guihong-nv

@rkarimimahab The CI/CD is failing. Please take a look.

@chtruong814 chtruong814 added waiting-on-customer Waiting on the original author to respond needs-follow-up Issue needs follow-up and removed needs-follow-up Issue needs follow-up waiting-on-customer Waiting on the original author to respond labels Apr 18, 2026
@svcnvidia-nemo-ci svcnvidia-nemo-ci added waiting-on-maintainers Waiting on maintainers to respond and removed needs-follow-up Issue needs follow-up labels Apr 21, 2026

Labels

  • Approved (All necessary approvals have been made)
  • community-request
  • complexity: low
  • waiting-on-maintainers (Waiting on maintainers to respond)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

10 participants