
feat(packagesettings): add dependency-chained build tags#1053

Open
dhellmann wants to merge 1 commit into python-wheel-build:main from dhellmann:build-dependency-changelog-chaining

Conversation

@dhellmann
Member

@dhellmann dhellmann commented Apr 13, 2026

Implements support for explicit package dependencies in settings to enable recursive build tag calculation. When a dependency's changelog changes, all packages that depend on it automatically get incremented build tags.

Key features:

  • New 'dependencies' field in PackageSettings (list of package names)
  • Recursive build tag calculation: own_changelog + sum(dep_build_tags)
  • Transitive dependency resolution through entire dependency chain
  • Circular dependency detection with clear error messages
  • Support for "fake packages" (platform dependencies like CUDA, ROCm)
  • Fully backward compatible (empty dependencies = original behavior)

Closes: #478
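Concretely, the new field would be used in a package settings file roughly like this (a hypothetical sketch: the package names and changelog text are illustrative, only the `dependencies` and `changelog` keys come from this PR):

```yaml
# overrides/settings/torch.yaml (hypothetical example)
dependencies:
  - cuda-toolkit   # a "fake" platform package, also defined in settings
changelog:
  "2.0.0":
    - "rebuilt with updated compiler flags"
```

With this in place, adding a changelog entry to `cuda-toolkit` would bump the build tag of the package above (and of anything depending on it) without editing its own settings file.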

Implementation:
- PackageSettings.dependencies field with comprehensive docstring
- PackageBuildInfo._calculate_build_tag() for recursive logic
- 9 new unit tests covering all edge cases
- 6 test data files for dependency chains and circular dependencies
- Comprehensive how-to guide with examples and best practices

All 458 project tests pass, type checking clean, documentation builds
successfully.

Closes: python-wheel-build#478

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Signed-off-by: Doug Hellmann <dhellmann@redhat.com>
@dhellmann dhellmann requested a review from a team as a code owner April 13, 2026 14:03
@coderabbitai

coderabbitai bot commented Apr 13, 2026

📝 Walkthrough


This pull request adds support for dependency-chained build tags in the fromager package management system. The changes introduce a new dependencies field to package settings that enables automatic rebuild triggering when dependency build tags change. The implementation recursively calculates a package's build tag as its own changelog entry count plus the sum of its transitive dependencies' build tags. Circular dependency detection is included to prevent infinite loops. New documentation, core logic updates, comprehensive test coverage, and test fixture data are provided.
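The recursive calculation described above can be sketched in isolation (a simplified model, not the fromager implementation: `CHANGELOG` and `DEPS` are stand-in dictionaries for the real package settings):

```python
# Simplified model of dependency-chained build tags.
# CHANGELOG maps package -> number of changelog entries; DEPS maps
# package -> list of dependency package names.
CHANGELOG = {"a": 1, "b": 1, "c": 1}
DEPS = {"a": ["b"], "b": ["c"], "c": []}


def build_tag(pkg: str, visited: frozenset = frozenset()) -> int:
    """Own changelog count plus the sum of all dependencies' build tags."""
    if pkg in visited:
        # Abort instead of recursing forever on a -> b -> a style cycles.
        raise ValueError(f"Circular dependency detected: {pkg}")
    visited = visited | {pkg}
    return CHANGELOG.get(pkg, 0) + sum(
        build_tag(dep, visited) for dep in DEPS.get(pkg, [])
    )


print(build_tag("c"))  # 1: no dependencies
print(build_tag("b"))  # 2: own entry + c
print(build_tag("a"))  # 3: own entry + b (which includes c)
```

This mirrors the chain exercised by the `test_dep_chain_*` fixtures: a change anywhere in the chain propagates up to every dependent package's tag.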

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~22 minutes

🚥 Pre-merge checks: ✅ 4 passed
  • Title check — ✅ Passed: The title 'feat(packagesettings): add dependency-chained build tags' clearly describes the main change: adding a dependency-chaining feature for build tag calculation in PackageSettings.
  • Description check — ✅ Passed: The description explains the implementation of recursive build tag calculation based on package dependencies, which directly corresponds to the changeset including the new dependencies field, recursive build tag logic, and dependency tracking.
  • Linked Issues check — ✅ Passed: The PR implements the requirement from #478 to trigger rebuilds when build-time dependencies change by adding a dependencies field with recursive build tag calculation that propagates changes through the dependency chain.
  • Out of Scope Changes check — ✅ Passed: All changes are within scope: documentation of the feature, the PackageSettings model extension, build tag calculation logic, comprehensive tests, and test data — all directly supporting the dependency-chaining build tags feature requested in #478.


@mergify mergify bot added the ci label Apr 13, 2026

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/fromager/packagesettings/_pbi.py`:
- Around line 341-343: The loop is passing the parent package's version into
dependency tag calculation (self._ps.dependencies -> package_build_info ->
_calculate_build_tag(version,...)), which prevents dependencies from using their
own resolved versions; change it so each dep_pbi is called with its own resolved
version (e.g., obtain dep_version from the dependency build-info object or
resolved-requirements API on self._settings/dep_pbi and pass that to
dep_pbi._calculate_build_tag(dep_version, visited=visited)) so each dependency
is resolved against its correct version rather than the root package's version.

In `@tests/test_packagesettings.py`:
- Around line 919-977: Add a regression test that exercises a parent→dependency
chain where the parent is queried with one Version and the dependency has its
changelog keyed to a different Version: create a new test (e.g.,
test_build_tag_mixed_versions_dependency) that uses
testdata_context.settings.package_build_info(...) to load the parent and
dependency packages, call parent_pbi.build_tag(Version("parent-version")) while
the dependency has changelog entries under Version("dependency-version"), and
assert the returned build_tag tuple includes the dependency's changelog
contribution aggregated correctly (use the existing Version and
package_build_info symbols to locate code and validate behavior).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 35469bff-1c3f-4b36-b71a-e8ab3f21de3b

📥 Commits

Reviewing files that changed from the base of the PR and between b8f4441 and 58840f8.

📒 Files selected for processing (11)
  • docs/how-tos/dependency-build-tags.rst
  • docs/how-tos/index.rst
  • src/fromager/packagesettings/_models.py
  • src/fromager/packagesettings/_pbi.py
  • tests/test_packagesettings.py
  • tests/testdata/context/overrides/settings/test_circular_a.yaml
  • tests/testdata/context/overrides/settings/test_circular_b.yaml
  • tests/testdata/context/overrides/settings/test_dep_chain_a.yaml
  • tests/testdata/context/overrides/settings/test_dep_chain_b.yaml
  • tests/testdata/context/overrides/settings/test_dep_chain_c.yaml
  • tests/testdata/context/overrides/settings/test_fake_cuda.yaml

Comment on lines +341 to +343
for dep_pkg in self._ps.dependencies:
    dep_pbi = self._settings.package_build_info(dep_pkg)
    dep_tag = dep_pbi._calculate_build_tag(version, visited=visited)

@coderabbitai coderabbitai bot Apr 13, 2026


⚠️ Potential issue | 🔴 Critical

Resolve each dependency against its own version.

Line 343 passes the parent package's version into every dependency lookup. That means a build like torch==2.0.0 can only see changelog entries keyed as 2.0.0 in cuda-toolkit or triton, so the mixed-version / fake-package cases described in this PR never contribute to the tag. build_tag() needs dependency-specific versions (or resolved requirements), not a single root version threaded through the whole recursion.


Contributor


I think this is something we should fix



🧩 Analysis chain

Let me look at the Settings class and the test cases to understand the intended behavior and available APIs.


🏁 Script executed:

#!/bin/bash
# Look at the Settings class to understand package_build_info and version resolution
fd '_settings.py' --type f | head -5

Length of output: 41


🏁 Script executed:

#!/bin/bash
ast-grep --pattern 'def package_build_info($$$)'

Length of output: 1391


🏁 Script executed:

#!/bin/bash
# Look at the test for fake_cuda to understand the expected behavior
rg -n 'fake_cuda\|dep_chain\|test_build_tag' tests/test_packagesettings.py -A 10

Length of output: 0


🏁 Script executed:

#!/bin/bash
# Find build_tag related tests
grep -n "build_tag\|fake_cuda\|dep_chain\|circular" tests/test_packagesettings.py | head -60

Length of output: 2209


🏁 Script executed:

#!/bin/bash
# Look at the fake_cuda test data
cat tests/testdata/context/overrides/settings/test_fake_cuda.yaml
echo "---"
cat tests/testdata/context/overrides/settings/test_dep_chain_a.yaml
echo "---"
cat tests/testdata/context/overrides/settings/test_dep_chain_b.yaml
echo "---"
cat tests/testdata/context/overrides/settings/test_dep_chain_c.yaml

Length of output: 314


🏁 Script executed:

#!/bin/bash
# Show the new test functions in full
grep -n "def test_" tests/test_packagesettings.py | tail -30

Length of output: 1850


🏁 Script executed:

#!/bin/bash
# Check PackageSettings.changelog field type and structure
grep -n "changelog" src/fromager/packagesettings/_models.py | head -30

Length of output: 88


🏁 Script executed:

#!/bin/bash
# Show the full test functions for build_tag
sed -n '919,1025p' tests/test_packagesettings.py

Length of output: 4068


@rd4398, the bug is confirmed and worth fixing. Here's the full analysis:

The problem is real but currently hidden. All dep_chain tests use Version("1.0.0") which happens to match the changelog keys in all dep_chain YAML fixtures — so the version is coincidentally correct. The fake_cuda tests only exercise that package directly; there's no test for a package that depends on test-fake-cuda. If torch==2.0.0 listed test-fake-cuda as a dependency, _calculate_build_tag would call dep_pbi._calculate_build_tag(Version("2.0.0"), ...), find no entry for "2.0.0" in the CUDA changelog, and contribute 0.

The right fix: For dependency contributions, don't pass the parent's version at all — instead sum all changelog entries across all versions of the dependency. The semantics shift from "what's the build tag of this dep at version X" to "how much has this dep chain changed in total". This is version-agnostic, which is consistent with the dependencies field having no version specifier.

Here's the suggested implementation:

def _calculate_build_tag(
    self, version: Version, visited: set[NormalizedName]
) -> BuildTag:
    if self.pre_built:
        return ()

    if self.package in visited:
        raise ValueError(
            f"Circular dependency detected: {self.package} appears in "
            f"dependency chain: {' -> '.join(sorted(visited))} -> {self.package}"
        )

    visited = visited | {self.package}

    # Own changelog: version-specific (existing behavior preserved)
    pv = typing.cast(PackageVersion, version)
    own_changelog_count = len(self.get_changelog(pv))

    # Dependency contribution: version-agnostic total across all versions
    dependency_contribution = self._dependency_changelog_total(visited)

    total = own_changelog_count + dependency_contribution
    if total == 0:
        return ()
    return total, ""


def _dependency_changelog_total(self, visited: set[NormalizedName]) -> int:
    """Sum of all changelog entries across all versions for this package's
    dependencies, used to compute version-agnostic dependency contributions."""
    total = 0
    for dep_pkg in self._ps.dependencies:
        dep_pbi = self._settings.package_build_info(dep_pkg)
        total += dep_pbi._own_total_changelog_count(visited)
    return total


def _own_total_changelog_count(self, visited: set[NormalizedName]) -> int:
    """Recursively sum all changelog entries across all versions for
    cycle-safe, version-agnostic dependency contribution calculation."""
    if self.pre_built:
        return 0

    if self.package in visited:
        raise ValueError(
            f"Circular dependency detected: {self.package} appears in "
            f"dependency chain: {' -> '.join(sorted(visited))} -> {self.package}"
        )

    visited = visited | {self.package}

    # Sum own entries across ALL versions (version-agnostic for dep chains)
    own_count = len(self._variant_changelog)
    own_count += sum(len(entries) for entries in self._ps.changelog.values())

    # Recurse into transitive dependencies
    for dep_pkg in self._ps.dependencies:
        dep_pbi = self._settings.package_build_info(dep_pkg)
        own_count += dep_pbi._own_total_changelog_count(visited)

    return own_count

Why this is correct for each case:

  • test_dep_chain_c (no deps, 1 entry for "1.0.0"): _own_total_changelog_count = 1 → same as before
  • test_dep_chain_b (depends on C, 1 own entry): 1 + 1 = 2 → same as before
  • test-fake-cuda as a dependency: sums entries for "12.9" + "12.8" = 2, increasing monotonically as new CUDA versions are added

You'd also need a new test like:

def test_build_tag_with_fake_package_dependency(testdata_context):
    """A package depending on fake-cuda should accumulate total CUDA changelog entries."""
    # Create a package that lists test-fake-cuda as a dependency
    ps = PackageSettings.from_string("test-torch", """
dependencies:
  - test-fake-cuda
changelog:
  "2.0.0":
    - "torch release"
""")
    settings = Settings(
        settings=SettingsFile(),
        package_settings=[ps],
        variant="cpu",
        patches_dir=testdata_context.settings.patches_dir,
        max_jobs=1,
    )
    # fake-cuda has 2 total entries (12.9 + 12.8), torch has 1 own = 3
    pbi = settings.package_build_info("test-torch")
    assert pbi.build_tag(Version("2.0.0")) == (3, "")
Follow-up plan:

  • Refactor _calculate_build_tag to delegate dependency contribution to a new version-agnostic helper
  • Add _dependency_changelog_total and _own_total_changelog_count helpers in _pbi.py
  • Add a missing test covering a package that depends on a "fake" platform package like test-fake-cuda

🏁 Script executed:

#!/bin/bash
# Confirm the exact line range of _calculate_build_tag and related methods
grep -n "_calculate_build_tag\|_dependency_changelog_total\|_own_total_changelog_count\|def build_tag" src/fromager/packagesettings/_pbi.py

Length of output: 242


Plan: Fix version-agnostic dependency changelog accumulation

  • Refactor _calculate_build_tag in _pbi.py to separate own (versioned) count from dependency (version-agnostic) count
  • Add _own_total_changelog_count recursive helper for summing all versions of a dep
  • Add test fixture YAML for a package that depends on test-fake-cuda
  • Add a test asserting the correct cross-version accumulation


Comment on lines +919 to +977
def test_build_tag_no_dependencies(testdata_context: context.WorkContext) -> None:
    """Test build tag without dependencies (backward compatibility)."""
    pbi = testdata_context.settings.package_build_info("test-empty-pkg")
    # Empty package has no changelog entries
    assert pbi.build_tag(Version("1.0.0")) == ()


def test_build_tag_simple_dependency(testdata_context: context.WorkContext) -> None:
    """Test build tag with a simple dependency chain: C (no deps)."""
    pbi = testdata_context.settings.package_build_info("test-dep-chain-c")
    # C has 1 changelog entry, no dependencies
    assert pbi.build_tag(Version("1.0.0")) == (1, "")


def test_build_tag_transitive_dependencies(
    testdata_context: context.WorkContext,
) -> None:
    """Test build tag includes all transitive dependencies: A -> B -> C."""
    # C: 1 changelog entry, no dependencies = 1
    pbi_c = testdata_context.settings.package_build_info("test-dep-chain-c")
    assert pbi_c.build_tag(Version("1.0.0")) == (1, "")

    # B: 1 changelog entry + 1 from C = 2
    pbi_b = testdata_context.settings.package_build_info("test-dep-chain-b")
    assert pbi_b.build_tag(Version("1.0.0")) == (2, "")

    # A: 1 changelog entry + 2 from B (which includes C) = 3
    pbi_a = testdata_context.settings.package_build_info("test-dep-chain-a")
    assert pbi_a.build_tag(Version("1.0.0")) == (3, "")


def test_build_tag_circular_dependency(testdata_context: context.WorkContext) -> None:
    """Test circular dependency detection: A -> B -> A."""
    pbi = testdata_context.settings.package_build_info("test-circular-a")
    with pytest.raises(ValueError, match="Circular dependency detected"):
        pbi.build_tag(Version("1.0.0"))


def test_build_tag_fake_package(testdata_context: context.WorkContext) -> None:
    """Test fake package (platform dependency with settings but no source)."""
    pbi = testdata_context.settings.package_build_info("test-fake-cuda")
    # Fake package has changelog for version 12.9 (1 entry)
    assert pbi.build_tag(Version("12.9")) == (1, "")
    # Fake package has changelog for version 12.8 (1 entry)
    assert pbi.build_tag(Version("12.8")) == (1, "")
    # No changelog for other versions
    assert pbi.build_tag(Version("1.0.0")) == ()


def test_build_tag_with_pre_built_dependency(
    testdata_context: context.WorkContext,
) -> None:
    """Test that pre-built dependencies contribute 0 to build tag."""
    # test-prebuilt-pkg has pre_built: True for cpu variant
    # Even though it has a changelog, pre-built packages return ()
    pbi = testdata_context.settings.package_build_info("test-prebuilt-pkg")
    assert pbi.pre_built is True
    assert pbi.build_tag(Version("1.0.1")) == ()


@coderabbitai coderabbitai bot Apr 13, 2026


⚠️ Potential issue | 🟠 Major

Add a mixed-version dependency-chain regression test.

These cases only cover chains where every package is queried with Version("1.0.0"), or they build the fake package directly. That won't protect the documented CUDA/ROCm use case where the parent and dependency versions differ, so this suite can miss the core failure mode here. Please add one parent→dependency case where the dependency changelog is keyed to its own version.

As per coding guidelines, "Verify test actually tests the intended behavior. Check for missing edge cases."


Contributor


Can we add a test here something like:

def test_build_tag_dependency_different_version():
    """Test that dependencies use their own versions, not parent's."""
    # CUDA has changelog at 12.9, torch at 2.0.0
    pbi_cuda = testdata_context.settings.package_build_info("test-fake-cuda")
    assert pbi_cuda.build_tag(Version("12.9")) == (1, "")



🧩 Analysis chain

Let me check the actual implementation to confirm how version is propagated to dependencies before proposing the test.


🏁 Script executed:

#!/bin/bash
fd "_pbi.py" --exec cat {}

Length of output: 16671


@rd4398, happy to help. One thing to note first: the Version("12.9") assertion in your suggested test is already covered by the existing test_build_tag_fake_package. The real gap the original review comment flagged is a parent → fake-cuda case where the parent is queried at one version and the fake package's changelog is keyed to a different version.

Looking at _calculate_build_tag in _pbi.py:

dep_tag = dep_pbi._calculate_build_tag(version, visited=visited)

The same version is propagated unchanged to all dependencies. This means parent.build_tag(Version("2.0.0")) will look up test-fake-cuda's changelog at "2.0.0", not at "12.9" — so the CUDA contribution would be zero. The test below captures this important behavior:

def test_build_tag_dependency_different_version(
    testdata_context: context.WorkContext,
    tmp_path: pathlib.Path,
) -> None:
    """Test version propagation through the dependency chain.

    The version passed to build_tag() propagates unchanged to all
    dependencies. A fake/platform package (e.g. CUDA) must therefore key its
    changelog entries to the *dependent* package's version (not its own
    platform version) to contribute to the parent's build tag.
    """
    # Create a parent that depends on test-fake-cuda.
    # test-fake-cuda has changelog entries at "12.9" and "12.8".
    parent_ps = PackageSettings.from_string(
        "torch-like-pkg",
        """
dependencies:
  - test-fake-cuda
changelog:
  "2.0.0":
    - "torch 2.0.0 built with CUDA 12.9"
""",
    )

    # Add a second parent that shares the SAME version namespace as fake-cuda.
    parent_cuda_versioned_ps = PackageSettings.from_string(
        "cuda-versioned-parent",
        """
dependencies:
  - test-fake-cuda
changelog:
  "12.9":
    - "parent entry at 12.9"
""",
    )

    settings = Settings(
        settings=SettingsFile(),
        package_settings=[
            parent_ps,
            parent_cuda_versioned_ps,
            # Re-use the already-loaded test-fake-cuda settings.
            testdata_context.settings.package_build_info("test-fake-cuda")._ps,
        ],
        variant="cpu",
        patches_dir=tmp_path,
        max_jobs=1,
    )

    # Parent at "2.0.0": own 1 entry, but fake-cuda has NO changelog at 2.0.0
    # → dependency contributes 0 → total = 1.
    parent_pbi = settings.package_build_info("torch-like-pkg")
    assert parent_pbi.build_tag(Version("2.0.0")) == (1, "")

    # Parent at "12.9": own 1 entry + fake-cuda 1 entry at "12.9" → total = 2.
    cuda_parent_pbi = settings.package_build_info("cuda-versioned-parent")
    assert cuda_parent_pbi.build_tag(Version("12.9")) == (2, "")

The two assertions together document the version-propagation contract:

  • When the parent is at "2.0.0" and fake-cuda only has entries at "12.9", CUDA contributes nothing.
  • When both the parent and the fake-cuda changelog share version "12.9", the dependency contribution is counted.

This makes explicit the requirement for the docs' how-to guide: fake/platform packages must key their changelog entries to the consuming package's version, not the platform's own version string.


🧠 Learnings used
Learnt from: smoparth
Repo: python-wheel-build/fromager PR: 1030
File: tests/test_suggest_collection.py:5-5
Timestamp: 2026-04-08T22:13:20.914Z
Learning: In fromager CLI test files (e.g., tests/test_suggest_collection.py and other tests/test_*.py), it’s acceptable to extract JSON arrays from mixed stdout using the existing helper `_extract_json_from_output(output)` (implemented with `re.search`). This is an intentional project pattern driven by the shared `cli_runner` fixture using Click’s default `mix_stderr=True`, which can inline log/warning messages into `result.output`/`result.stdout`. Don’t flag this approach as fragile as long as the helper is used consistently and the extraction targets the expected JSON array format.

Learnt from: mnadzam
Repo: python-wheel-build/fromager PR: 1034
File: tests/test_hooks.py:234-259
Timestamp: 2026-04-09T08:58:42.152Z
Learning: In this project’s Python test suite (e.g., under tests/), avoid asserting on exact log output strings since they are brittle implementation details. Prefer verifying functional behavior instead—e.g., assert that collaborator functions are called the expected number of times with the expected arguments (such as `_get_dist_info`).

Contributor

@rd4398 rd4398 left a comment


Overall looks good except one thing that we should fix before we can land this


@tiran
Collaborator

tiran commented Apr 14, 2026

Let's assume we want to rebuild vLLM 0.3.0 with Torch 2.1.0. Let's also assume that Torch 2.0.0 and 2.1.0 have both a single changelog entry. Don't we end up with the same build tag "7" for vLLM 0.3.0?
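The collision this describes can be checked with quick arithmetic (a hypothetical sketch assuming the version-agnostic dependency sum proposed earlier in the review; the entry counts are made up, chosen only to match the example tag of 7):

```python
# Hypothetical entry counts: vLLM has 5 own changelog entries, and
# each torch version (2.0.0 and 2.1.0) carries one entry.
vllm_own_entries = 5
torch_entries = {"2.0.0": 1, "2.1.0": 1}

# A version-agnostic dependency sum counts entries across ALL torch
# versions, regardless of which torch vLLM is actually built against.
dep_contribution = sum(torch_entries.values())

tag_with_torch_200 = vllm_own_entries + dep_contribution
tag_with_torch_210 = vllm_own_entries + dep_contribution
print(tag_with_torch_200, tag_with_torch_210)  # 7 7
```

Both rebuilds land on the same tag, so swapping the torch version alone would not produce a distinguishable vLLM wheel under that scheme.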



Development

Successfully merging this pull request may close these issues.

trigger rebuilds of packages when one of their build time dependency is rebuilt

3 participants