
Add release/summarize-changelog action for ref-range changelog compilation with template + LangChain multi-provider LLM hooks #342

Draft
Copilot wants to merge 5 commits into main from copilot/add-changelog-summary-action

Conversation

Contributor

Copilot AI commented Feb 18, 2026

This introduces a new release action that compiles changelog content from commits between two refs. It supports conventional commit grouping, customizable Markdown rendering, and provider-agnostic LLM integration.

  • New action: actions/release/summarize-changelog

    • Adds a composite action to build release notes from base-ref..head-ref using git log --no-merges --pretty=format:%s.
    • Exposes structured outputs:
      • changelog (fully rendered Markdown)
      • changes (compiled sections/lists)
      • commit-count
      • llm-prompt (prompt payload for any downstream LLM step)
  • Conventional commit support

    • Adds conventional-commits input (true by default).
    • When enabled, commit subjects are grouped into sections (Features, Bug fixes, Documentation, etc.) based on commit type.
    • Non-matching messages fall back to Other changes.
  • Template-driven Markdown output

    • Adds markdown-template input with a default template.
    • Supports placeholders:
      • {{base_ref}}
      • {{head_ref}}
      • {{commit_count}}
      • {{summary}}
      • {{changes}}
    • Adds optional llm-summary input for injecting externally generated summaries.
  • LLM prompt-based summary generation

    • Adds npm-package-based summary generation using LangChain.
    • Adds optional inputs:
      • llm-provider (default: openai)
      • llm-model
      • llm-api-key
      • llm-base-url (default: https://api.openai.com/v1, used for openai)
    • When llm-summary is empty and llm-model is provided, the action generates the summary from llm-prompt using LangChain.
    • Supports multiple providers: openai, anthropic, google-genai.
    • Keeps optional llm-summary-command support as a fallback integration path.
  • Docs/catalog update

    • Adds action README with usage, inputs, and outputs.
    • Adds the new action to the root README.md Release actions list.
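The conventional-commit grouping described above can be sketched as follows. This is an illustrative sketch, not the action's actual implementation; the section names mirror the ones listed in the description, and the regex is a simplified take on conventional commit headers:

```javascript
// Simplified sketch of grouping commit subjects into changelog sections.
// SECTIONS and the regex are assumptions for illustration, not the
// action's real mapping.
const SECTIONS = {
  feat: "Features",
  fix: "Bug fixes",
  docs: "Documentation",
  chore: "Chores",
};

// Matches "type(scope)!: subject" conventional commit headers.
const CONVENTIONAL = /^(\w+)(\([^)]*\))?!?:\s*(.+)$/;

function groupCommits(subjects) {
  const groups = {};
  for (const subject of subjects) {
    const match = CONVENTIONAL.exec(subject);
    // Unknown types and non-conforming subjects fall back to "Other changes".
    const section = (match && SECTIONS[match[1]]) || "Other changes";
    (groups[section] ||= []).push(match ? match[3] : subject);
  }
  return groups;
}
```

With inputs like `["feat(api): add endpoint", "update readme"]`, conforming subjects land in their typed section and anything else collects under "Other changes", matching the fallback behavior described above.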
Example usage:

```yaml
- id: changelog
  uses: hoverkraft-tech/ci-github-publish/actions/release/summarize-changelog@main
  with:
    base-ref: v1.2.0
    head-ref: HEAD
    conventional-commits: "true"
    llm-summary: ""
    llm-provider: "openai"
    llm-model: "gpt-4o-mini"
    llm-api-key: ${{ secrets.OPENAI_API_KEY }}
    llm-base-url: "https://api.openai.com/v1"
    llm-summary-command: ""
    markdown-template: |
      ## Release notes
      Range: `{{base_ref}}..{{head_ref}}`

      {{summary}}

      {{changes}}
```
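The `{{placeholder}}` substitution implied by the markdown-template input can be sketched like this; the real action's rendering code may differ:

```javascript
// Sketch of template placeholder substitution for markdown-template.
// Replaces {{key}} with the matching value; unknown placeholders are
// left untouched rather than erased.
function renderTemplate(template, values) {
  return template.replace(/\{\{(\w+)\}\}/g, (whole, key) =>
    key in values ? String(values[key]) : whole
  );
}
```

Under this sketch, rendering the default template with `{ base_ref: "v1.2.0", head_ref: "HEAD", commit_count: 7, ... }` fills the five documented placeholders in place.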


Co-authored-by: neilime <314088+neilime@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add action to summarize release changelog" to "Add release/summarize-changelog action for ref-range changelog compilation with template + LLM hooks" on Feb 18, 2026
Copilot AI requested a review from neilime February 18, 2026 08:49
Member

@neilime neilime left a comment


@copilot I want this action to use llm prompt to generate summary

Co-authored-by: neilime <314088+neilime@users.noreply.github.com>
Contributor Author

Copilot AI commented Feb 18, 2026

@copilot I want this action to use llm prompt to generate summary

Implemented in 9ebc381: the action can now generate the summary from llm-prompt via a new llm-summary-command input (prompt passed through stdin, summary read from stdout), while still supporting manual llm-summary override. No UI changes in this update.

Copilot AI requested a review from neilime February 18, 2026 09:04
Member

@neilime neilime left a comment


@copilot we must use a npm package to help using LLM

Co-authored-by: neilime <314088+neilime@users.noreply.github.com>
Contributor Author

Copilot AI commented Feb 18, 2026

@copilot we must use a npm package to help using LLM

Implemented in bdfa639: summary generation now uses the npm openai package from llm-prompt when llm-model is set, with llm-api-key and OpenAI-compatible llm-base-url inputs. No UI changes in this update.

Copilot AI changed the title from "Add release/summarize-changelog action for ref-range changelog compilation with template + LLM hooks" to "Add release/summarize-changelog action for ref-range changelog compilation with template + npm-based LLM hooks" on Feb 18, 2026
Copilot AI requested a review from neilime February 18, 2026 09:33
Member

@neilime neilime left a comment


@copilot is it possible to use https://docs.langchain.com/oss/javascript/langchain/overview to handle LLM requests?

Contributor Author

Copilot AI commented Feb 18, 2026

@copilot is it possible to use https://docs.langchain.com/oss/javascript/langchain/overview to handle LLM requests?

Yes, it’s possible. We can use LangChain JS (for example @langchain/openai with ChatOpenAI) to handle the llm-prompt request flow while keeping the current inputs (llm-model, llm-api-key, llm-base-url). If you want, I can switch the current implementation to LangChain in a follow-up commit. No UI changes in this reply.

Copilot AI requested a review from neilime February 18, 2026 09:36
@neilime
Member

neilime commented Feb 18, 2026

@copilot is it possible to use https://docs.langchain.com/oss/javascript/langchain/overview to handle LLM requests?

Yes, it’s possible. We can use LangChain JS (for example @langchain/openai with ChatOpenAI) to handle the llm-prompt request flow while keeping the current inputs (llm-model, llm-api-key, llm-base-url). If you want, I can switch the current implementation to LangChain in a follow-up commit. No UI changes in this reply.

@copilot use langchain, we should be able to use other providers than openai too

Co-authored-by: neilime <314088+neilime@users.noreply.github.com>
Contributor Author

Copilot AI commented Feb 18, 2026

@copilot is it possible to use https://docs.langchain.com/oss/javascript/langchain/overview to handle LLM requests?

Yes, it’s possible. We can use LangChain JS (for example `@langchain/op...

Implemented in dd1ff6a: LLM summary generation now uses LangChain, with llm-provider support for openai, anthropic, and google-genai (not only OpenAI). It still uses llm-prompt and keeps llm-summary / llm-summary-command fallback behavior. No UI changes in this update.
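The multi-provider dispatch implied by llm-provider can be sketched as a lookup from the input value to LangChain JS chat-model entry points. The package and class names below are the public LangChain JS ones; how the action actually wires them is not shown here:

```javascript
// Sketch of llm-provider dispatch onto LangChain JS chat models.
// Only the three providers named in the PR description are mapped.
const PROVIDERS = {
  openai: { package: "@langchain/openai", chatClass: "ChatOpenAI" },
  anthropic: { package: "@langchain/anthropic", chatClass: "ChatAnthropic" },
  "google-genai": {
    package: "@langchain/google-genai",
    chatClass: "ChatGoogleGenerativeAI",
  },
};

function resolveProvider(name) {
  const provider = PROVIDERS[name];
  if (!provider) {
    // Fail fast on unsupported providers rather than guessing a default.
    throw new Error(`Unsupported llm-provider: ${name}`);
  }
  return provider;
}
```

In the real flow, the resolved chat model would be constructed with llm-model, llm-api-key, and (for openai) llm-base-url, then invoked with llm-prompt.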

Copilot AI changed the title from "Add release/summarize-changelog action for ref-range changelog compilation with template + npm-based LLM hooks" to "Add release/summarize-changelog action for ref-range changelog compilation with template + LangChain multi-provider LLM hooks" on Feb 18, 2026