Co-authored-by: neilime <314088+neilime@users.noreply.github.com>
release/summarize-changelog action for ref-range changelog compilation with template + LLM hooks
Implemented in |
Title changed from "release/summarize-changelog action for ref-range changelog compilation with template + LLM hooks" to "release/summarize-changelog action for ref-range changelog compilation with template + npm-based LLM hooks"
neilime left a comment
@copilot is it possible to use https://docs.langchain.com/oss/javascript/langchain/overview to handle LLM requests?
Yes, it’s possible. We can use LangChain JS (for example |
@copilot use langchain, we should be able to use other providers than openai too |
Implemented in |
Title changed from "release/summarize-changelog action for ref-range changelog compilation with template + npm-based LLM hooks" to "release/summarize-changelog action for ref-range changelog compilation with template + LangChain multi-provider LLM hooks"
This introduces a new release action that compiles changelog content from commits between two refs. It supports conventional commit grouping, customizable Markdown rendering, and provider-agnostic LLM integration.
**New action: `actions/release/summarize-changelog`**

- Compiles the commits in `base-ref..head-ref` using `git log --no-merges --pretty=format:%s`.
- Outputs:
  - `changelog` (fully rendered Markdown)
  - `changes` (compiled sections/lists)
  - `commit-count`
  - `llm-prompt` (prompt payload for any downstream LLM step)

**Conventional commit support**

- `conventional-commits` input (`true` by default).
- Commits are grouped into sections (`Features`, `Bug fixes`, `Documentation`, etc.) based on commit type.
- Non-conventional commits are listed under `Other changes`.

**Template-driven Markdown output**

- `markdown-template` input with a default template.
- Supported placeholders: `{{base_ref}}`, `{{head_ref}}`, `{{commit_count}}`, `{{summary}}`, `{{changes}}`.
- `llm-summary` input for injecting externally generated summaries.

**LLM prompt-based summary generation**

- Inputs: `llm-provider` (default: `openai`), `llm-model`, `llm-api-key`, `llm-base-url` (default: `https://api.openai.com/v1`, used for `openai`).
- When `llm-summary` is empty and `llm-model` is provided, the action generates the summary from `llm-prompt` using LangChain.
- Supported providers: `openai`, `anthropic`, `google-genai`.
- Keeps `llm-summary-command` support as a fallback integration path.

**Docs/catalog update**

- Added the action to the `README.md` Release actions list.
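The ref-range collection step described above can be sketched as follows. This is a minimal illustration, not the action's actual source; `listCommitSubjects` and `parseSubjects` are hypothetical helper names, but the `git log` invocation mirrors the one named in the description.

```typescript
import { execFileSync } from "node:child_process";

// Hypothetical helper: list commit subjects in base..head, mirroring the
// `git log --no-merges --pretty=format:%s` call described in the action.
function listCommitSubjects(baseRef: string, headRef: string): string[] {
  const out = execFileSync(
    "git",
    ["log", "--no-merges", "--pretty=format:%s", `${baseRef}..${headRef}`],
    { encoding: "utf8" },
  );
  return parseSubjects(out);
}

// Pure helper: split raw `git log` output into non-empty subject lines.
function parseSubjects(raw: string): string[] {
  return raw
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}
```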
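The conventional-commit grouping could look roughly like this. The section titles (`Features`, `Bug fixes`, `Documentation`, `Other changes`) come from the description; the exact type-to-title mapping and regex are assumptions for illustration.

```typescript
// Assumed mapping from conventional-commit type to section title.
const SECTION_TITLES: Record<string, string> = {
  feat: "Features",
  fix: "Bug fixes",
  docs: "Documentation",
};

function groupBySection(subjects: string[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const subject of subjects) {
    // Match `type(scope)!: description` conventional-commit subjects.
    const m = subject.match(/^(\w+)(\([^)]*\))?!?:\s*(.+)$/);
    // Unknown types and non-conventional commits fall under "Other changes".
    const title = (m && SECTION_TITLES[m[1]]) || "Other changes";
    const list = groups.get(title) ?? [];
    list.push(m ? m[3] : subject);
    groups.set(title, list);
  }
  return groups;
}
```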
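Template rendering with the `{{placeholder}}` syntax listed above can be sketched in a few lines. The default template itself is not shown in the description, so the template string here is made up.

```typescript
// Substitute {{key}} placeholders with values; unknown keys become "".
function renderTemplate(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) => values[key] ?? "");
}

// Example with the placeholders the action supports (template is invented).
const rendered = renderTemplate(
  "## Changes {{base_ref}}..{{head_ref}} ({{commit_count}} commits)\n\n{{summary}}\n\n{{changes}}",
  {
    base_ref: "v1.0.0",
    head_ref: "v1.1.0",
    commit_count: "12",
    summary: "Adds an endpoint.",
    changes: "- add endpoint",
  },
);
```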
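Provider-agnostic LLM selection amounts to mapping the `llm-provider` input onto a LangChain JS integration package. The provider ids come from the description; the package names are the real LangChain JS integrations, but the dynamic-import wiring is an assumption about how the action resolves them.

```typescript
// Map supported llm-provider ids to LangChain JS integration packages.
const PROVIDER_PACKAGES: Record<string, string> = {
  "openai": "@langchain/openai",             // exports ChatOpenAI
  "anthropic": "@langchain/anthropic",       // exports ChatAnthropic
  "google-genai": "@langchain/google-genai", // exports ChatGoogleGenerativeAI
};

function resolveProviderPackage(provider: string): string {
  const pkg = PROVIDER_PACKAGES[provider];
  if (!pkg) {
    throw new Error(`Unsupported llm-provider: ${provider}`);
  }
  return pkg;
}
```

The action would then presumably `await import(...)` the resolved package, instantiate the chat model with `llm-model`, `llm-api-key`, and (for `openai`) `llm-base-url`, and invoke it with `llm-prompt` to produce the summary.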