
Nightly report: post details as follow-up comments instead of truncating #1239

Merged
aaronpowell merged 1 commit into github:staged from JanKrivanek:dev/jankrivanek/split-nightly-report
Mar 31, 2026

Conversation

@JanKrivanek
Contributor

Context

When the full report exceeds GitHub's 65K body limit, the summary table stays in the discussion/issue body and the verbose skill/agent output is posted as follow-up comments (split into chunks if needed). This ensures no output is lost.

e.g. #1235
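The splitting decision described above can be sketched roughly as follows; `splitReport` and the exact `MAX_BYTES` constant are illustrative names for this sketch, not the workflow's actual code:

```javascript
// Illustrative sketch only: decide whether the full report fits inline,
// or whether verbose sections must become follow-up comments.
const MAX_BYTES = 65000; // stay safely under GitHub's ~65K body limit

function splitReport(summaryTable, verboseSections) {
  const full = [summaryTable, ...verboseSections].join('\n\n');
  if (Buffer.byteLength(full, 'utf8') <= MAX_BYTES) {
    // Everything fits: inline the full report in the discussion/issue body
    return { body: full, parts: [] };
  }
  // Too big: keep only the summary table in the body and move the
  // verbose sections out as comment "parts"
  return { body: summaryTable, parts: verboseSections };
}
```

Measuring with `Buffer.byteLength` rather than `String.prototype.length` matters here, since the limit is effectively a byte budget and report output can contain multi-byte characters.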

Copilot AI review requested due to automatic review settings March 31, 2026 06:01
Contributor

Copilot AI left a comment


Pull request overview

Adjusts the nightly Skill Quality Report workflow to avoid losing output when the GitHub Discussion/Issue body exceeds the ~65KB limit by keeping a compact summary in the main post and publishing verbose sections as follow-up comments (chunked if necessary).

Changes:

  • Adds byte-size aware logic to decide whether to inline the full report or move verbose sections into comment “parts”.
  • Updates the Discussion creation step to post any overflow parts as Discussion comments via GraphQL.
  • Updates the Issue fallback path to post overflow parts as Issue comments via gh issue comment.
Comments suppressed due to low confidence (1)

.github/workflows/skill-quality-report.yml:362

  • The discussion comment loop assumes every report-comment-${i}.md exists and will throw on readFileSync if any file is missing/corrupt, which would fail the step and trigger the issue fallback. Add an existence check (similar to the issue fallback loop) and decide whether missing parts should be skipped or treated as a hard failure.
            // Post overflow detail comments
            for (let i = 0; i < commentCount; i++) {
              const commentBody = fs.readFileSync(`report-comment-${i}.md`, 'utf8');
              await github.graphql(`
                mutation($discussionId: ID!, $body: String!) {

Comment on lines +259 to +277
function chunkContent(label, content) {
  const prefix = `## ${label}\n\n\`\`\`\n`;
  const suffix = '\n```';
  const overhead = Buffer.byteLength(prefix + suffix, 'utf8');
  const budget = MAX_BYTES - overhead;

  const buf = Buffer.from(content, 'utf8');
  if (buf.length <= budget) {
    return [prefix + content + suffix];
  }

  const parts = [];
  let offset = 0;
  let partNum = 1;
  while (offset < buf.length) {
    const slice = buf.slice(offset, offset + budget).toString('utf8');
    // Remove trailing replacement char from mid-codepoint cut
    const clean = slice.replace(/\uFFFD$/, '');
    const hdr = `## ${label} (part ${partNum})\n\n\`\`\`\n`;
    parts.push(hdr + clean + suffix);

Copilot AI Mar 31, 2026


chunkContent() computes budget using prefix/suffix overhead, but when splitting it actually uses a larger header (hdr includes (part N)). That means each chunk can exceed MAX_BYTES by the extra header bytes, risking GraphQL failures due to comment body size. Compute the per-part budget using the actual hdr + suffix byte length (or reserve for the worst-case header size) before slicing.

This issue also appears on line 358 of the same file.
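Reserving for the worst-case header, as the comment suggests, could be sketched like this; `partBudget` and the 99-part cap are illustrative choices, not the workflow's actual code:

```javascript
// Illustrative fix sketch: compute the per-part budget from the largest
// header that could realistically be emitted (including the "(part N)"
// suffix), rather than from the plain part-less prefix.
function partBudget(label, maxBytes, maxParts = 99) {
  const worstHeader = `## ${label} (part ${maxParts})\n\n\`\`\`\n`;
  const suffix = '\n```';
  return maxBytes - Buffer.byteLength(worstHeader + suffix, 'utf8');
}
```

Slicing with this budget keeps every chunk, including later parts that carry the longer `(part N)` header, within `maxBytes`.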

Comment on lines +359 to +371
for (let i = 0; i < commentCount; i++) {
  const commentBody = fs.readFileSync(`report-comment-${i}.md`, 'utf8');
  await github.graphql(`
    mutation($discussionId: ID!, $body: String!) {
      addDiscussionComment(input: {
        discussionId: $discussionId,
        body: $body
      }) {
        comment { id }
      }
    }
  `, { discussionId, body: commentBody });
  console.log(`Posted detail comment ${i + 1}/${commentCount}`);

Copilot AI Mar 31, 2026


The step is continue-on-error: true and the script posts follow-up comments after creating the discussion. If any addDiscussionComment call fails (size/rate limits/transient error), the step will fail and the fallback issue step will run even though the discussion was already created, leading to duplicate report posts. Consider catching errors during comment posting (and logging) so the step can still succeed once the discussion exists, or otherwise gate the fallback on whether the discussion was created.

Suggested change
for (let i = 0; i < commentCount; i++) {
  const commentBody = fs.readFileSync(`report-comment-${i}.md`, 'utf8');
  await github.graphql(`
    mutation($discussionId: ID!, $body: String!) {
      addDiscussionComment(input: {
        discussionId: $discussionId,
        body: $body
      }) {
        comment { id }
      }
    }
  `, { discussionId, body: commentBody });
  console.log(`Posted detail comment ${i + 1}/${commentCount}`);

try {
  for (let i = 0; i < commentCount; i++) {
    const commentBody = fs.readFileSync(`report-comment-${i}.md`, 'utf8');
    await github.graphql(`
      mutation($discussionId: ID!, $body: String!) {
        addDiscussionComment(input: {
          discussionId: $discussionId,
          body: $body
        }) {
          comment { id }
        }
      }
    `, { discussionId, body: commentBody });
    console.log(`Posted detail comment ${i + 1}/${commentCount}`);
  }
} catch (error) {
  core.warning(`One or more discussion comments failed to post: ${error.message || error}`);
  console.error(error);
}

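The alternative the reviewer mentions, gating the fallback on whether the discussion step succeeded rather than relying on `continue-on-error` alone, could look roughly like the following workflow fragment. The step names and ids here are hypothetical, not the actual workflow's:

```yaml
# Hypothetical sketch: the fallback runs only if the discussion step failed.
# With comment posting wrapped in try/catch inside the script, the step
# outcome stays 'success' once the discussion itself was created.
- name: Post report as discussion
  id: post_discussion
  continue-on-error: true
  uses: actions/github-script@v7
  with:
    script: |
      // create the discussion, then post overflow comments in a try/catch

- name: Fallback - post report as issue
  if: steps.post_discussion.outcome == 'failure'
  run: gh issue create --title "Nightly Skill Quality Report" --body-file report.md
```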
aaronpowell merged commit 35d73e0 into github:staged Mar 31, 2026
11 of 12 checks passed

3 participants