Skip to content

Conversation

Copy link
Contributor

Copilot AI commented Dec 3, 2025

Why make this change?

DAB does not currently support response compression (gzip/brotli), which can reduce payload sizes by ~79% for large JSON responses. This is especially beneficial for REST and GraphQL endpoints over slow or high-latency networks.

What is this change?

Adds response compression middleware with configurable compression levels.

Configuration Model:

  • CompressionLevel enum: optimal, fastest, none
  • CompressionOptions record with JSON converter and validation
  • Default: optimal

Runtime Behavior:

  • Gzip and Brotli providers enabled for HTTPS
  • Applies to REST, GraphQL, and MCP endpoints
  • Logs compression level on startup (silent when none)
  • Properly handles unknown properties in JSON configuration

Schema & CLI:

{
  "runtime": {
    "compression": {
      "level": "optimal"
    }
  }
}

How was this tested?

  • Integration Tests
  • Unit Tests

Unit Tests:

  • Added 3 tests in ConfigureOptionsTests.cs covering all compression levels (fastest, optimal, none)

Integration Tests:

  • Added 7 tests in CompressionIntegrationTests.cs validating:
    • Gzip compression works when Accept-Encoding header is present
    • Brotli compression works
    • Compression reduces payload size by at least 10%
    • Compression is disabled when level is None
    • No compression without Accept-Encoding header
    • Fastest compression level works correctly
    • Compressed content can be decompressed and is valid JSON

All tests pass successfully and verify that compression reduces payload sizes without breaking existing functionality.

Sample Request(s)

CLI Configuration:

dab configure --runtime.compression.level fastest

Sample Request/Response:

GET /api/employees HTTP/1.1
Accept-Encoding: gzip, br

HTTP/1.1 200 OK
Content-Encoding: gzip
Content-Type: application/json

Startup Output (when enabled):

Response compression enabled with level 'Optimal' for REST, GraphQL, and MCP endpoints.

💬 We'd love your input! Share your thoughts on Copilot coding agent in our 2 minute survey.

Copilot AI and others added 2 commits December 3, 2025 21:33
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Copilot AI changed the title [WIP] Add support for response compression in Data API Add HTTP Response Compression Support Dec 3, 2025
Copilot AI requested a review from JerryNixon December 3, 2025 21:45
@JerryNixon JerryNixon marked this pull request as ready for review December 4, 2025 03:05
Copilot AI review requested due to automatic review settings December 4, 2025 03:05
@JerryNixon JerryNixon enabled auto-merge (squash) December 4, 2025 03:05
@JerryNixon JerryNixon moved this from Todo to Review In Progress in Data API builder Dec 4, 2025
@JerryNixon JerryNixon added this to the Dec 2025 milestone Dec 4, 2025
Copy link
Contributor

Copilot AI left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Pull request overview

This PR adds HTTP response compression support to Data API Builder, enabling Gzip and Brotli compression for REST, GraphQL, and MCP endpoints to reduce payload sizes and improve performance over slow networks.

Key Changes:

  • Added configurable compression middleware with three levels: optimal (default), fastest, and none
  • Implemented JSON schema, CLI commands, and configuration model for compression settings
  • Added unit tests for CLI configuration updates

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 3 comments.

Show a summary per file
File Description
src/Service/Startup.cs Adds ConfigureResponseCompression method to register Gzip and Brotli providers, and calls UseResponseCompression middleware when compression is enabled
src/Config/ObjectModel/CompressionOptions.cs Defines the CompressionOptions record with Level property and UserProvidedLevel flag for JSON serialization control
src/Config/ObjectModel/CompressionLevel.cs Defines the CompressionLevel enum with Optimal, Fastest, and None values
src/Config/Converters/CompressionOptionsConverterFactory.cs Implements custom JSON converter for reading and writing compression configuration with validation
src/Config/RuntimeConfigLoader.cs Registers the CompressionOptionsConverterFactory in the serialization options
src/Config/ObjectModel/RuntimeOptions.cs Adds Compression property to runtime configuration options
src/Cli/ConfigGenerator.cs Implements CLI configuration update logic for compression level settings
src/Cli/Commands/ConfigureOptions.cs Adds RuntimeCompressionLevel command-line option for the dab configure command
src/Cli.Tests/ConfigureOptionsTests.cs Adds test cases for all three compression levels (Optimal, Fastest, None)
schemas/dab.draft.schema.json Adds JSON schema definition for the compression configuration section

💡 Add Copilot custom instructions for smarter, more guided reviews. Learn how to get started.

}
}
}
}
Copy link

Copilot AI Dec 4, 2025

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The converter doesn't properly handle unknown properties with complex values (objects or arrays). After reading a property name on line 58, if the property is not "level", the code continues the loop without skipping the property value. This works for simple values but could cause parsing issues if an unknown property contains an object or array.

Add a default case to properly skip unknown properties:

if (string.Equals(propertyName, "level", StringComparison.OrdinalIgnoreCase))
{
    string? levelStr = reader.GetString();
    if (levelStr is not null)
    {
        if (Enum.TryParse<CompressionLevel>(levelStr, ignoreCase: true, out CompressionLevel parsedLevel))
        {
            level = parsedLevel;
            userProvidedLevel = true;
        }
        else
        {
            throw new JsonException($"Invalid compression level: '{levelStr}'. Valid values are: optimal, fastest, none.");
        }
    }
}
else
{
    // Skip unknown properties
    reader.Skip();
}
Suggested change
}
else
{
// Skip unknown properties and their values (including objects/arrays)
reader.Skip();
}

Copilot uses AI. Check for mistakes.
Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Added the else block to properly skip unknown properties including objects and arrays using reader.Skip(). Fixed in commit c53261c.

/// <summary>
/// Tests that running "dab configure --runtime.compression.level {value}" on a config with various values results
/// in runtime config update. Takes in updated value for compression.level and
/// validates whether the runtime config reflects those updated values
Copy link

Copilot AI Dec 4, 2025

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Missing closing period in the documentation comment. The summary should end with a period for consistency with other test documentation in the file.

Suggested change
/// validates whether the runtime config reflects those updated values
/// validates whether the runtime config reflects those updated values.

Copilot uses AI. Check for mistakes.
Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Added the missing period to the documentation comment. Fixed in commit c53261c.

Comment on lines +470 to +505
private void ConfigureResponseCompression(IServiceCollection services, RuntimeConfig? runtimeConfig)
{
CompressionLevel compressionLevel = runtimeConfig?.Runtime?.Compression?.Level ?? CompressionOptions.DEFAULT_LEVEL;

// Only configure compression if level is not None
if (compressionLevel == CompressionLevel.None)
{
return;
}

System.IO.Compression.CompressionLevel systemCompressionLevel = compressionLevel switch
{
CompressionLevel.Fastest => System.IO.Compression.CompressionLevel.Fastest,
CompressionLevel.Optimal => System.IO.Compression.CompressionLevel.Optimal,
_ => System.IO.Compression.CompressionLevel.Optimal
};

services.AddResponseCompression(options =>
{
options.EnableForHttps = true;
options.Providers.Add<Microsoft.AspNetCore.ResponseCompression.GzipCompressionProvider>();
options.Providers.Add<Microsoft.AspNetCore.ResponseCompression.BrotliCompressionProvider>();
});

services.Configure<Microsoft.AspNetCore.ResponseCompression.GzipCompressionProviderOptions>(options =>
{
options.Level = systemCompressionLevel;
});

services.Configure<Microsoft.AspNetCore.ResponseCompression.BrotliCompressionProviderOptions>(options =>
{
options.Level = systemCompressionLevel;
});

_logger.LogInformation("Response compression enabled with level '{compressionLevel}' for REST, GraphQL, and MCP endpoints.", compressionLevel);
}
Copy link

Copilot AI Dec 4, 2025

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The response compression middleware lacks integration test coverage. While unit tests verify the CLI configuration, there are no tests validating that:

  1. The compression middleware is actually registered when configured
  2. Responses are compressed when clients send Accept-Encoding headers
  3. Compression is disabled when the level is set to None
  4. Different compression levels (Optimal vs Fastest) behave as expected

Consider adding integration tests similar to CorrelationIdTests.cs that verify the middleware behavior with actual HTTP requests and responses.

Copilot uses AI. Check for mistakes.
Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Added 7 integration tests in CompressionIntegrationTests.cs that verify:

  1. Gzip compression works when Accept-Encoding header is present
  2. Brotli compression works
  3. Compression reduces payload size significantly (10%+ reduction)
  4. Compression is disabled when level is None
  5. No compression without Accept-Encoding header
  6. Fastest compression level works correctly
  7. Compressed content can be decompressed and is valid JSON

All tests pass. Fixed in commit c53261c.

@aaronburtle aaronburtle self-assigned this Dec 15, 2025
@anushakolan
Copy link
Contributor

Can we add an integration test to verify if this compression is actually reducing the payload sizes and it is not breaking the existing code?

Copy link
Contributor

@anushakolan anushakolan left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

LGTM, but waiting on the integration test to be added.

}
catch (Exception ex)
{
_logger.LogError("Failed to update RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);
Copy link
Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Suggested change
_logger.LogError("Failed to update RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);
_logger.LogError("Failed to configure RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);

Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Updated error message to use "configure" instead of "update" for consistency. Fixed in commit c53261c.

@anushakolan
Copy link
Contributor

@copilot Please address this, Can we add an integration test to verify if this compression is actually reducing the payload sizes and it is not breaking the existing code?

…ing, and add compression integration tests

Co-authored-by: anushakolan <45540936+anushakolan@users.noreply.github.com>
Copy link
Contributor Author

Copilot AI commented Jan 16, 2026

@copilot Please address this, Can we add an integration test to verify if this compression is actually reducing the payload sizes and it is not breaking the existing code?

Added 7 integration tests that verify compression reduces payload sizes and doesn't break existing functionality. The tests validate:

  • Compression works with both Gzip and Brotli
  • Payload size reduction is measured and verified (compression achieves 10%+ reduction)
  • Compression can be disabled (level=None)
  • Content can be decompressed correctly
  • Different compression levels work as expected

All tests pass successfully. The integration test that measures payload reduction shows actual compression ratios in the console output. Fixed in commit c53261c.

Warning

Firewall rules blocked me from connecting to one or more addresses (expand for details)

I tried to connect to the following addresses, but was blocked by firewall rules:

  • http://crl3.digicert.com:443/DigiCertTrustedG4RSA4096SHA256TimeStampingCA.crl
    • Triggering command: /usr/bin/dotnet dotnet build Azure.DataApiBuilder.sln --verbosity minimal (http block)
  • http://crl3.digicert.com:443/sha2-assured-cs-g1.crl
    • Triggering command: /usr/bin/dotnet dotnet build Azure.DataApiBuilder.sln --verbosity minimal (http block)

If you need me to access, download, or install something from one of these locations, you can either:

Copilot AI and others added 2 commits January 16, 2026 06:14
Co-authored-by: anushakolan <45540936+anushakolan@users.noreply.github.com>
Co-authored-by: anushakolan <45540936+anushakolan@users.noreply.github.com>
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Projects

Status: Review In Progress

Development

Successfully merging this pull request may close these issues.

5 participants