This guide explains how to set up the development environment and extend InfiniMetrics by adding new adapters and metrics.
This project uses pre-commit to automatically enforce code style (Black) and linting (Flake8) before every commit.
First, install pre-commit:

```shell
pip install pre-commit
```

Then run the following command in the project root directory to activate the git hooks:

```shell
pre-commit install
```

The output should be: `pre-commit installed at .git/hooks/pre-commit`
- Automatic: just run `git commit` as usual.
- If Black modifies your files: run `git add .` again and re-commit.
- If Flake8 reports errors: fix the errors, run `git add`, and re-commit.
- Manual: to check all files in the repository without committing, run:

  ```shell
  pre-commit run --all-files
  ```
If you need to bypass the checks for a specific commit:
```shell
git commit -m "your message" --no-verify
```

Create a new adapter class by inheriting from `BaseAdapter`:
```python
from infinimetrics.adapter import BaseAdapter

class MyCustomAdapter(BaseAdapter):
    def __init__(self, config):
        super().__init__(config)
        # Initialize your adapter
        self.device = config.get('device', 'nvidia')

    def setup(self):
        # Prepare the test environment
        print(f"Setting up {self.__class__.__name__}")
        # Load models, allocate memory, etc.

    def process(self, test_input):
        # Execute the test and return metrics
        results = {
            "my.metric": {
                "value": 42.0,
                "unit": "operations/s"
            }
        }
        return results

    def teardown(self):
        # Clean up resources
        print(f"Tearing down {self.__class__.__name__}")
        # Free memory, close connections, etc.
```

Add your adapter to the adapter registry in `dispatcher.py`:
```python
# In dispatcher.py
self.adapter_registry = {
    ("operator", "myframework"): MyCustomAdapter,
    # ... existing adapters ...
}
```

Create a JSON configuration file:
```json
{
    "run_id": "my_test",
    "testcase": "operator.myframework.MyTest",
    "config": {
        "device": "nvidia",
        "iterations": 100
    },
    "metrics": [
        {"name": "my.metric"}
    ]
}
```

Then run the test:

```shell
python main.py my_test_config.json
```

To define a custom metric, add a class in `infinimetrics/common/metrics.py`:
```python
class CustomMetric(Metric):
    def __init__(self, name: str, value: float, unit: str = ""):
        super().__init__(name, value, unit)

    def to_dict(self):
        return {
            "name": self.name,
            "value": self.value,
            "unit": self.unit,
            "timestamp": self.timestamp
        }
```

Then use it in your adapter's `process` method:
```python
def process(self, test_input):
    metric = CustomMetric("custom.metric", 123.45, "ms")
    return {"custom.metric": metric.to_dict()}
```

Adapters must implement the following methods:

| Method | Description | Required |
|---|---|---|
| `__init__(config)` | Initialize the adapter with its configuration | Yes |
| `setup()` | Prepare the test environment | Yes |
| `process(test_input)` | Execute the test and return metrics | Yes |
| `teardown()` | Clean up resources | Yes |
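The lifecycle in the table above can be sketched end to end. The `BaseAdapter` stub below is a hypothetical stand-in for illustration only; the real base class lives in `infinimetrics.adapter`.

```python
# Hypothetical stand-in for infinimetrics.adapter.BaseAdapter,
# used here only so the sketch is self-contained.
class BaseAdapter:
    def __init__(self, config):
        self.config = config

class DemoAdapter(BaseAdapter):
    def setup(self):
        # Prepare the test environment (load models, allocate memory, ...)
        self.ready = True

    def process(self, test_input):
        # Execute the test and return metrics in the expected shape
        return {"demo.metric": {"value": 42.0, "unit": "operations/s"}}

    def teardown(self):
        # Release resources
        self.ready = False

# Drive the full lifecycle in the order the dispatcher would call it
adapter = DemoAdapter({"device": "nvidia"})
adapter.setup()
results = adapter.process(None)
adapter.teardown()
print(results["demo.metric"]["value"])  # 42.0
```

The key point is the call order: `setup()` before any `process()` call, `teardown()` after the last one.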
Test input files use the following structure:

```json
{
    "run_id": "unique_identifier",
    "testcase": "category.framework.test_name",
    "config": {...},
    "metrics": [...]
}
```

Metric results use the following structure:

```json
{
    "metric.name": {
        "value": 42.0,
        "unit": "unit_name"
    }
}
```

The repository is organized as follows:

```
infinimetrics/
├── hardware/        # Hardware test adapters
├── operators/       # Operator test adapters
├── inference/       # Inference test adapters
├── communication/   # Communication test adapters
└── common/          # Shared utilities
```
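The test input and metric output formats shown above can be checked with a small validator sketch. The helpers here are hypothetical and not part of InfiniMetrics; they only illustrate the required keys.

```python
import json

def validate_test_input(doc: dict) -> bool:
    # A test input needs run_id, testcase, config, and metrics
    return all(k in doc for k in ("run_id", "testcase", "config", "metrics"))

def validate_metric_output(doc: dict) -> bool:
    # Every metric entry needs a value and a unit
    return all("value" in m and "unit" in m for m in doc.values())

test_input = json.loads("""
{
  "run_id": "my_test",
  "testcase": "operator.myframework.MyTest",
  "config": {"device": "nvidia", "iterations": 100},
  "metrics": [{"name": "my.metric"}]
}
""")
metric_output = {"my.metric": {"value": 42.0, "unit": "operations/s"}}

print(validate_test_input(test_input))      # True
print(validate_metric_output(metric_output))  # True
```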
- Adapter files: `{framework}_adapter.py`
- Adapter classes: `{Framework}Adapter` (e.g., `InfiniCoreAdapter`)
- Test cases: `<category>.<framework>.<TestName>`
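Putting the test case convention together with the adapter registry shown earlier, a dispatcher can derive its lookup key from the test case ID. The parsing helper below is a sketch, not the actual `dispatcher.py` code.

```python
def registry_key(testcase: str) -> tuple:
    # "operator.myframework.MyTest" -> ("operator", "myframework")
    category, framework, _test_name = testcase.split(".", 2)
    return (category, framework)

class MyCustomAdapter:  # placeholder for a real adapter class
    pass

adapter_registry = {
    ("operator", "myframework"): MyCustomAdapter,
}

key = registry_key("operator.myframework.MyTest")
adapter_cls = adapter_registry[key]
print(adapter_cls.__name__)  # MyCustomAdapter
```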
- Error Handling: Always wrap critical operations in try-except blocks
- Logging: Use Python's logging module for debug output
- Resource Management: Ensure `teardown()` properly releases all resources
- Configuration: Provide sensible defaults for all config parameters
- Documentation: Add docstrings to all public methods
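The first three practices can be combined in one pattern: run `process` under a try-except with logging, and call `teardown()` in a `finally` block so resources are released even when the test fails. A minimal sketch (the adapter here is a hypothetical stub):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("infinimetrics.demo")

class FlakyAdapter:
    def setup(self):
        self.open = True

    def process(self, test_input):
        # Simulate a failing test run
        raise RuntimeError("device not found")

    def teardown(self):
        self.open = False

adapter = FlakyAdapter()
adapter.setup()
results = {}
try:
    results = adapter.process(None)
except Exception:
    # Log the failure instead of crashing the whole run
    logger.exception("process failed")
finally:
    # teardown() runs even when process() raises
    adapter.teardown()

print(adapter.open)  # False
print(results)       # {}
```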
To test your adapter:

- Create a test configuration file
- Run with the `--verbose` flag for detailed output
- Check the output directory for `metrics.json`
- Verify the logs for any errors

```shell
python main.py test_config.json --verbose
```

When contributing adapters or metrics:
- Follow existing code style
- Add documentation for new features
- Include example configurations
- Update relevant documentation files
For questions or discussions, please open an issue on GitHub.