feat: basic definition of metrics #175

Merged
Seddryck merged 3 commits into main from feat/metrics on May 9, 2025

Conversation

@Seddryck (Owner) commented May 9, 2025

Summary by CodeRabbit

  • New Features
    • Introduced support for defining metrics in data package schemas, including a new Metric class and a Metrics property on schemas.
    • Added example data packages in both JSON and YAML formats that include metric definitions.
  • Tests
    • Added tests to validate deserialization of data packages with metrics in both JSON and YAML formats.
  • Documentation
    • Improved XML documentation formatting for field properties.
  • Chores
    • Updated project configuration to manage embedded resources and exclude certain files from compilation.
    • Refactored schema URL handling in a script for improved maintainability.

@coderabbitai bot (Contributor) commented May 9, 2025

Walkthrough

This update introduces a Metric class to the core model and extends the Schema class with a Metrics property. New test resources in JSON and YAML formats are added, along with tests to validate metric deserialization. Project configuration is updated to embed these new resources and exclude certain files. Minor documentation comment corrections and a script refactor are also included.
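Based on the properties the PR's tests assert on (Name, Type, Title, Description, Aggregation, Expression), the new model types might look roughly like this. This is a sketch inferred from the review, not the actual source; XML documentation and any additional members (the review also mentions dimensions) are omitted.

```csharp
using System.Collections.Generic;

namespace Packata.Core
{
    // Sketch of the new Metric class; property names are taken from the
    // values asserted in ExtensionSerializerTests, everything else is assumed.
    public class Metric
    {
        public string? Name { get; set; }
        public string? Type { get; set; }
        public string? Title { get; set; }
        public string? Description { get; set; }
        public string? Aggregation { get; set; }   // e.g. "avg", "max"
        public string? Expression { get; set; }    // e.g. a field name like "temperature"
    }

    // Schema gains a nullable collection of metric definitions:
    //     public List<Metric>? Metrics { get; set; }
}
```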

Changes

File(s) and change summary:

  • src/Packata.Core/Metric.cs: New Metric class introduced, encapsulating metric properties such as name, type, aggregation, and expression.
  • src/Packata.Core/Schema.cs: Added nullable Metrics property of type List<Metric>? to the Schema class.
  • src/Packata.Core/Field.cs: Fixed XML documentation comments by removing extraneous leading quotes.
  • src/Packata.Core.Testing/Serialization/ExtensionSerializerTests.cs: Added new test class for deserializing and validating metrics from embedded JSON/YAML resources.
  • src/Packata.Core.Testing/Serialization/Json/Resources/extension.json, src/Packata.Core.Testing/Serialization/Yaml/Resources/extension.yaml: Added new data package resource files with metric definitions for testing, in both JSON and YAML formats.
  • src/Packata.Core.Testing/Packata.Core.Testing.csproj: Updated project file to embed new resources and exclude the PathHandling directory from compilation.
  • package-delta-schemas.ps1: Refactored schema URL handling to use a base URL and simplified schema array entries.
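The metrics block in the new fixtures can be reconstructed from the values the tests assert on; roughly (field order and surrounding schema keys are illustrative):

```yaml
schema:
  metrics:
    - name: average_temperature
      type: numeric
      title: Average temperature
      description: Average temperature of the sensor
      aggregation: avg
      expression: temperature
    - name: max_temperature
      type: numeric
      title: Maximum temperature
      aggregation: max
      expression: temperature
```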

Sequence Diagram(s)

sequenceDiagram
    participant TestRunner as Test Runner
    participant ExtensionSerializerTests as ExtensionSerializerTests
    participant ResourceStream as Embedded Resource Stream
    participant Serializer as DataPackageSerializer
    participant DataPackage as DataPackage

    TestRunner->>ExtensionSerializerTests: Run Deserialize_Package_Success/Deserialize_Metrics_Success
    ExtensionSerializerTests->>ResourceStream: Retrieve embedded JSON/YAML resource
    ExtensionSerializerTests->>Serializer: Instantiate serializer (JSON/YAML)
    ExtensionSerializerTests->>Serializer: Deserialize resource stream
    Serializer->>DataPackage: Produce DataPackage object
    ExtensionSerializerTests->>DataPackage: Assert package and metrics properties

Suggested labels

new-feature

Poem

In fields of code where metrics bloom,
A rabbit hops from room to room.
With schemas new and tests in tow,
JSON and YAML both now show
How measures, metrics, all align—
In Packata’s core, the numbers shine!
🐇✨



@Seddryck changed the title from "Feat/metrics" to "feat: basic definition of metrics" on May 9, 2025
@sonarqubecloud bot commented May 9, 2025

@codecov-commenter commented May 9, 2025

Codecov Report

Attention: Patch coverage is 87.50000% with 1 line in your changes missing coverage. Please review.

Project coverage is 76.93%. Comparing base (f75fe9d) to head (6e36af7).
Report is 1 commit behind head on main.

Files with missing lines:
  • src/Packata.Core/Metric.cs: 85.71% patch coverage, 1 line missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #175      +/-   ##
==========================================
+ Coverage   76.88%   76.93%   +0.04%     
==========================================
  Files         105      106       +1     
  Lines        1787     1795       +8     
  Branches      283      283              
==========================================
+ Hits         1374     1381       +7     
- Misses        318      319       +1     
  Partials       95       95              

☔ View full report in Codecov by Sentry.

@coderabbitai bot left a comment

Actionable comments posted: 5

🧹 Nitpick comments (6)
src/Packata.Core/Metric.cs (1)

33-38: Consider adding validation for aggregation functions.

The Aggregation property could benefit from validation constraints or enumeration of allowed values (SUM, MAX, COUNT, etc.) to ensure only supported aggregation functions are used.

You could implement this by using an enum instead of a string:

-    public string? Aggregation { get; set; }
+    public AggregationType? Aggregation { get; set; }

And define the enum:

public enum AggregationType
{
    SUM,
    MAX,
    MIN,
    COUNT,
    AVG
    // Add other supported aggregation types
}
src/Packata.Core/Schema.cs (1)

52-52: Consider initializing the Metrics property with an empty list.

For consistency with the Fields property (line 13), consider initializing the Metrics property with an empty list instead of leaving it as null by default. This would provide a more consistent API experience.

-    public List<Metric>? Metrics { get; set; }
+    public List<Metric> Metrics { get; set; } = [];
src/Packata.Core.Testing/Serialization/Yaml/Resources/extension.yaml (2)

5-5: Creation date is set in the future.

The creation date is set to "2025-05-04T12:45:21Z", which is in the future. While this may be intentional for testing purposes, it's worth noting that this could cause issues if temporal validation is implemented.


32-43: Good implementation of metrics array.

The metrics array is well-structured with appropriate properties for name, type, title, description, aggregation, and expression. This aligns well with the feature goal of defining basic metrics.

However, I notice that the second metric max_temperature lacks a description property while the first one has it. Consider adding a description for consistency.

     - name: max_temperature
       type: numeric
       title: Maximum temperature
+      description: Maximum recorded temperature of the sensor
       aggregation: max
       expression: temperature
src/Packata.Core.Testing/Serialization/Json/Resources/extension.json (1)

48-64: Well-structured metrics definition.

The metrics array is properly defined with appropriate properties, matching the structure in the YAML file. Good job on keeping the definitions consistent between formats.

Similar to the YAML file, the second metric max_temperature is missing a description property while the first one has it. Consider adding a description for consistency.

                    {
                        "name": "max_temperature",
                        "type": "numeric",
                        "title": "Maximum temperature",
+                        "description": "Maximum recorded temperature of the sensor",
                        "aggregation": "max",
                        "expression": "temperature"
                    }
src/Packata.Core.Testing/Serialization/ExtensionSerializerTests.cs (1)

20-28: Resource stream retrieval could be more robust.

The method to retrieve resource streams is well-structured, but could be improved for maintainability.

    private static Stream GetDataPackageProperties(string format)
    {
-        var uformat = format.ToUpper()[0] + format.Substring(1);
+        var uformat = char.ToUpper(format[0]) + format[1..];
        var assembly = Assembly.GetExecutingAssembly();
        var resourceName = $"{assembly.GetName().Name}.Serialization.{uformat}.Resources.extension.{format}";
-        var stream = assembly.GetManifestResourceStream(resourceName)
-            ?? throw new FileNotFoundException($"The embedded file {resourceName} doesn't exist.");
+        var stream = assembly.GetManifestResourceStream(resourceName);
+        if (stream == null)
+        {
+            // For debugging purposes, list available resources
+            var resources = assembly.GetManifestResourceNames();
+            var availableResources = string.Join(", ", resources);
+            throw new FileNotFoundException($"The embedded file {resourceName} doesn't exist. Available resources: {availableResources}");
+        }
        return stream;
    }
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f75fe9d and 6e36af7.

📒 Files selected for processing (8)
  • package-delta-schemas.ps1 (2 hunks)
  • src/Packata.Core.Testing/Packata.Core.Testing.csproj (1 hunks)
  • src/Packata.Core.Testing/Serialization/ExtensionSerializerTests.cs (1 hunks)
  • src/Packata.Core.Testing/Serialization/Json/Resources/extension.json (1 hunks)
  • src/Packata.Core.Testing/Serialization/Yaml/Resources/extension.yaml (1 hunks)
  • src/Packata.Core/Field.cs (2 hunks)
  • src/Packata.Core/Metric.cs (1 hunks)
  • src/Packata.Core/Schema.cs (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/Packata.Core/Schema.cs (1)
src/Packata.Core/Metric.cs (1)
  • Metric (8-44)
🔇 Additional comments (8)
src/Packata.Core/Metric.cs (1)

1-44: New Metric class implementation looks good.

The Metric class is well-structured with properly documented properties that capture all necessary aspects of metrics: name, type, title, description, aggregation function, expression, and dimensions.

src/Packata.Core/Field.cs (3)

16-16: Documentation fix for Type property.

Removed extraneous quotes from the documentation comment.


26-26: Documentation fix for Title property.

Removed extraneous quotes from the documentation comment.


31-31: Documentation fix for Description property.

Removed extraneous quotes from the documentation comment.

src/Packata.Core/Schema.cs (1)

52-52: Added Metrics property to Schema class.

The new nullable Metrics property allows schemas to include metric definitions, complementing the new Metric class.

src/Packata.Core.Testing/Packata.Core.Testing.csproj (2)

2-6: Explicit exclusion of PathHandling directory.

You're now explicitly excluding the PathHandling directory from compilation, embedded resources, and None item groups, which helps maintain a clean project structure.


9-10: Added new test resources for metrics functionality.

New embedded resources extension.json and extension.yaml have been added to support testing the new metrics functionality.

Also applies to: 14-16

package-delta-schemas.ps1 (1)

3-8: Refactored URL handling with a centralized base URL.

Good refactoring to extract the common base URL into a variable, which makes the code more maintainable.

try {
Write-Host "Generating schema for $($schema.Class)..."
schemathief delta -a $assemblyPath -c $schema.Class -b $schema.Url -x "paths|profile" -o $schema.Output
schemathief delta -a $assemblyPath -c $schema.Class -b ($schema.Url + $schema.Value) -x "paths|profile" -o ($dir + $schema.Value)

⚠️ Potential issue

Bug in schema generation command.

The script tries to use $schema.Url in the command, but there is no 'Url' property defined in the schema entries.

-        schemathief delta -a $assemblyPath -c $schema.Class -b ($schema.Url + $schema.Value) -x "paths|profile" -o ($dir + $schema.Value)
+        schemathief delta -a $assemblyPath -c $schema.Class -b ($baseUrl + $schema.Value) -x "paths|profile" -o ($dir + $schema.Value)
@@ -0,0 +1,44 @@
"$schema": https://packata.dev/profiles/2.0/datapackage.json

💡 Verification agent

🧩 Analysis chain

Schema version inconsistency detected.

The datapackage schema reference uses version 2.0 (https://packata.dev/profiles/2.0/datapackage.json), while the dataresource (line 13) and tableschema (line 44) both use version 1.0. This inconsistency should be resolved to maintain version compatibility.


🏁 Script executed:

#!/bin/bash
# Check for schema version inconsistencies in yaml files
echo "Checking schema versions in YAML files..."
find src -name "*.yaml" -exec grep -l "\$schema" {} \; | xargs cat | grep -o "profiles/[0-9]\.[0-9]"

Length of output: 340


🏁 Script executed:

#!/bin/bash
grep -R "profiles/1.0" -n src

Length of output: 1149


Align schema version references in YAML extension

There’s a mismatch in extension.yaml between the datapackage schema (2.0) and the resource/tableschema schemas (1.0). All schema URLs should use the same version.

Files to update:

  • src/Packata.Core.Testing/Serialization/Yaml/Resources/extension.yaml
    • Line 1: currently "https://packata.dev/profiles/2.0/datapackage.json"
    • Line 13: "https://packata.dev/profiles/1.0/dataresource.json"
    • Line 44: "https://packata.dev/profiles/1.0/tableschema.json"
  • For reference, JSON fixtures and code defaults use 1.0:
    • src/Packata.Core.Testing/Serialization/Json/Resources/extension.json (all schemas 1.0)
    • Packata.Core/DataPackage.cs, Resource.Props.cs, TableDialect.cs (default profiles 1.0)

Suggested fixes (pick one):

  • Downgrade the datapackage URL in extension.yaml to 1.0 to match the others.
  • Or, if you’re ready to adopt 2.0, bump the dataresource and tableschema URLs (and JSON fixtures/code defaults) to 2.0.

Please update the URLs for consistent schema versions.

@@ -0,0 +1,69 @@
{
"$schema": "https://packata.dev/profiles/1.0/datapackage.json",

💡 Verification agent

🧩 Analysis chain

Schema version inconsistency with YAML counterpart.

The JSON file uses version 1.0 for the datapackage schema, while the YAML counterpart uses version 2.0. This inconsistency should be resolved to maintain consistency between the two formats.


🏁 Script executed:

#!/bin/bash
# Compare schema versions between JSON and YAML files
echo "JSON schema version:"
grep -o "profiles/[0-9]\.[0-9]/datapackage.json" src/Packata.Core.Testing/Serialization/Json/Resources/extension.json

echo "YAML schema version:"
grep -o "profiles/[0-9]\.[0-9]/datapackage.json" src/Packata.Core.Testing/Serialization/Yaml/Resources/extension.yaml

Length of output: 401


Align JSON schema version with YAML (2.0)

The JSON datapackage schema in extension.json is still on version 1.0, while the YAML counterpart uses 2.0. To keep both formats in sync, please update the JSON file:

Suggested change:

-"$schema": "https://packata.dev/profiles/1.0/datapackage.json",
+"$schema": "https://packata.dev/profiles/2.0/datapackage.json",

using var streamReader = new StreamReader(value.Stream);
var dataPackage = value.Serializer.Deserialize(streamReader, new LocalDirectoryDataPackageContainer(), new StorageProvider());
Assert.That(dataPackage.Resources[0].Schema?.Metrics, Is.Not.Null);
var metrics = dataPackage.Resources[0].Schema?.Metrics;

🛠️ Refactor suggestion

Add null check before accessing metrics collection.

While the code uses the null-conditional operator when retrieving the metrics, it doesn't ensure metrics is non-null before using it in assertions.

-        var metrics = dataPackage.Resources[0].Schema?.Metrics;
+        var metrics = dataPackage.Resources[0].Schema?.Metrics;
+        Assert.That(metrics, Is.Not.Null, "The metrics collection should not be null");

Comment on lines +56 to +73
[TestCaseSource(nameof(GetData))]
public void Deserialize_Metrics_Success((Stream Stream, IDataPackageSerializer Serializer) value)
{
using var streamReader = new StreamReader(value.Stream);
var dataPackage = value.Serializer.Deserialize(streamReader, new LocalDirectoryDataPackageContainer(), new StorageProvider());
Assert.That(dataPackage.Resources[0].Schema?.Metrics, Is.Not.Null);
var metrics = dataPackage.Resources[0].Schema?.Metrics;
Assert.Multiple(() =>
{
Assert.That(metrics, Has.Count.EqualTo(2));
Assert.That(metrics?[0].Name, Is.EqualTo("average_temperature"));
Assert.That(metrics?[0].Type, Is.EqualTo("numeric"));
Assert.That(metrics?[0].Title, Is.EqualTo("Average temperature"));
Assert.That(metrics?[0].Description, Is.EqualTo("Average temperature of the sensor"));
Assert.That(metrics?[0].Aggregation, Is.EqualTo("avg"));
Assert.That(metrics?[0].Expression, Is.EqualTo("temperature"));
});
}

🛠️ Refactor suggestion

Enhance metric testing coverage.

The test verifies the first metric's properties thoroughly but doesn't check all properties of the second metric.

    [TestCaseSource(nameof(GetData))]
    public void Deserialize_Metrics_Success((Stream Stream, IDataPackageSerializer Serializer) value)
    {
        using var streamReader = new StreamReader(value.Stream);
        var dataPackage = value.Serializer.Deserialize(streamReader, new LocalDirectoryDataPackageContainer(), new StorageProvider());
        Assert.That(dataPackage.Resources[0].Schema?.Metrics, Is.Not.Null);
        var metrics = dataPackage.Resources[0].Schema?.Metrics;
        Assert.Multiple(() =>
        {
            Assert.That(metrics, Has.Count.EqualTo(2));
            
            // First metric checks
            Assert.That(metrics?[0].Name, Is.EqualTo("average_temperature"));
            Assert.That(metrics?[0].Type, Is.EqualTo("numeric"));
            Assert.That(metrics?[0].Title, Is.EqualTo("Average temperature"));
            Assert.That(metrics?[0].Description, Is.EqualTo("Average temperature of the sensor"));
            Assert.That(metrics?[0].Aggregation, Is.EqualTo("avg"));
            Assert.That(metrics?[0].Expression, Is.EqualTo("temperature"));
+            
+            // Second metric checks
+            Assert.That(metrics?[1].Name, Is.EqualTo("max_temperature"));
+            Assert.That(metrics?[1].Type, Is.EqualTo("numeric"));
+            Assert.That(metrics?[1].Title, Is.EqualTo("Maximum temperature"));
+            Assert.That(metrics?[1].Aggregation, Is.EqualTo("max"));
+            Assert.That(metrics?[1].Expression, Is.EqualTo("temperature"));
        });
    }

@Seddryck merged commit 7d63cf1 into main on May 9, 2025 (7 checks passed)
@Seddryck deleted the feat/metrics branch on May 9, 2025 at 20:00