
[Suggestion] Improve TextMessageWriter output for numeric values #3563

Closed
@nzain

Description


When comparing double values it is sometimes difficult to see why the test failed. Example:

        [Test]
        public void CompareDoubleValues()
        {
            double actual = -25.148640873240467;
            double expected = -24.685244717451383;
            const double tolerance = 0.4;
            Assert.That(actual, Is.EqualTo(expected).Within(tolerance));
        }

results in the following message:

  Expected: -24.685244717451383d +/- 0.40000000000000002d
  But was:  -25.148640873240467d

It is immediately obvious that the numbers differ, and you trust NUnit to handle the tolerance properly. However, to fix the issue you have to work out what is wrong with your number: is it too large? Oh wait, the values are negative, so it is actually too small. Was the tolerance reasonable? Well, what is the actual difference?

Proposed Message
If we have two values and a tolerance, it would be great to see the actual difference as well. In my opinion, that would improve readability significantly for weird numbers or tiny tolerances.

  Expected: -24.685244717451383d +/- 0.40000000000000002d
  But was:  -25.148640873240467d  (differs by -0.463396155789084)

Of course I can output the difference on my own, e.g. using NUnit's message argument. However, I think that extra work could be done once inside NUnit instead of every time on the developer side.
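
For reference, the manual workaround looks roughly like this (the message overload of Assert.That is standard NUnit; the interpolated difference is just hand-rolled formatting):

        [Test]
        public void CompareDoubleValuesWithManualDifference()
        {
            double actual = -25.148640873240467;
            double expected = -24.685244717451383;
            const double tolerance = 0.4;

            // The extra context has to be built by hand in every single test.
            Assert.That(actual, Is.EqualTo(expected).Within(tolerance),
                $"differs by {actual - expected}");
        }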

Remarks
NUnit 3.12 (latest at the time of writing)
Visual Studio 2017 Professional + ReSharper to run the tests,
or .NET Core with dotnet test

I wanted to create a PR, but VS2017 cannot compile the solution.
The relevant parts of the code should be

writer.DisplayDifferences(expected, actual, tolerance);

and
public override void DisplayDifferences(object expected, object actual, Tolerance tolerance)
{
    // Flags values that format identically but have different runtime types,
    // so the type names can be appended to the expected/actual lines.
    if (expected != null && actual != null && expected.GetType() != actual.GetType() &&
        MsgUtils.FormatValue(expected) == MsgUtils.FormatValue(actual))
    {
        _sameValDiffTypes = true;
        ResolveTypeNameDifference(expected, actual, out _expectedType, out _actualType);
    }

    WriteExpectedLine(expected, tolerance);
    WriteActualLine(actual);
}

but I might be wrong, since I couldn't step into the code.
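
If the extra output were hooked in there, I imagine something along these lines (purely a sketch, untested since I cannot build the solution; it reuses the IsNumericType check that Numerics.cs already provides, if I read the source correctly):

public override void DisplayDifferences(object expected, object actual, Tolerance tolerance)
{
    // ... existing same-value/different-type handling unchanged ...

    WriteExpectedLine(expected, tolerance);
    WriteActualLine(actual);

    // Hypothetical addition: show the numeric difference. For simplicity this
    // writes an extra line; the proposal above puts it on the "But was" line,
    // which would mean touching WriteActualLine instead.
    if (Numerics.IsNumericType(expected) && Numerics.IsNumericType(actual))
    {
        double difference = Convert.ToDouble(actual) - Convert.ToDouble(expected);
        WriteLine("  (differs by {0})", MsgUtils.FormatValue(difference));
    }
}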

Detection of numeric values is probably required, since the arguments are typed as object; there is nice code available in Numerics.cs. That class could be extended with methods like GetDifference(object expected, object actual, Tolerance tolerance).
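
A first cut of such a helper could be as simple as this (just a sketch; the name GetDifference and its placement in Numerics.cs are only my suggestion):

public static object GetDifference(object expected, object actual, Tolerance tolerance)
{
    if (!IsNumericType(expected) || !IsNumericType(actual))
        return null;

    // Plain double arithmetic is good enough for a diagnostic message; decimal
    // and integral types could get exact branches later. The tolerance parameter
    // could eventually be used to express the difference in the same mode
    // (e.g. as a percentage for percent tolerances).
    return Convert.ToDouble(actual) - Convert.ToDouble(expected);
}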

PS: Thanks for this awesome test framework!
