script_json_communication example fails -- somewhat unusual #314

Closed
diablodale opened this issue Dec 29, 2021 · 5 comments
@diablodale (Contributor)

The script_json_communication example fails because the word "warning" in its log output matches the test runner's failure regex. The log line itself may actually be correct. But something else also looks wrong...

...
[ctest] 45: [14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}
...
[ctest] 45/64 Test #45: script_json_communication ...........***Failed  Error regular expression found in output. Regex=[\[warning\]]  5.75 sec

Setup

  • Microsoft Windows [Version 10.0.19044.1415]
  • VS2019 v16.11.8
  • depthai-core develop 84f7c69 + move/thread fixes
  • depthai-shared develop 549a01b69d7c7a9c9b37f4971f4be45e42a0dfed
  • xlink master eef5a51c608fef8e326c740f120b90a543e55b27

Repro

  1. Configure the build to include examples and tests
  2. Build Debug, static lib, x64
  3. Run the test suite, or the specific test script_json_communication

Result

[ctest] test 45
[ctest]       Start 45: script_json_communication
[ctest] 
[ctest] 45: Test command: "C:\Program Files\CMake\bin\cmake.exe" "-DTIMEOUT_SECONDS=10" "-P" "C:/njs/depthai-core/examples/cmake/ExecuteTestTimeout.cmake" "C:/njs/depthai-core/build/examples/script_json_communication.exe"
[ctest] 45: Environment variables: 
[ctest] 45:  UBSAN_OPTIONS=halt_on_error=1
[ctest] 45: Test timeout computed to be: 1500
[ctest] 45: -- found -P
[ctest] 45: -- arguments: C:/njs/depthai-core/build/examples/script_json_communication.exe;
[ctest] 45: [14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}
[ctest] 45: changedDict: {"foo":"baz","one":2}
[ctest] 45: -- After process executed,  produced the following exit code: 0
[ctest] 45/64 Test #45: script_json_communication ...........***Failed  Error regular expression found in output. Regex=[\[warning\]]  5.75 sec
[ctest] -- found -P
[ctest] -- arguments: C:/njs/depthai-core/build/examples/script_json_communication.exe;
[ctest] [14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}
[ctest] changedDict: {"foo":"baz","one":2}

Expected

No failure -- the test should pass

Notes

Clearly the test runner is matching its regex on the line

[ctest] 45: [14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}

🤷‍♀️ But that line comes from the Python script node -- it isn't indicating a problem, it's just outputting a log message.

node.warn('Original: ' + str(dict))
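The match the runner reports (`Regex=[\[warning\]]`) can be reproduced against that log line with Python's stdlib `re` module -- a minimal sketch of the matching behavior, not depthai or ctest code:

```python
import re

# The device log line from the ctest output above
line = "[14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}"

# ctest reports Regex=[\[warning\]]; the same pattern finds the
# literal "[warning]" tag anywhere in the line, so the test fails
assert re.search(r"\[warning\]", line) is not None
```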

🧐 But where is the matching log output from the Python line

node.warn('Changed: ' + str(dict))

I would expect to see that log line in the output as well. Something like...

[ctest] 45: [14442C10C1A1C6D200] [4.623] [Script(1)] [warning] Original: {'foo': 'bar', 'one': 1}
[ctest] 45: [14442C10C1A1C6D200] [4.624] [Script(1)] [warning] Changed: {'foo': 'baz', 'one': 2}

Instead, I only see the output from the C++ code

cout << "changedDict: " << changedDict << "\n";

I would expect the following...

  1. Adjust the logging output so it does not trigger the test runner's regex
  2. Both log lines from the Python script to appear
  3. The C++ code to compare the JSON received from the node to the expected JSON. Something similar to the following...
    ...
    cout << "changedDict: " << changedDict << "\n";
    nlohmann::json expectedDict{{"one", 2}, {"foo", "baz"}};
    if (expectedDict != changedDict) return 1;
    return 0;
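The proposed check compares parsed JSON values rather than strings, so key order and whitespace don't matter; nlohmann::json's `operator!=` behaves the same way on objects. A stdlib-Python illustration of that equality (the `received` string is taken from the example's `changedDict:` output line):

```python
import json

# String as printed by the example's "changedDict:" line
received = '{"foo":"baz","one":2}'

# Expected dict, written with keys in a different order on purpose
expected = {"one": 2, "foo": "baz"}

# Comparing parsed values makes the check insensitive to key order
assert json.loads(received) == expected
```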
@themarpe (Collaborator)

themarpe commented Jan 4, 2022

Thanks for reporting -- will fix this.
Regarding the missing log line: it boils down to the host side exiting before all log messages arrive and are printed. Will address this as well, to avoid confusion in this specific example.

@themarpe (Collaborator)

themarpe commented Jan 4, 2022

Fixed in 3ca0373

@themarpe themarpe closed this as completed Jan 4, 2022
@diablodale (Contributor, Author)

Hi. I would like the test to verify the results, like the step 3 code in the OP. Can I submit a PR to do that, or do you think otherwise?

@themarpe (Collaborator)

themarpe commented Jan 7, 2022

Yes, I think that would be suitable. If you mean in the example: keep the test "simple", with a print stating the failure or success.

@diablodale (Contributor, Author)

diablodale commented Jan 7, 2022

Instead, the idea is to adjust the code so the test validates its results for automation and CI -- not humans.
This validates a lot of things in one test: XLink, parsing on host and device, scripting on device, etc.
It isn't good enough that some random JSON came back from the device; it needs to be the correct JSON.

pseudocode

...
nlohmann::json expectedDict{{"one", 2}, {"foo", "baz"}};
if (expectedDict != changedDict) return 1;
return 0;
