Add automated regression test #4
We can add the new test case(s) to the existing …
(See also #14, for the ability to load …)
For this test it's OK to use a different one. Note, however, that we cannot use a blend file with the TRI logo. Any test data we add to this repository will be open-sourced, and we shouldn't open-source our logo.
The models included in the blend file will not appear in the input glTF file, though. Thus, there may be two types of tests: (1) just test the glTF-in/image-out path with a golden image and a matching glTF file, or (2) load a blend file without any geometry but with lighting. The first one is very clear to me, but I am not sure the second scenario should be tested as well.
As I understand it, the typical use case is that the user would be loading a blend file in most cases. As such, a regression test with glTF-RPC + blend-preloaded seems like the most important regression test to accomplish. To help us isolate and debug problems, having a regression test with glTF-RPC only and no blend file is perfectly fine, and might even be the ideal test to add to get us bootstrapped. And if users often omit a blend file entirely, then it's a nice regression test to cover them as well. When testing with a blend file, I imagine it should have both lighting settings as well as actual textured geometry. That's a use case we expect users to exercise, so we need to make sure the images come out correctly in that case.
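The configurations above could be captured in a small data-driven table so that each one shares the same test harness. A minimal sketch, where the file names and the golden-image naming scheme are hypothetical placeholders, not actual repository paths:

```python
# Hypothetical scenario matrix distilled from the discussion above.
# File names are placeholders, not actual repository paths.
SCENARIOS = [
    # (name, gltf_file, blend_file)
    ("gltf_only", "scene.gltf", None),                        # bootstrap case
    ("gltf_plus_blend", "scene.gltf", "lights_props.blend"),  # typical use
]


def golden_name(scenario_name: str, image_type: str) -> str:
    """Derives the golden-image file name for a scenario/image-type pair."""
    return f"golden_{scenario_name}_{image_type}.png"
```

A test runner would then loop over the scenarios, render each one, and diff the result against the matching golden file.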
Another thought: we should actually check two RPC requests in a row (using at least slightly different glTF requests). We want to be sure that no state from the first request bleeds into the second request's resulting image.
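That state-bleed check can be sketched independently of the server: render scene A, then scene B, and require that B's image matches a golden image produced for B in isolation. Below is a minimal sketch under those assumptions, with a deliberately leaky toy renderer standing in for a buggy server to show what the check would catch (all names are hypothetical):

```python
def check_no_state_bleed(render, scene_a, scene_b, golden_b):
    """Issues two back-to-back render requests; the second image must match
    the golden image for scene_b rendered in isolation."""
    render(scene_a)  # First request; may pollute server state.
    return render(scene_b) == golden_b


class LeakyRenderer:
    """Toy stand-in for a buggy server: the returned 'image' wrongly
    incorporates the previously rendered scene."""

    def __init__(self):
        self.last = None

    def __call__(self, scene):
        image = (scene, self.last)  # Second render depends on the first.
        self.last = scene
        return image


# A correct renderer depends only on the current request, so it passes:
clean = lambda scene: (scene, None)
assert check_no_state_bleed(clean, "A", "B", golden_b=("B", None))
# The leaky renderer is caught:
assert not check_no_state_bleed(LeakyRenderer(), "A", "B", golden_b=("B", None))
```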
Thoughts after trying to resolve label image differencing. Context: from poking around our Blender server, there is no obvious way (that I can see) to disable lighting interaction for a textured mesh programmatically. Additionally, if we are targeting a Drake use case, rendering a glTF with textured meshes to a label image is not actually a valid use case, IMO. Thus, would it make sense to test things separately with slightly different resources, while trying to share the common ones? My current proposal is:
That would cover all the use cases other than a textured mesh embedded in the blend file.
Sure, I like these.
When the user is adding a static … On the other hand, I don't think it's a big deal? Per RobotLocomotion/drake#18311, the …
What would you suggest to test the blend file loading? Maybe a …
The end-user story goes like … The first question is to decide how the server implements label images. Either it needs to not load the blend file in that case, or it needs to somehow disable the blend file. Once you've made that design choice, the unit test can be crafted to match it.
With a sample *.gltf + *.blend file, use https://docs.python.org/3.10/library/urllib.request.html to call the RPC and check the image that comes back. Note that, in particular, no Drake code is used in this automated regression test.
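That flow might look like the following sketch. The endpoint URL, content type, and request shape are assumptions; the real RPC schema is defined by the server. A real test might also use a perceptual diff with a small per-pixel tolerance rather than byte-exact equality, to absorb renderer nondeterminism.

```python
import urllib.request

# Hypothetical endpoint; the real server's URL and API are assumptions here.
SERVER_URL = "http://127.0.0.1:8000/render"


def render_via_rpc(gltf_bytes: bytes, url: str = SERVER_URL) -> bytes:
    """POSTs a glTF scene to the render server and returns the image bytes."""
    request = urllib.request.Request(
        url,
        data=gltf_bytes,
        headers={"Content-Type": "model/gltf+json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.read()


def images_match(rendered: bytes, golden: bytes) -> bool:
    """Byte-exact comparison against the checked-in golden image."""
    return rendered == golden
```

The test would read the sample glTF from disk, call `render_via_rpc`, and assert `images_match` against the golden file, all without importing any Drake code.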