Bugfix: Houdini render split bugs #6037
Conversation
I tried testing with OP first before testing in AYON (to make it easier to spot the issue). My V-Ray isn't compatible with my current version of Houdini and I don't have an Arnold license, so I can't really test those two.
The submission errors without exportJob in instance.data, but all seems okay when it is turned off.
I will also do the AYON test and update you later.
I will approve this as it updates some of the latest tags related to the Houdini update.
The missing instance data for exportJob and ifdFile definitely needs to be fixed for the render ROPs, but that belongs in different PRs.
I think we should revise the logic implemented in PR #5420. I didn't expect that its logic would break other product types.
It isn't. It's only breaking the render ROPs (farm cache looks okay), and we definitely need to build an enhancement on top of it.
Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>
Yeah, I had to check the plugin info to know which is which.
Maybe @moonyuet could help. Could you give it a test in RS?
When I tested with Fabia's PR on the export job with Redshift, for example, the .rs files are processed on the farm to render out the images. In other words, it uses Redshift standalone to render the job; once the renders finish, the .rs files are gone and the rendered images are located in the publish folder.
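A rough sketch of how such a split submission could shape its plugin info, just to make the two halves of the workflow concrete. The keys ("SceneFile", "Version") and the overall structure are assumptions for illustration, not this repo's actual plugin-info code.

# Sketch only: keys and structure are assumptions about a split Redshift
# submission on Deadline, not this repo's code.
def build_split_plugin_infos(hip_file, rs_file, houdini_version):
    # Export job: Houdini itself runs the ROP and writes the .rs files.
    export_plugin_info = {
        "SceneFile": hip_file,
        "Version": houdini_version,
    }
    # Render job: Redshift standalone reads the exported .rs files and writes
    # the final images; the .rs files are intermediates and are gone once the
    # job cleans up, leaving only the rendered images for the publish folder.
    render_plugin_info = {
        "SceneFile": rs_file,
    }
    return export_plugin_info, render_plugin_info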
I think I've managed to fix it!
The export jobs should not be published, so seeing that you have "mantraifd" products looks wrong. Those are intermediate rendering files, not publishes.
Correct, BUT the environment variables required for the Redshift standalone and Arnold standalone plugins differ from the ones for Maya. So, say you render with the Arnold Standalone render plugin on Deadline: we should then be setting the Arnold-standalone-specific environment instead of the Maya Arnold render environment. Right? As such, we might need a way to use a different environment for that particular render job than the environment from Maya itself. This might be less of a problem if, when configuring the env for Maya or Houdini, you are already also defining the environment for Arnold/Redshift standalone, so that when the render job uses Houdini's environment it also includes the vars required for Arnold standalone. The next question, of course, is: what is currently implemented to ensure that the Arnold and Redshift render jobs render with the correct versions of Arnold and Redshift as defined in the project?
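As a minimal sketch of the idea being discussed, the render job could carry its own environment instead of reusing the host DCC's. The variable names and paths below are hypothetical; Deadline's JobInfo does accept EnvironmentKeyValue<N> entries.

# Sketch only: variable names and plugin names are illustrative assumptions.
def apply_job_environment(job_info, env):
    # Write each variable as an EnvironmentKeyValue<N> entry on the job.
    for index, (key, value) in enumerate(sorted(env.items())):
        job_info["EnvironmentKeyValue{}".format(index)] = "{}={}".format(key, value)
    return job_info

host_env = {"HOUDINI_PATH": "/path/to/houdini/config"}   # export job environment
standalone_env = {"ARNOLD_ROOT": "/opt/arnold/7.2"}       # render job environment (hypothetical var)

export_job_info = apply_job_environment({"Plugin": "Houdini"}, host_env)
render_job_info = apply_job_environment({"Plugin": "Arnold"}, standalone_env)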
Sorry for the inconvenience. In my test, I published 4 things:
Correct me if I am wrong, but the Arnold plugin on Deadline doesn't work with versions at all? Anyway, what I am missing in the case of Arnold is the publish job.
That screenshot does indeed also seem to lack a publish job. It should be: export job, then render job, then publish job.
The output files of the export job are considered intermediate files and are never published; they are only used by the render job.

Side note: I'm not sure how, aside from that, the publish logic behaves. But it's usually good practice NOT to delete the export job files and the render job files unless an explicit choice was made to do so, because when an issue occurs with e.g. a single frame or a few frames, only those need to be re-rendered as opposed to re-rendering the full sequence. And then of course, once the frames are fixed, the publish job should be re-entrant so it can publish the fixed version again (even if it had published before).

The above differentiation about what each job type roughly is or does, and how it's intended to be used, should be clearly worded in the documentation if it isn't yet.
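A sketch of the intended chain (export, then render, then publish), assuming a submit() callable that posts a payload to Deadline and returns a job id. The helper and the use of "JobDependencies" here illustrate the concept only; they are not this repo's actual submission code.

# Sketch only: submit() and the payload shape are hypothetical.
def submit_chain(submit, export_payload, render_payload, publish_payload):
    export_id = submit(export_payload)

    # The render job must wait for the exported ifd/ass/rs files to exist.
    render_payload["JobInfo"]["JobDependencies"] = export_id
    render_id = submit(render_payload)

    # The publish job waits for the render job. Keeping it re-entrant means a
    # few fixed frames can be re-rendered and the publish run again later.
    publish_payload["JobInfo"]["JobDependencies"] = render_id
    return submit(publish_payload)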
I agree with @BigRoy about the labeling of the jobs; the missing publish jobs are also an issue.
With the Arnold ROP without job splitting, I get:
Traceback (most recent call last):
File "C:\Users\annat\AppData\Local\Ynput\AYON\dependency_packages\ayon_2312141707_windows.zip\dependencies\pyblish\plugin.py", line 527, in __explicit_process
runner(*args)
File "C:\Users\annat\Documents\Projects\Ayon\OpenPype\openpype\modules\deadline\plugins\publish\submit_houdini_render_deadline.py", line 280, in process
super(HoudiniSubmitDeadline, self).process(instance)
File "C:\Users\annat\Documents\Projects\Ayon\OpenPype\openpype\modules\deadline\abstract_submit_deadline.py", line 471, in process
render_plugin_info = self.get_plugin_info(job_type="render")
File "C:\Users\annat\Documents\Projects\Ayon\OpenPype\openpype\modules\deadline\plugins\publish\submit_houdini_render_deadline.py", line 251, in get_plugin_info
InputFile=instance.data["ifdFile"]
KeyError: 'ifdFile'
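A minimal sketch of one way to avoid that KeyError, assuming the export-style input should only be looked up when the split/export flag is enabled. This is illustrative and not necessarily the fix applied in this PR; "hipFile" in the fallback is a hypothetical key.

# Sketch only: structure and the fallback key are assumptions.
def get_render_plugin_info(instance):
    if instance.data.get("split_render"):
        ifd_file = instance.data.get("ifdFile")
        if not ifd_file:
            raise RuntimeError(
                "Split render is enabled but no 'ifdFile' was collected.")
        return {"InputFile": ifd_file}
    # Non-split renders never touch 'ifdFile' at all.
    return {"SceneFile": instance.data["hipFile"]}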
I thought we had already fixed that...
Damn, I am stupid... I had the wrong PR checked out, please disregard. Meanwhile, I'll kick myself.
NOTE: I think we should follow the pyblish camelCase key naming convention in instance.data and context.data.
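Just to illustrate the naming point; these keys are examples and not necessarily the ones the plugins end up using.

# Illustration only; key names are examples.
instance_data = {}
instance_data["ifdFile"] = "/tmp/shot.0001.ifd"   # existing camelCase-style key
instance_data["splitRender"] = True               # camelCase would match that style
instance_data["split_render"] = True              # snake_case, as this PR names the flag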
I've fixed my issue with bf0ad72.
It works on my side. |
Code LGTM
Changelog Description
This PR is a follow-up to #5420.
This PR does:
- Revert get_output_parameter to what it used to be.
- Rename the exportJob flag to split_render.
Testing notes: