
Build against daq buildtools v1.2.x #6

Merged
alessandrothea merged 15 commits into master from jcfreeman2/build_against_daq-buildtools_develop on Nov 10, 2020

Conversation

alessandrothea
Contributor

Subject says it all.

@bieryAtFnal
Collaborator

I have followed the instructions for Compiling and running under v1.2.1, added the listrev repo underneath the sourcecode directory, and successfully built all of the software packages.

I see that the 'Compiling and Running' instructions say that the old command for running a DAQProcess needs to be updated, but they don't suggest a new command. Should the approval of this PR include a verification that the listrev apps can be run?

Something else that I noticed: the JSON files from listrev/test aren't installed anywhere. Is that expected? (In earlier versions of some packages, we copied the JSON files to a build or install directory, I think.)

@alessandrothea
Contributor Author

Thanks for the comments.

  • Yes, a proper replacement for the QueryResponseCommandFacility must be provided. It's one of the points to be discussed at today's CCM meeting.
  • Right, this is not automatically taken care of by the new daq-buildtools functions but should definitely be added.

@brettviren
Member

Just some more info related to:

a proper replacement for the QueryResponseCommandFacility must be provided. It's one of the points to be discussed at today's CCM meeting.

The code and README of this package show one approach:

https://github.com/brettviren/dune-daq-repl

This will eventually be able to talk to the REST server command facility in a similar way, but as of now it talks to the FileCommandFacility. I'm working to allow this Python-based approach to work with the Python-based object construction, which is also still a WIP.

After one small fix which I need to PR to cmdlib, a second, dead-simple approach will be possible: we simply cat JSON holding a command to the FileCommandFacility via a file-system FIFO. You can actually do that now, but currently the FCF will exit when its input file is closed, so you only get one command. In general, the FCF should take a "run until SIGINT" policy, so this needs fixing in any case.
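Concretely, the FIFO approach would look something like this (the FIFO path and JSON file name are placeholders):

```sh
# Sketch of the FIFO approach; the FIFO path and JSON file name are
# placeholders, and this assumes the "run until SIGINT" fix is in.
mkfifo /tmp/daq-cmd.fifo

# Point the FileCommandFacility at the FIFO instead of a regular file.
daq_application --commandFacility file:///tmp/daq-cmd.fifo &

# Each command is one JSON object cat'ed into the FIFO. Today the FCF
# exits as soon as its input closes, so only the first command arrives.
cat start-command.json > /tmp/daq-cmd.fifo
```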

@bieryAtFnal
Collaborator

Hi,
I now understand a little better what I should be testing and looking for, so I'd like to start fresh with a set of observations and comments.

In my current testing, I'm using the "Compiling and Running under v1.2.1" instructions.

  1. I wonder if the instructions for Running on the 'Compiling and Running' page should be updated to include a command like the following:
       daq_application --commandFacility file://sourcecode/appfwk/schema/fdpc-job.json
     I ask this question based on something that I saw on Brett's "DAQ Object Schema" page.
  2. As we've mentioned before, at the moment the above command to run an application fails with an inability to find "fileCommandFacility"
  3. I now suspect that the old test/list_reversal_app.json is obsolete. If this is true, then it should be removed/archived. [This comment supersedes my earlier one about copying this file to the build directory.]
  4. I wonder how one feeds transition commands into the daq_application in the new command model. In ddpdemo apps, we make use of the fact that we can have the system in the "running" state for several seconds before issuing a Stop command, and it would be great to continue to have this feature.

@brettviren
Member

  1. I think so.

  2. People keep moving FCF around and @roland-sipos suggests yet another move here: Shell repl and other fixes cmdlib#4. Also, I don't know where each branch is right now, but I know that, in my own tests, I need to hand-tune LD_LIBRARY_PATH to pick up plugins.

  3. Yes, the format is obsolete. This example should be reformatted so the content is kept. I'll look at doing that.

  4. Take a look at: https://brettviren.github.io/dune-daq-repl/ddrepl.html

@bieryAtFnal
Collaborator

I noticed that the "reasonable defaults" were removed from RandomDataListGenerator, and that seems unfortunate to me.

I would like to learn more about how parameters can be made optional in the "job" JSON, and it seems like the 'reasonable defaults' could have been part of demonstrating how that is done...

@brettviren
Member

Hi @bieryAtFnal

I think you refer to the two values set here:

https://github.com/DUNE-DAQ/listrev/blob/master/src/RandomDataListGenerator.hpp#L64

They are moved into the schema as default values for specific fields in a record:

https://github.com/brettviren/listrev/blob/config-proto/schema/listrev-rdlg-schema.jsonnet#L13

These should show up as default values set in the definition of the C++ struct:

https://github.com/brettviren/listrev/blob/config-proto/src/rdlg/Structs.hpp#L25
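Schematically, I'd expect the generated header to contain something like this (the field names and values here are illustrative, not the exact generated code):

```cpp
// Illustrative sketch of a moo-generated struct: the schema defaults
// become member initializers, so a default-constructed Conf carries them.
namespace dunedaq::listrev::rdlg {

struct Conf {
  int nIntsPerList = 4;           // default from the schema record
  int waitBetweenSendsMs = 1000;  // default from the schema record
};

} // namespace dunedaq::listrev::rdlg
```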

However, clearly they are not there. This was working at some point so I must have introduced some regression. I'll fix it through:

brettviren/moo#2

Thanks for finding this!

@bieryAtFnal
Collaborator

Hi @brettviren ,
Thanks! That information definitely helps.

I'm starting to build up an understanding of the different default values for parameters in the code generation model.

  1. From what you described, I now understand that the default values that we provide in our *-schema.jsonnet files (e.g. listrev-rdlg-schema.jsonnet) should show up in the generated C++ header.

  2. I believe that there are also default values for configuration parameters in our *-make.jsonnet files (e.g. listrev-rdlg-make.jsonnet). My sense is that these will only show up in generated JSON files, and then only if the calls to the configuration-creation methods in *-job.jsonnet (e.g. listrev-job.jsonnet) files don't provide explicit values for each parameter.

  3. As far as how the C++ defaults get used, I only have a guess as to how that works, but hopefully it's a reasonable guess. When a Conf object is created in our C++ user code (by way of Nljs.hpp), if there isn't a value for a given parameter in the JSON that was used to run the job, the default will be assigned to the corresponding data member in the Struct. The JSON "job" files that we create using "moo compile" commands won't typically have unspecified parameter values, but JSON files that are edited or created in other ways might.

@brettviren
Member

Yes, I think you have the picture.

  1. yes

  2. yes. Providing default values for Jsonnet function args means the caller of the function can omit those args, just like Python functions. Also, the *-make.jsonnet file can instead be provided by an equivalent Python file; the Python may likewise provide defaults through function arguments. How this "make" layer should "look" is still in flux, and since it will be a "human touching point" it probably needs a few tries to get "right". It may even have more than one way of doing things. And, eventually, some sort of "database" will be in the picture for operation, while hand-rolling will always be needed for development and testing.

  3. yes. Modulo the bug you found in the C++ struct generation, the defaults will be baked in there, so that when a struct is default-constructed, the default field values are applied then. When the Nljs (de)serialization is done we may decide on some policy. E.g., if a JSON object is presented which lacks an attribute for a field which itself has no default, we should throw an error. When the field has a default and the JSON object lacks the key, then the Nljs layer can set the default, though this will be redundant with setting a default at construction. Redundancy in data is sometimes bad, so we might want to think about what policy we actually want for "default management" (see the sketch below).
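For illustration only, here is a hand-written sketch of one such policy in the Nljs layer; this is not the actual moo output, and it reuses the illustrative field names from above:

```cpp
#include <nlohmann/json.hpp>
#include <stdexcept>

namespace dunedaq::listrev::rdlg {

// Conf as sketched earlier in the thread.
struct Conf {
  int nIntsPerList = 4;
  int waitBetweenSendsMs = 1000;
};

inline void from_json(const nlohmann::json& j, Conf& c)
{
  // Field with a schema default: if the key is absent, keep the value
  // the member initializer already set (the redundancy noted above).
  c.nIntsPerList = j.value("nIntsPerList", c.nIntsPerList);

  // Pretend for illustration that this field had no schema default:
  // one policy is to throw when the key is missing rather than
  // proceed with an unspecified value.
  if (!j.contains("waitBetweenSendsMs")) {
    throw std::runtime_error("missing field: waitBetweenSendsMs");
  }
  j.at("waitBetweenSendsMs").get_to(c.waitBetweenSendsMs);
}

} // namespace dunedaq::listrev::rdlg
```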

@bieryAtFnal
Collaborator

Out of curiosity, what is the daq-buildtools function to build a second or third source code library in a package? I see that daq_add_library() should be used to "produce the main library provided by a project", but what if a package would like to produce several non-plugin libraries?
Thanks,
Kurt

@alessandrothea
Contributor Author

@bieryAtFnal For the moment we have adopted a simplified model where a package provides at most one library.
This can be changed if the need arises, but it's a topic for another forum (daq-buildtools).

@alessandrothea alessandrothea merged commit 7497c57 into master Nov 10, 2020
@alessandrothea alessandrothea deleted the jcfreeman2/build_against_daq-buildtools_develop branch November 10, 2020 14:48
@alessandrothea
Contributor Author

Apologies, I have closed this PR a little too abruptly. The title made me forget the ongoing discussion of the configuration aspects, which was part of this PR but is not strictly related to daq-buildtools v1.2.1.
As un-merging is not so trivial, may I suggest moving the remaining discussion around the schema to an issue?
