Tutorial
This tutorial gives trigger algorithm developers an introduction to setting up the needed environment, making changes, and compiling, building, and running the system. It is based on this presentation, which also provides further information about the triggeralgs module.
DUNE DAQ consists of several modules that themselves consist of plugins. In this tutorial, we will see how to create a plugin for the triggermodules module.
Detailed documentation can be found at https://github.com/DUNE-DAQ/appfwk/wiki/Compiling-and-running-under-v2.2.0
We want to work with a specific tag/version, and within a specific working directory. In our example, we use:
export VERSION=dunedaq-v2.2.0
export MYDIR=dunedaq
Set up daq-buildtools via your shell:
git clone https://github.com/DUNE-DAQ/daq-buildtools.git -b $VERSION
source daq-buildtools/dbt-setup-env.sh
The expected output is:
Added /your/path/to/daq-buildtools/bin to PATH
Added /your/path/to/daq-buildtools/scripts to PATH
DBT setuptools loaded
mkdir $MYDIR
cd $MYDIR
dbt-create.sh $VERSION
This may take a couple of minutes and will create your working area in the $MYDIR location.
DUNE DAQ consists of several modules. The most important one is appfwk, which provides the basic functionality.
The data selection is based on two modules: triggermodules and triggeralgs.
The source code for the modules is located in sourcecode. The following commands can be run to check out all the required modules (triggeralgs has to be compiled before triggermodules):
dbt-setup-build-environment
cd sourcecode
git clone https://github.com/DUNE-DAQ/triggeralgs.git
dbt-build.sh --install
MODULES="appfwk triggermodules cmdlib ers filecmd listrev restcmd"
for MODULE in $MODULES
do
git clone https://github.com/DUNE-DAQ/$MODULE.git
cd $MODULE
git fetch; git checkout $VERSION
cd ..
done
dbt-build.sh --install
You should now have a complete working environment!
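For orientation, after the checkouts above the sourcecode directory should contain one folder per module cloned above; the expected listing of sourcecode is:
appfwk  cmdlib  ers  filecmd  listrev  restcmd  triggeralgs  triggermodules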
In this tutorial, we will use $MYDIR/sourcecode/DAQDuneTrigger/plugins/TriggerPrimitiveFromFile.cpp as an example. We refer to $MYDIR/sourcecode/DAQDuneTrigger as the usual folder if not indicated otherwise.
It simply takes a (shorter) waveform from a CSV (comma-separated values) file indicated by the user, formatted as follows:
| time_start | time_over_threshold | time_peak | channel | adc_integral | detid | type |
|---|---|---|---|---|---|---|
| value | value | value | value | value | value | value |
| ... | ... | ... | ... | ... | ... | ... |
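For example, a minimal input file (with made-up values, one trigger primitive per line) might look like this:
0,120,60,42,5000,1,0
3200,100,3250,43,4800,1,0
...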
moo is a tool for code generation based on schema files, written in either Python or Jsonnet.
Complete moo documentation is available online.
We want our module and our plugins to be configurable.
Typically, we use three different kinds of moo source files in the schema folder of a module.
- the plugin schema
- the plugin make
- the app / command facility
We will give a short introduction to all of them. Exhaustive documentation from Brett is available here.
A plugin requires a schema file if it is to be configurable. The schema defines the data format that the configuration object follows. Let's have a look at the file schema/test_schema.jsonnet:
local moo = import "moo.jsonnet";
// preamble needed to define s and ns; namespace inferred from the moo commands shown further below
local ns = "dunedaq.DAQDuneTrigger.triggerprimitivefromfile";
local s = moo.oschema.schema(ns);

local types = {
  pathname: s.string("Path", "path",
                     doc="File path, file name"),

  conf: s.record("Conf", [
    s.field("filename", self.pathname,
            "/tmp/example.csv",
            doc="File name for trigger primitives"),
  ], doc="TriggerPrimitiveFromFile configuration"),
};

moo.oschema.sort_select(types, ns)

conf defines the configuration record, which contains the fields. There can be as many fields as required. In our case, we use the file name as a field, whose data type is defined in pathname.
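After code generation, a concrete configuration object conforming to this record is plain JSON, e.g.:
{
    "filename": "/tmp/example.csv"
}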
This schema object can be used after compilation as a nlohmann::json object. Let's have a look inside plugins/TriggerPrimitiveFromFile.cpp to see how the configuration object is retrieved:
...
void TriggerPrimitiveFromFile::do_configure(
  const nlohmann::json& config /*args*/)
{
  auto params = config.get<triggerprimitivefromfile::Conf>();
  filename = params.filename;
}
...

As you can see, the configuration object is passed in as config and retrieved via config.get<triggerprimitivefromfile::Conf>(). Its members can then be accessed individually.
The compilation is done automatically during the build process. Be sure to include the generated Nljs.hpp (placed in the src/triggermodules/$YOURMODULE directory) in your plugin file.
If there is a need for manual code generation, the two commands to run from the shell are:
moo -g '/lang:ocpp.jsonnet' -M schema -A path=dunedaq.DAQDuneTrigger.triggerprimitivefromfile -A ctxpath=dunedaq -A os=DAQDuneTriggers-TriggerPrimitiveFromFile-schema.jsonnet render omodel.jsonnet ostructs.hpp.j2 > plugins/Structs.hpp
moo -g '/lang:ocpp.jsonnet' -M schema -A path=dunedaq.DAQDuneTrigger.triggerprimitivefromfile -A ctxpath=dunedaq -A os=DAQDuneTriggers-TriggerPrimitiveFromFile-schema.jsonnet render omodel.jsonnet onljs.hpp.j2 > plugins/Nljs.hpp

The moo make file provides a simple method for generating the configuration object from the app. In our example, it looks like this:
{
  conf(filename) :: {
    filename: filename,
  },
}

The method takes filename as a variable and passes it into the object. Note that this moo make file is distinct from the overall daq-cmake file.
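For illustration, importing this make file (saved as test_make.jsonnet, as in the app file below) and calling the method yields a plain configuration object:
local TPsGenerator = import "test_make.jsonnet";
TPsGenerator.conf("/tmp/csv_file.csv")
// evaluates to { filename: "/tmp/csv_file.csv" }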
The app file is the command facility passed to the DUNE DAQ application and is therefore the main file defining the overall functionality. Let's look at some excerpts of the code, which you can find in the schema folder (e.g. full_csv_trigger_app.jsonnet).
Be sure to import the make file at the top:
...
local TPsGenerator = import "test_make.jsonnet";
...

You will then define the queues that serve as data sources and sinks for your algorithms:
local queues = {
  TPsQueue: cmd.qspec("TPsQueue",
                      "FollyMPMCQueue",
                      1000),
  TAsQueue: cmd.qspec("TAsQueue",
                      "FollyMPMCQueue",
                      100),
  TCsQueue: cmd.qspec("TCsQueue",
                      "FollyMPMCQueue",
                      10),
};

You define the name of each queue, its type, and its size.
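Assuming the appfwk QueueSpec layout (an instance name, a queue kind, and a capacity), the first entry above would evaluate to roughly:
{
  inst: "TPsQueue",
  kind: "FollyMPMCQueue",
  capacity: 1000,
}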
Then, you will define the plugins or modules:
...
TPsGenerator: cmd.mspec("TPsGenerator",
                        "TriggerPrimitiveFromFile",
                        [cmd.qinfo("output",
                                   "TPsQueue",
                                   cmd.qdir.output)]),
TAsGenerator: cmd.mspec("TAsGenerator",
                        "DAQTriggerActivityMaker",
                        [cmd.qinfo("input",
                                   "TPsQueue",
                                   cmd.qdir.input),
                         cmd.qinfo("output",
                                   "TAsQueue",
                                   cmd.qdir.output)]),
...

Here, each plugin instance has a name used in the app, points to an existing plugin, and has its sinks and sources defined via cmd.qinfo. cmd.qinfo requires the name of the source/sink, the respective queue, and the direction (whether it is a source (input) or a sink (output)).
In the configuration, you define what happens when you execute init, conf, start, or stop from the app.
In the init command, you define which modules and queues get set up:
...
cmd.init([queues.TPsQueue,
          queues.TAsQueue,
          queues.TCsQueue],
         [modules.TPsGenerator,
          modules.TAsGenerator,
          modules.TCsGenerator])
  { waitms: 1000 },
...

In the conf command, you indicate which specific algorithm is set up with which configuration; in our example, this is where you indicate the csv file name. This is set up via cmd.mcmd:
...
cmd.conf([cmd.mcmd("TPsGenerator",
                   TPsGenerator.conf("/tmp/csv_file.csv")),
          cmd.mcmd("TAsGenerator"),
          cmd.mcmd("TCsGenerator")]),
...

The names passed to cmd.mcmd are the module instance names defined above.
Last but not least, you also want to set up the start and stop commands:
...
cmd.start(40) { waitms: 1000 },
cmd.stop() { waitms: 1000 },
...

How the commands are defined is given by the appfwk module; please refer to the appfwk documentation for further information.
The app needs to be compiled to a command facility before it can be executed. From $MYDIR, run
moo compile sourcecode/triggermodules/schema/test_app.jsonnet > test_compiled.json
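The result is a plain JSON file containing the sequence of commands. Assuming the appfwk command layout (each entry pairing a command id with its data), the compiled file has roughly this shape (contents elided):
// illustrative shape only
[
  { id: "init", data: { /* queues and modules */ } },
  { id: "conf", data: { /* module configurations */ } },
  { id: "start", data: { /* run settings */ } },
  { id: "stop", data: {} },
]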
Building the modules requires that a valid daq-cmake build file exists (CMakeLists.txt). For triggermodules, this looks as follows:
cmake_minimum_required(VERSION 3.12)
project(triggermodules VERSION 2.2.0)
find_package(daq-cmake REQUIRED )
daq_setup_environment()
find_package(appfwk REQUIRED)
find_package(triggeralgs REQUIRED)
##############################################################################
daq_add_plugin(TriggerPrimitiveRadiological duneDAQModule LINK_LIBRARIES appfwk::appfwk SupernovaTrigger)
daq_add_plugin(TriggerPrimitiveFromFile duneDAQModule SCHEMA LINK_LIBRARIES appfwk::appfwk SupernovaTrigger)
...

Here you define which packages are required and which plugins you are setting up. Note that for plugins with configuration enabled, you have to add the SCHEMA keyword.
Now you should be ready to build. If you have not done so already, load the build environment from the shell:
dbt-setup-build-environment
If you want to do a clean install from scratch, execute
dbt-build.sh --clean --install
otherwise, you can just run
dbt-build.sh --install
In order to run, you have to load the runtime environment first:
dbt-setup-runtime-environment
Now you should be able to run by passing the command facility you compiled from the app:
daq_application -c test_compiled.json
The commands init | conf | start | stop should now appear. Run them in this order to test them, and verify that the data products are produced and received.