Kernel version (if applicable): 0.5.11-0.175.3.35.0.6.0
Expected behavior
After launching the daemon, the log file reports several normal startup lines like the following:
Actual behavior
The CSV files and the log file are not being written properly. The log file reports errors like the following for a short period just after startup; after that, neither the log file nor the CSV files (from the csv plugin) are updated anymore:
[2021-03-09 14:46:38] csv plugin: fopen (/diag/collectd/csv/hcdsns-be-jee-r02a-id-c04-m1-n2/DataSource-opss-audit-viewDS/gauge-value-ResolvedAsCommittedTotalCount-2021-03-09) failed: Too many open files
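For context, "Too many open files" is the C library's message for EMFILE, returned by fopen(3) once the process has exhausted its file-descriptor limit. The following short Python sketch (an illustration, not collectd code) reproduces the same condition by lowering the soft RLIMIT_NOFILE and opening files until the limit is hit:

```python
import errno
import resource
import tempfile

# Lower the soft file-descriptor limit so the condition triggers quickly.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

handles = []
err = None
try:
    # Keep opening files until the kernel refuses with EMFILE,
    # the same error the csv plugin logs via fopen().
    while True:
        handles.append(tempfile.TemporaryFile())
except OSError as e:
    err = e
finally:
    for h in handles:
        h.close()

assert err is not None and err.errno == errno.EMFILE
print("open failed with:", errno.errorcode[err.errno])
```

The csv plugin opens one file per value per day, so with many data sources the daemon's descriptor usage can grow until it reaches the configured limit.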
Steps to reproduce
More info:
The process is running under a Solaris project with the following resource controls:
collectd:107:Collectd Agent:::process.max-file-descriptor=(basic,16384,deny),(priv,32768,deny)