Is it possible to store statistical data permanently? #50
👍 I thought about how to achieve the aim outlined above and came to the following solution; I'd like your opinion on it. If I'm not wrong, I can record the desired data by creating a small application (e.g. a PHP script) that reads the JSON file (the status page URL) every 5 minutes and stores the data in a MySQL database. The data structure might look like this: month (08-2016) | domain (example.com) | sent (bytes) | received (bytes). The script checks what the current month is and updates that month's row (taking the domain name into account). What do you think of this solution? Thanks :-)
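The polling idea above can be sketched as follows. This is a minimal illustration in Python rather than PHP, and it assumes the module's JSON exposes per-vhost counters under a "serverZones" key with "inBytes"/"outBytes" fields; the function name `monthly_rows` and the sample snapshot are hypothetical.

```python
import json
from datetime import datetime, timezone

def monthly_rows(vts_json, now=None):
    """Turn one VTS status snapshot into (month, domain, sent, received) rows.

    Assumes per-vhost counters live under "serverZones" with
    "inBytes"/"outBytes" fields; "sent" maps to outBytes and
    "received" to inBytes.
    """
    now = now or datetime.now(timezone.utc)
    month = now.strftime("%m-%Y")          # e.g. "08-2016"
    rows = []
    for domain, zone in vts_json.get("serverZones", {}).items():
        if domain == "*":                  # skip the aggregate pseudo-zone
            continue
        rows.append((month, domain,
                     zone.get("outBytes", 0), zone.get("inBytes", 0)))
    return rows

# Hypothetical snapshot, as if fetched from the status page URL:
snapshot = {"serverZones": {
    "*": {"inBytes": 500, "outBytes": 900},
    "example.com": {"inBytes": 100, "outBytes": 400},
}}
print(monthly_rows(snapshot, datetime(2016, 8, 15, tzinfo=timezone.utc)))
```

A cron entry running such a script every 5 minutes would then upsert each row into the MySQL table keyed on (month, domain).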
Thanks for the suggestion.
@vozlt Thank you very much for the work put into this module. When it comes to updating, I meant updating an independent MySQL database table to store the statistical data in, not the module's data stored in memory. There is no need to update the module's values; it should be enough to read the data and then store it somewhere else (a MySQL database, for example). I have already achieved my goal with an outline like this: a MySQL database table with main columns: zone (varchar), nginx main process start time (int, UNIX time-stamp), month (varchar, like 08-2016) — these three columns are defined as a composite primary key. A small script (PHP, Java, ...) reads the JSON data (provided by the module) every 1 minute and updates the MySQL table. That's all 👍 Additional columns bytesIn and bytesOut are updated on every reading (every minute). The updating script can be scheduled to run using cron.
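The table outline above can be sketched like this. SQLite stands in for MySQL so the example is self-contained; the table name `vts_traffic` and the column name `start_time` (for the nginx main process start timestamp) are hypothetical, and the key point shown is the composite primary key plus the per-minute upsert.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vts_traffic (
        zone       TEXT    NOT NULL,
        start_time INTEGER NOT NULL,   -- nginx main process start (UNIX time)
        month      TEXT    NOT NULL,   -- e.g. '08-2016'
        bytesIn    INTEGER NOT NULL,
        bytesOut   INTEGER NOT NULL,
        PRIMARY KEY (zone, start_time, month)
    )""")

def record(zone, start_time, month, bytes_in, bytes_out):
    # Insert a new row for this (zone, start_time, month), or overwrite
    # the counters when the minute-by-minute poller has seen the key before.
    conn.execute("""
        INSERT INTO vts_traffic (zone, start_time, month, bytesIn, bytesOut)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT (zone, start_time, month)
        DO UPDATE SET bytesIn = excluded.bytesIn, bytesOut = excluded.bytesOut
    """, (zone, start_time, month, bytes_in, bytes_out))

record("example.com", 1470000000, "08-2016", 100, 400)
record("example.com", 1470000000, "08-2016", 150, 600)  # later poll, same key
row = conn.execute("SELECT bytesIn, bytesOut FROM vts_traffic").fetchone()
print(row)  # the later poll's counters replaced the earlier ones
```

Because the module's counters are cumulative since process start, keying on the process start time lets a new row begin after each nginx restart instead of silently mixing counter epochs.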
OK, thanks for the detailed description.
@acrolink - I wrote a collectd plugin for nginx_vts: https://github.com/bobthemighty/collectd-vts - we send the metrics to riemann and then influx, but collectd also has a mysql output plugin, I think.
Hi @acrolink, can you share your PHP solution with us? I have the same problem. Thank you.
@nottix Sure, that should not be a problem. I wrote it in
Thank you ;-) |
Here you go. I will add a README in the next few days. P.S. you need to install
I'll try it today, thank you ;-) On Sep 9, 2016 9:54 PM, "acrolink" notifications@github.com wrote:
|
I have just added
I added that feature as mentioned above:

```nginx
http {
    vhost_traffic_status_zone;
    vhost_traffic_status_dump /var/log/nginx/vts.db;
    ...
    server {
        ...
    }
}
```

Please see the vhost_traffic_status_dump directive for detailed usage. Latest commit: fc73722
Perfect. Great work @vozlt, thank you. I will happily test it and report back in the near future.
I have noticed that if the nginx service is restarted, the data gets lost. Is there a permanent storage for the statistical data? With time period filters, e.g. Aug 01 until Aug 31 .. ?