
Is it possible to store statistical data permanently? #50

Closed
acrolink opened this issue Aug 5, 2016 · 13 comments

Comments

@acrolink

acrolink commented Aug 5, 2016

I have noticed that if the nginx service is restarted, the data gets lost. Is there a permanent storage for the statistical data? With time period filters, e.g. Aug 01 until Aug 31 .. ?

@acrolink
Author

acrolink commented Aug 5, 2016

Thought 👍

I thought about how to achieve the aim outlined above and came to the following solution; I would like your opinion on it. If I am not wrong, I can record the desired data by creating a small application (e.g. a PHP script) that reads the JSON file (the status page URL) every 5 minutes and stores the data in a MySQL database. The data structure might look like this:

month (08-2016) | domain (example.com) | sent (bytes) | received (bytes)

The script checks what month it is now and updates the affected month's row (taking the domain name into account).

What do you think about this solution? Thanks :-)
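
For reference, a minimal sketch of that scheme in PHP. All names here are hypothetical; it assumes the module's JSON output is exposed at /status/format/json, that per-vhost counters appear under serverZones as inBytes/outBytes, and that the traffic table has a unique key on (month, domain):

<?php
// Hypothetical polling script: read the VTS JSON status page and upsert
// the per-domain byte counters into one row per (month, domain).
$json = file_get_contents('http://127.0.0.1/status/format/json');
if ($json === false) {
    exit(1);
}
$status = json_decode($json, true);
$month  = date('m-Y'); // e.g. "08-2016"

$pdo  = new PDO('mysql:host=127.0.0.1;dbname=vts_stats', 'vts', 'secret');
$stmt = $pdo->prepare(
    'INSERT INTO traffic (month, domain, sent, received)
     VALUES (:month, :domain, :sent, :received)
     ON DUPLICATE KEY UPDATE sent = VALUES(sent), received = VALUES(received)'
);

foreach (($status['serverZones'] ?? []) as $domain => $zone) {
    if ($domain === '*') {
        continue; // skip the aggregate "all vhosts" zone
    }
    $stmt->execute([
        ':month'    => $month,
        ':domain'   => $domain,
        ':sent'     => $zone['outBytes'],
        ':received' => $zone['inBytes'],
    ]);
}

Note that the module's counters are cumulative since the last nginx start, so a plain overwrite like this loses data across restarts; the composite-key scheme described later in this thread addresses that.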

@acrolink acrolink changed the title Is it possible to store statistical data permanentlyf Is it possible to store statistical data permanently? Aug 6, 2016
@vozlt
Owner

vozlt commented Aug 10, 2016

Thanks for the suggestion.
This module does not have that feature.
In other words, there is currently no way to update the JSON data stored in memory. (As you know, deleting or resetting it is possible.)
I will consider implementing your suggestion (e.g. /status/control?cmd=update).

@acrolink
Author

acrolink commented Aug 10, 2016

@vozlt Thank you very much for the work put into this module. When it comes to updating, I meant updating an independent MySQL database table that stores the statistical data, not the module's data stored in memory. There is no need to update the module's values; it should be enough to read the data and then store it somewhere else (a MySQL database, for example). I have already achieved my goal with an outline like this:

MySQL database table:

Main columns: zone (varchar), nginx main process start time (int, UNIX timestamp), month (varchar, e.g. 08-2016); these three columns should be defined as a composite primary key. A small script (PHP, Java, ...) reads the JSON data (provided by the module) every minute and updates the MySQL table. That's all 👍

Additional columns: bytesIn and bytesOut, updated on every reading (every minute).

The updating script can be scheduled to run using CRON.
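
A sketch of that table as MySQL DDL, issued through PDO to stay in PHP like the reader sketch above (the column names are illustrative, not the repository's actual schema):

<?php
// Hypothetical schema matching the outline above: one row per
// (zone, nginx start time, month); bytesIn/bytesOut are refreshed each minute.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=vts_stats', 'vts', 'secret');
$pdo->exec(
    'CREATE TABLE IF NOT EXISTS traffic (
         zone       VARCHAR(255)    NOT NULL,
         start_time INT UNSIGNED    NOT NULL,  -- nginx main process start (UNIX timestamp)
         month      VARCHAR(7)      NOT NULL,  -- e.g. "08-2016"
         bytesIn    BIGINT UNSIGNED NOT NULL DEFAULT 0,
         bytesOut   BIGINT UNSIGNED NOT NULL DEFAULT 0,
         PRIMARY KEY (zone, start_time, month)
     )'
);

Keying on the nginx start time means a restart opens a new row instead of overwriting the old one; a zone's monthly total is then simply SUM(bytesIn) and SUM(bytesOut) over all rows for that zone and month.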

@vozlt
Owner

vozlt commented Aug 11, 2016

OK, thanks for detailed description.

@bobthemighty

@acrolink - I wrote a collectd plugin for nginx_vts: https://github.com/bobthemighty/collectd-vts. We send the metrics to Riemann and then InfluxDB, but collectd also has a MySQL output plugin, I think.

@nottix

nottix commented Sep 8, 2016

Hi @acrolink, can you share your PHP solution with us? I have the same problem. Thank you.

@acrolink
Author

acrolink commented Sep 8, 2016

@nottix Sure, that should not be a problem. I wrote it in C#/Mono, not PHP. The working setup consists mainly of an executable that reads and stores the statistics and a small web app to display the data. The code is currently missing a vital mechanism to detect the beginning of a new month and reset the counters accordingly. I will fix that and make the full code available here within the next few days.
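
One hypothetical way to handle that rollover (a sketch, not acrolink's actual implementation): persist only the delta between successive readings, so a new month simply starts accumulating into a fresh row, and a restart shows up as the counter going backwards:

<?php
// Hypothetical delta helper: the VTS counters are cumulative since the last
// nginx start, so store only the growth between readings. A counter that
// went backwards means nginx restarted, in which case the whole current
// value counts as new traffic.
function counterDelta(int $previous, int $current): int
{
    return $current >= $previous ? $current - $previous : $current;
}

// Tiny demonstration of the two cases:
var_dump(counterDelta(1000, 1500)); // int(500): normal growth
var_dump(counterDelta(1500, 200));  // int(200): counter reset after a restart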

@nottix

nottix commented Sep 8, 2016

Thank you ;-)

@acrolink
Author

acrolink commented Sep 9, 2016

@nottix

Here you go:
https://github.com/acrolink/nginx-vts-records

I will add a README in the next few days. P.S. You need to install Mono for Linux, get nuget.exe, compile the executable (.exe) using Mono's xbuild, and schedule it to run using cron.

@nottix

nottix commented Sep 10, 2016

I'll try today, thank you ;-)


@acrolink
Author

I have just added a .sql file to create the DB structure.

@vozlt
Owner

vozlt commented Apr 4, 2017

I added that feature as mentioned above.
I haven't tested it much yet. 😆
It is the vhost_traffic_status_dump directive, used as follows:

http {
    vhost_traffic_status_zone;
    vhost_traffic_status_dump /var/log/nginx/vts.db;

    ...

    server {

        ...

    }
}

Please see the vhost_traffic_status_dump directive for detailed usage.

Latest commit: fc73722

@acrolink
Author

acrolink commented Apr 4, 2017

Perfect. Great work @vozlt, thank you. I will happily test and report back in the near future.
