This destroys statistics... it would start counting robots, which would clutter the stats with a lot of non-user data. I suggest keeping it out, or documenting it only for people who understand what they are doing...
Normally you don't want to track robots; it's much more important to understand how users move through a site. You will not earn money from robots crawling your site (you lose money, since you pay for CPU time and traffic), but users may give you something back. :-)
If you distinguish between bots and humans, and store and use the data separately, you gain further insights:
2nd: which bots and spiders visit your site?
From this you can see how popular your site is (the more engines, the broader the reach), and together with
3rd: the frequency of bot and spider visits,
you can see how your site develops over time, or identify problems, e.g. when Googlebot visits you significantly less often than in the weeks before.
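A minimal sketch of the idea above, assuming visit data is available as (date, user-agent) pairs. The bot signature list and function names are illustrative assumptions, not Piwik's actual detection logic:

```python
from collections import Counter
from datetime import date

# Illustrative, deliberately incomplete list of crawler UA substrings
# (an assumption for this sketch, not Piwik's real list).
BOT_SIGNATURES = ["Googlebot", "bingbot", "Slurp", "DuckDuckBot"]

def classify(user_agent: str) -> str:
    """Label a visit 'bot' or 'human' by naive UA substring matching."""
    return "bot" if any(sig in user_agent for sig in BOT_SIGNATURES) else "human"

def weekly_bot_counts(visits):
    """Count bot visits per ISO week, so a drop (e.g. Googlebot visiting
    less than in previous weeks) becomes visible in the numbers."""
    counts = Counter()
    for day, ua in visits:
        if classify(ua) == "bot":
            year, week, _ = day.isocalendar()
            counts[(year, week)] += 1
    return counts

visits = [
    (date(2024, 1, 2), "Mozilla/5.0 (compatible; Googlebot/2.1)"),
    (date(2024, 1, 3), "Mozilla/5.0 (Windows NT 10.0) Firefox/121.0"),
    (date(2024, 1, 9), "Mozilla/5.0 (compatible; bingbot/2.0)"),
]
```

Keeping the bot counts in a separate series like this is exactly what keeps them from cluttering the "usual" visitor statistics.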
So you can indeed get useful information, as long as you don't mix it up with the "usual" statistics.
A UA string is not reliable. Robots sometimes present themselves as IE to gain access to a site. Additionally, it is nearly impossible to maintain a list of all existing robots, and Piwik does not distinguish between UAs. If it did, that would be good, but I don't think this is implementable in the near future.
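To illustrate why UA matching alone is not enough: a crawler that sends a browser-like UA string passes any substring check. The helper below is a hypothetical naive filter written for this example, not Piwik code:

```python
# Hypothetical, incomplete bot list -- the core weakness described above.
KNOWN_BOTS = ["Googlebot", "bingbot", "Slurp"]

def looks_like_bot(ua: str) -> bool:
    """Naive check: only catches bots that honestly announce themselves."""
    return any(bot in ua for bot in KNOWN_BOTS)

# A robot masquerading as Internet Explorer slips straight through:
spoofed = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1)"
```

Here `looks_like_bot(spoofed)` returns False, even though the request came from a crawler, so any stats built on such a filter undercount bots by an unknown amount.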
PS: 99% of all users have JS enabled; without it they cannot browse the new "Web 2.0" at all and will not be able to see many sites... I know that a few big companies switched off JS on their desktops in the past... I wouldn't worry about them any longer.
Note: restoring the noscript tag means putting it back; it doesn't mean "count visits with JS disabled". Piwik has never counted visits with JS disabled and won't in core, but plugins could use this and report on bots, etc.