Disable logging when page is loaded by webcrawler #52
Comments
Have you considered only loading the Loggly logger when the user agent is not a web crawler? You'll have to look at your logs to see what user agent the crawler reports in order to exclude it. This is something you can do on your end and doesn't necessarily need to be in the library. Some people may want to know about errors when a crawler runs, but if you don't, you can add code to prevent it.
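A minimal sketch of that approach: check the user agent before setting up the logger at all. The `isCrawler` helper and its regex are assumptions, not part of loggly-jslogger; extend the pattern with the user agents you actually see in your logs.

```javascript
// Hypothetical helper: returns true for common crawler user-agent signatures.
// Add entries after checking your own access logs.
function isCrawler(userAgent) {
  return /bot|crawler|spider|crawling|googlebot|bingbot|slurp/i.test(userAgent || '');
}

function initLogging() {
  // Crawler detected: skip logger setup entirely so no events are sent.
  if (typeof navigator !== 'undefined' && isCrawler(navigator.userAgent)) {
    return;
  }
  // Normal Loggly setup would go here (e.g. configuring _LTracker).
}
```

Detecting crawlers by user agent is best-effort: well-behaved bots identify themselves, but you may still need to add patterns as new crawlers show up in your logs.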
Thanks for the response. Does Loggly capture the userAgent by default, or is that something I should send with events?
I don't think so; you should send it yourself. We usually capture it with access logs, such as those coming from a web, application, or proxy server.
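One way to send it is to attach the user agent to every event payload. The `withUserAgent` helper below is hypothetical, and the commented `_LTracker.push` call follows the loggly-jslogger pattern; verify it against the library's documentation.

```javascript
// Hypothetical helper: returns a copy of the event with the user agent
// attached, so events from crawlers can be filtered out in Loggly searches.
function withUserAgent(event, userAgent) {
  return Object.assign({}, event, { userAgent: userAgent });
}

// Usage in a browser with loggly-jslogger loaded (assumed setup):
// _LTracker.push(withUserAgent({ text: 'something failed' }, navigator.userAgent));
```

With the user agent on every event, you can either filter crawler traffic out in Loggly's search UI or skip sending those events client-side, as suggested above.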
Is there a way I can disable my Loggly logger whenever a webpage is loaded by a web crawler? I have a JavaScript library that logs using Loggly. It is installed on some client sites; when those sites are loaded by a web crawler, some "errors" occur and cause Loggly to log a bunch of false positives. How can I handle this?