JayBizzle / Crawler-Detect

🕷 CrawlerDetect is a PHP class for detecting bots/crawlers/spiders via the user agent

Home Page: https://crawlerdetect.io
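
For context, basic detection with the package looks roughly like this (per the package's documented API; `isCrawler()` checks the current request's user agent, or an explicitly passed string):

```php
<?php

use Jaybizzle\CrawlerDetect\CrawlerDetect;

$detector = new CrawlerDetect;

// Check the user agent of the current request
if ($detector->isCrawler()) {
    // true if a crawler/bot/spider user agent was detected
}

// Or test an explicit user agent string
if ($detector->isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)')) {
    // true, Googlebot is in the bundled crawler list
}
```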

The ability to extend crawlers list for monitoring-like systems?

s-chizhik opened this issue · comments

Hello guys.

First of all, thank you for your package. It has been really helpful for the projects I've been working on.

Essentially, I'm referencing the existing issue #309, but from a slightly different point of view. My goal is not simply to add a few exotic bots to the list; I aim to exclude the custom-configured monitoring systems that we use, such as Zabbix, Munin, etc. They perform health checks via HTTP requests with a specific UA suffix that we configure ourselves, e.g. ${PROJECT_NAME}Monitoring, ${DOMAIN}Robot, or ${NODE_NAME}Zabbix.

And this is where we're stuck, because:

  • the package's code does not provide any way to extend the crawlers list
  • adding our project-specific UA suffixes to the bundled crawlers list would be of no benefit to the community

So my question is: is there any chance you'll reconsider your position on extensibility, or do you have a workaround in mind?

For now, the only (admittedly clunky) way I see to achieve this, short of forking the project, is to create custom classes that extend Crawlers and CrawlerDetect, so that CustomCrawlers contains the extra patterns for our monitoring systems and CustomCrawlerDetect uses it, roughly as sketched below.
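
Something like the following. This is only a rough sketch and assumes the current internals of the package (the Crawlers fixture exposing its patterns via getAll(), and CrawlerDetect holding the fixture in a protected $crawlers property plus a protected $compiledRegex built by compileRegex()), so treat those property/method names as assumptions about the present implementation, not a stable API; the extra patterns are placeholders for our real suffixes.

```php
<?php

use Jaybizzle\CrawlerDetect\CrawlerDetect;
use Jaybizzle\CrawlerDetect\Fixtures\Crawlers;

class CustomCrawlers extends Crawlers
{
    // Project-specific monitoring user agents (placeholder examples).
    protected $customData = array(
        'MyProjectMonitoring',
        'example\.comRobot',
        'node-01Zabbix',
    );

    public function getAll()
    {
        // Bundled patterns plus our own.
        return array_merge(parent::getAll(), $this->customData);
    }
}

class CustomCrawlerDetect extends CrawlerDetect
{
    public function __construct(array $headers = null, $userAgent = null)
    {
        parent::__construct($headers, $userAgent);

        // Swap in the extended list and rebuild the compiled regex.
        // Relies on protected internals ($crawlers, $compiledRegex,
        // compileRegex()), so it may break on a package upgrade.
        $this->crawlers = new CustomCrawlers();
        $this->compiledRegex = $this->compileRegex($this->crawlers->getAll());
    }
}
```

We would then instantiate CustomCrawlerDetect everywhere instead of CrawlerDetect, which is exactly the kind of duplication I'd prefer to avoid.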

Sorry for the late reply. We have discussed this internally again and we are still not sure it is something we would like to support.

How would you see this being implemented?

Hi, I have a use case for this too. What about a method addCustomUserAgentRegex(string $pattern) that appends the pattern to the crawlers array? See the sketch below for the behaviour I have in mind.
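
Since the method doesn't exist in the package yet, here is a hedged sketch of the intended behaviour expressed as a wrapper. ExtendableCrawlerDetect and everything inside it are hypothetical; it relies only on the public isCrawler() API, checking user-supplied patterns before delegating to the bundled detection:

```php
<?php

use Jaybizzle\CrawlerDetect\CrawlerDetect;

// Hypothetical wrapper illustrating the proposed addCustomUserAgentRegex().
class ExtendableCrawlerDetect
{
    private $inner;
    private $customPatterns = array();

    public function __construct()
    {
        $this->inner = new CrawlerDetect();
    }

    public function addCustomUserAgentRegex(string $pattern)
    {
        $this->customPatterns[] = $pattern;

        return $this;
    }

    public function isCrawler($userAgent = null)
    {
        // Check user-supplied patterns first (against an explicitly passed
        // UA string only, for brevity), then fall back to the package.
        foreach ($this->customPatterns as $pattern) {
            if ($userAgent !== null && preg_match('/'.$pattern.'/i', $userAgent)) {
                return true;
            }
        }

        return $this->inner->isCrawler($userAgent);
    }
}

// Usage:
// $detect = new ExtendableCrawlerDetect();
// $detect->addCustomUserAgentRegex('MyProjectMonitoring');
// $detect->isCrawler('Mozilla/5.0 MyProjectMonitoring/1.0'); // true
```

If the method lived inside CrawlerDetect itself, it would presumably append to the crawler patterns and recompile the regex instead, but the external behaviour would be the same.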

If someone wants to put together a PR as an initial discussion point, we can take it from there