There are 20 repositories under the robots-parser topic.
Node.js robots.txt parser with support for wildcard (*) matching.
An extensible robots.txt parser and client library, with full support for every directive and specification.
:robot: robots.txt as a service. Crawls, downloads, and parses robots.txt files to check rules through an API.
Alternative robots parser module for Python
A lightweight robots.txt parser for Node.js with support for wildcards, caching and promises.
A lightweight and simple robots.txt parser for Node.js.
Visual App for Testing URLs and User-agents blocked by robots.txt Files
🤖 Ruby gem wrapper around Google's Robotstxt Parser C++ library.
A parser for robots.txt with support for wildcards. See also RFC 9309.
💧 Test your robots.txt with this testing tool. Check whether a URL is blocked, which statement blocks it, and for which user agent. You can also check whether the page's resources (CSS and JavaScript) are disallowed. Robots.txt files help you guide how search engines crawl your site and can be an integral part of your SEO strategy.
RFC 5234 spec-compliant robots.txt builder and parser. 🦾
Parse robots.txt and traverse sitemaps.
The repository contains Google's robots.txt parser and matcher as a C++ library (compliant to C++17).
Typescript robots.txt parser with support for wildcard (*) matching.
A robots.txt validator for Python.
Parsers for robots.txt (aka Robots Exclusion Standard / Robots Exclusion Protocol), Robots Meta Tag, and X-Robots-Tag
Fully native robots.txt parsing component without any dependencies.
A small, tested, no-frills parser of robots.txt files in Swift.
Python bindings for Google's robots.txt parser C++ library.
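As a quick illustration of what these libraries do, here is a minimal sketch using Python's standard-library parser, `urllib.robotparser`. Note that the stdlib parser does not support `*` wildcards in rule paths, which is one reason several of the projects above exist; the bot name and URLs below are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# A tiny example robots.txt: block everything under /private/ for all agents.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch(useragent, url) answers: may this agent crawl this URL?
print(parser.can_fetch("MyBot", "https://example.com/private/secret.html"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))           # True
```

In production you would typically call `set_url(...)` and `read()` to fetch the live robots.txt instead of parsing an inline string.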