t1gor / Robots.txt-Parser-Class

PHP class for robots.txt parsing

Suggestion: Return null if rule parsing error?

JanPetterMG opened this issue · comments

Sometimes, when the robots.txt file cannot be parsed properly, we get problematic results. This can happen with malformed robots.txt files and doesn't always mean there is an issue with the code.

For example:
`isAllowed('/')` returns `(bool) false`
and
`isDisallowed('/')` returns `(bool) false`

For now, the only way to check for errors is to call both isAllowed and isDisallowed and then verify that the results look consistent.
What about doing this check before returning the result, and returning NULL instead, for example?
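A minimal sketch of that idea, assuming we can detect a parse failure up front (the class name, property names, and the "no recognizable directive" heuristic below are all hypothetical, not the library's actual API):

```php
<?php
// Hypothetical sketch: return null from isAllowed() when the robots.txt
// content could not be parsed, instead of a misleading boolean.
class RobotsTxtChecker
{
    private bool $parseOk;
    private array $disallowed;

    public function __construct(string $content)
    {
        // Treat a file with no recognizable directive as a parse failure.
        $this->parseOk = (bool) preg_match(
            '/^\s*(user-agent|allow|disallow)\s*:/im', $content
        );
        preg_match_all('/^\s*disallow\s*:\s*(\S+)/im', $content, $m);
        $this->disallowed = $m[1];
    }

    // true/false on success, null when the file could not be parsed.
    public function isAllowed(string $path): ?bool
    {
        if (!$this->parseOk) {
            return null; // caller must distinguish null from false
        }
        foreach ($this->disallowed as $rule) {
            if (str_starts_with($path, $rule)) {
                return false;
            }
        }
        return true;
    }
}
```

With a nullable return like this, callers have to use a strict comparison (`=== false` / `=== null`) rather than a plain boolean check, which is exactly the ambiguity the exception approach below the fold avoids.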

Any thoughts?

Pull Request #25

I would rather go with throwing an Exception if the file is bad, but I guess for now null is ok. As already commented on the pull request - please fix the test also. Otherwise - the code looks ok 👍 thanks for the effort!

I agree, better to throw an exception. I just updated my own code locally, and I'll submit it once I get PHPUnit up and running. Never used it before, so it may take some time to figure out how it works. (I'm not a pro, just a hobby developer)...
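The exception-based approach agreed on above could look roughly like this (the exception class, function name, and validity heuristic are illustrative assumptions, not the library's actual code):

```php
<?php
// Hypothetical sketch: throw on an unparseable file instead of returning
// null, so bad input can never be mistaken for a real allow/disallow answer.
class RobotsTxtParseException extends \RuntimeException {}

function parseRobotsTxt(string $content): array
{
    // Consider the file invalid when no known directive appears.
    if (!preg_match('/^\s*(user-agent|allow|disallow)\s*:/im', $content)) {
        throw new RobotsTxtParseException('robots.txt could not be parsed');
    }
    preg_match_all('/^\s*disallow\s*:\s*(\S+)/im', $content, $m);
    return ['disallow' => $m[1]];
}

// Callers wrap the parse in try/catch and decide how to handle bad files:
try {
    $rules = parseRobotsTxt("User-agent: *\nDisallow: /tmp");
} catch (RobotsTxtParseException $e) {
    // e.g. fall back to "allow everything" or log and skip the host
}
```

The advantage over a null return is that the error path is explicit and cannot be silently ignored by a loose boolean check.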

Ok. Should be merged. Can we close this?

Yup, now it's all good :)