roach-php / core

The complete web scraping toolkit for PHP.

Home Page: https://roach-php.dev

spatie/robots-txt overwrites default Laravel robots.txt

Xoshbin opened this issue · comments

I'm not sure whether this issue should be reported here or in the https://github.com/spatie/robots-txt package, but since I'm using this package and it depends on spatie/robots-txt, I'll write it here.
I just discovered that none of the URLs on my website are being indexed; they are all blocked by robots.txt. After digging into it, the only thing I found that could be overwriting Laravel's default robots.txt file is spatie/robots-txt. I'm not using spatie/robots-txt directly; I'm only using https://github.com/roach-php.
Any help with, or confirmation of, this issue would be appreciated.
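For context, here is the difference between a permissive and a blocking robots.txt. The exact file served by a stock Laravel install may vary by version, but the default `public/robots.txt` is permissive, roughly:

```
User-agent: *
Disallow:
```

whereas a file that blocks all crawlers (the symptom described above) looks like:

```
User-agent: *
Disallow: /
```

Checking which of these is actually served at `/robots.txt` would help narrow down whether the file on disk was replaced.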

I looked into the spatie package, and it is only for parsing robots.txt; I believe it should not generate one in your webroot. Can you create a minimal reproducible example?
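To illustrate the point above: spatie/robots-txt exposes a read-only API for checking whether a URL may be crawled; it never writes a robots.txt file. A minimal sketch (the class and method names below are from the spatie/robots-txt README; `https://example.com` is a placeholder, and this assumes the package is installed via Composer):

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Spatie\Robots\Robots;

// Parse-only: reads the site's robots.txt and meta tags,
// then answers "may this URL be indexed?". Nothing is written to disk.
$robots = Robots::create();

if ($robots->mayIndex('https://example.com/some-page')) {
    // safe to crawl/index
}
```

If the robots.txt on disk is actually changing, the cause is more likely a deployment step or another package writing to `public/`, which is why a minimal reproducible example would help isolate it.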