RobotsTxt is a package to dynamically create robots.txt files. It's made to work with Laravel and native PHP.
Check out the RobotsTxt.php class for a full understanding of the functionality.
This package is a fork of the Robots package.
As usual with Composer packages, there are two ways to install. You can require the package from the command line:

composer require cybercog/robots-txt
Or add the following to the require section of your composer.json and then run composer update to install it:
{
    "require": {
        "cybercog/robots-txt": "^1.0"
    }
}
Once installed via Composer you need to add the service provider. Do this by adding the following to the 'providers' section of the application config (usually app/config/app.php):
Cog\RobotsTxt\Providers\RobotsTxtServiceProvider::class,
The quickest way to use RobotsTxt is to set up a callback-style route for robots.txt in your /app/routes.php file.
<?php

Route::get('robots.txt', function () {
    // If on the live server, serve a nice, welcoming robots.txt.
    if (App::environment() == 'production') {
        RobotsTxt::addUserAgent('*');
        RobotsTxt::addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        RobotsTxt::addDisallow('*');
    }

    return Response::make(RobotsTxt::generate(), 200, array('Content-Type' => 'text/plain'));
});
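In production, the route above would return a response along these lines (the exact formatting is up to the package's generate() method, so treat this as a sketch of the standard robots.txt directives, not guaranteed output):

```
User-agent: *
Sitemap: sitemap.xml
```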
Add a rule in your .htaccess for robots.txt that points to a new script/template/controller/route/etc. The code would look something like:
<?php

use Cog\RobotsTxt\RobotsTxt;

$robotsTxt = new RobotsTxt();
$robotsTxt->addUserAgent('*');
$robotsTxt->addSitemap('sitemap.xml');

// Send the response as plain text, which is what crawlers expect.
header('HTTP/1.1 200 OK');
header('Content-Type: text/plain');

echo $robotsTxt->generate();
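The .htaccess rule mentioned above might look like the following sketch. It assumes Apache with mod_rewrite enabled and that the script above is saved as robots.php (the file name is an assumption, adjust it to your setup):

```apache
# Serve robots.txt from the PHP script (robots.php is an assumed name).
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule ^robots\.txt$ robots.php [L]
</IfModule>
```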
And that's it! You can serve a different robots.txt per environment and keep the rules as simple or as detailed as you need.
Please refer to CONTRIBUTING.md for information on how to contribute to RobotsTxt and its related projects.
The RobotsTxt library is open-sourced software licensed under the MIT License.