@nuxtjs/robots

A Nuxt.js module that injects middleware to generate a robots.txt file.

Features

  • Nuxt 3 and Nuxt Bridge support
  • Generate robots.txt for static mode
  • Add middleware for robots.txt

Setup

  1. Add the @nuxtjs/robots dependency to your project

yarn add @nuxtjs/robots # or npm install @nuxtjs/robots

  2. Add @nuxtjs/robots to the modules section of nuxt.config.js
export default {
  modules: [
    // Simple usage
    '@nuxtjs/robots',

    // With options
    ['@nuxtjs/robots', { /* module options */ }]
  ]
}

Using top level options

export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

Options

configPath

  • Type: String
  • Default: robots.config

rules

  • Type: Object|Array
  • Default:
{
  UserAgent: '*',
  Disallow: ''
}
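Since rules accepts either a single object or an array, an array can be used to emit one rule block per entry. A minimal sketch, assuming rules is set alongside the other module options under the robots key (the user agents and paths are illustrative):

```javascript
// nuxt.config.js — values are illustrative only
export default {
  modules: ['@nuxtjs/robots'],
  robots: {
    rules: [
      // Each entry becomes its own block in the generated robots.txt
      { UserAgent: 'Googlebot', Disallow: '/private' },
      { UserAgent: '*', Disallow: '/admin' }
    ]
  }
}
```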

Robots config

If you need to use a function in any rule, you must create a config file and reference it through the configPath option:

export default {
  UserAgent: '*',
  Disallow: '/',
      
  // Be aware that this will NOT work on target: 'static' mode
  Sitemap: (req) => `https://${req.headers.host}/sitemap.xml`
}
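The config file above can then be wired up from nuxt.config.js via configPath. A sketch; the file location used here is an assumption, not a required path:

```javascript
// nuxt.config.js
export default {
  modules: ['@nuxtjs/robots'],
  robots: {
    // Hypothetical location of the robots config file shown above
    configPath: '~/robots.config.js'
  }
}
```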

The keys and values available:

  • UserAgent = User-agent
  • CrawlDelay = Crawl-delay
  • Disallow = Disallow
  • Allow = Allow
  • Host = Host
  • Sitemap = Sitemap
  • CleanParam = Clean-param

Note: keys are parsed case-insensitively and special characters are tolerated, so minor variations in a key name still resolve to the correct directive.
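As a sketch of how this mapping might behave (a hypothetical helper, not the module's actual implementation), keys can be normalised by lower-casing and stripping non-letter characters before lookup:

```javascript
// Hypothetical key-to-directive mapping — illustrates the behaviour
// described above, not the module's real code.
const KEY_MAP = {
  useragent: 'User-agent',
  crawldelay: 'Crawl-delay',
  disallow: 'Disallow',
  allow: 'Allow',
  host: 'Host',
  sitemap: 'Sitemap',
  cleanparam: 'Clean-param'
}

function renderRules (rules) {
  // Accept a single rule object or an array of them
  const list = Array.isArray(rules) ? rules : [rules]
  return list
    .map(rule =>
      Object.entries(rule)
        .map(([key, value]) => {
          // Normalise: lower-case and drop non-letter characters,
          // so 'UserAgent', 'user-agent' and 'USERAGENT' all match
          const directive = KEY_MAP[key.toLowerCase().replace(/[^a-z]/g, '')]
          return `${directive}: ${value}`
        })
        .join('\n')
    )
    .join('\n')
}

console.log(renderRules({ UserAgent: '*', Disallow: '/' }))
// → User-agent: *
// → Disallow: /
```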

License

MIT License

Copyright (c) Nuxt Community
