harlan-zw / nuxt-seo

The complete SEO solution for Nuxt.

Home Page:https://nuxtseo.com

Groups don't work for Robots

mkummer225 opened this issue

Describe the bug

Currently the robots configuration in nuxt.config.ts is not working as expected. It appears to merge each group's disallow rules into the '*' user-agent's disallow rules.

To reproduce, within nuxt.config.ts:

robots: {
    disallow: ['/logout/','/profile/',...],
    groups: [{
        userAgents: ["gptbot"],
        disallow: ['/']
    }]
},

This is expected to produce a robots.txt that only disallows the listed paths for all robots and disallows everything for gptbot only. Instead, the generated robots.txt is:

# START nuxt-simple-robots (indexing disabled)
User-agent: *
Disallow: /

# END nuxt-simple-robots
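
For comparison, the robots.txt the report expects would look roughly like the following (a sketch based on the config above; the exact comment markers and ordering may differ):

# START nuxt-simple-robots
User-agent: *
Disallow: /logout/
Disallow: /profile/

User-agent: gptbot
Disallow: /

# END nuxt-simple-robots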

Reproduction

No response

System / Nuxt Info

No response

Hi, sorry for the delay.

The issue is that you are using the key userAgents; it should be userAgent. When no userAgent is provided, the group falls back to the wildcard user agent.
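
For reference, a corrected version of the config from the report would look roughly like this (a sketch assuming userAgent accepts a single string per group; check the module docs for the exact option shape):

robots: {
    // paths disallowed for every crawler (the wildcard '*' group)
    disallow: ['/logout/', '/profile/'],
    groups: [{
        // singular key: userAgent, not userAgents
        userAgent: 'gptbot',
        // block everything for gptbot only
        disallow: ['/']
    }]
},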

I noticed this is also an issue with the documentation, so I've pushed up a fix for that.