fabianmichael / kirby-meta

All-in-one solution to all of your SEO/OpenGraph/JSON-LD needs. 👀


Unlisted pages: Meta-Robots always set to „noindex“

FynnZW opened this issue

commented

No matter what the page settings are, the HTML tag created is always:
<meta name="robots" content="noindex" />

This also seems to affect all children of unlisted pages.
I use unlisted for internal reasons, but the page is still public and should be found by robots.

Version: 0.4.0-beta
Kirby: 3.9.4

commented

I just noticed this is by design. From the readme:
robots.index

If a page is excluded from the sitemap or unlisted, the robots meta tag will always contain noindex.
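In other words, roughly this rule (a sketch for illustration only, not the plugin's actual code; it assumes Kirby's `$page->isUnlisted()` method, and the function name and the sitemap-exclusion flag are made up):

```php
// Rough sketch of the documented rule, not the plugin's implementation.
// $excludedFromSitemap is a hypothetical flag standing in for however the
// plugin decides that a page is excluded from the sitemap.
function robotsIndexValue(Kirby\Cms\Page $page, bool $excludedFromSitemap): string
{
    if ($page->isUnlisted() || $excludedFromSitemap) {
        return 'noindex'; // forced, regardless of the panel setting
    }

    return 'index'; // otherwise the panel/meta-tab setting would apply
}
```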

But in my opinion, that is wrong. I like using 'unlisted' e.g. for pages that are not supposed to show up in a menu but are still public and might matter for organic search.
Or just for pages that don't need sorting (/example-page instead of /1_example-page, because having a stable folder name can be useful when syncing the content).
It is also confusing that those options still appear in the panel's meta tab even though they have no effect.

@FynnZW Hi Fynn, thanks for your feedback. Unfortunately, I had to make a decision at some point and ended up excluding unlisted pages, mostly because they are labelled "unlisted". I used to rely on Kirby's sorting numbers for building navigation menus, but nowadays I prefer a pages/structure field or something similar for that task. Just using the page status often caused confusion and is not very flexible. Given that this plugin is already used by more than just a few people, I cannot change the behavior at this point without introducing a major breaking change and a possible security risk.

You can however use the sitemap.templates.includeUnlisted option to include unlisted pages as a workaround.
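For example, something along these lines in your Kirby config (a sketch only: the `fabianmichael.meta` option prefix and the value format of a list of template names are assumptions, so check the plugin readme for the exact key):

```php
<?php
// site/config/config.php — hedged example, assuming the option lives under
// the plugin's "fabianmichael.meta" prefix and accepts an array of template
// names whose unlisted pages should be included in the sitemap.
return [
    'fabianmichael.meta' => [
        'sitemap' => [
            'templates' => [
                'includeUnlisted' => [
                    'example-page', // hypothetical template name
                ],
            ],
        ],
    ],
];
```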