jekyll / jekyll-sitemap

Jekyll plugin to silently generate a sitemaps.org compliant sitemap for your Jekyll site

Home Page: http://rubygems.org/gems/jekyll-sitemap


Overwriting existing robots.txt using jekyll-sitemap v1.4

danieldurrans opened this issue

I am using jekyll-sitemap v1.4 with Jekyll v4.2.1.

In my source directory I have a robots.txt file. I have also tried placing the file in my _pages directory.

When I build the site, I receive the following conflict warning (paths redacted):

dan@Copper source % bundle exec jekyll serve --livereload
Configuration file: /source/_config.yml
            Source: /source
       Destination: /source/_site
 Incremental build: disabled. Enable with --incremental
      Generating... 
       Jekyll Feed: Generating feed for posts
          Conflict: The following destination is shared by multiple files.
                    The written file may end up with unexpected contents.
                    /source/_site/robots.txt
                     - robots.txt
                     - /source/robots.txt
                    
                    done in 4.249 seconds.

Therefore I am not sure that the bug identified here is fixed.

commented

@danieldurrans
Hey, did you find a workaround for this? I am experiencing the same thing, on the same versions as well.

My bug report is incorrect. This only happens for me when robots.txt is in my /source/_pages/ directory.

Make sure you haven't got a robots.txt in any subdirectory. Just put a copy in the root of your source directory and you should be good.

Note that you may want to add a Sitemap directive into the file.

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
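
A hedged sketch of why the root copy matters: as far as I can tell, jekyll-sitemap only skips generating its own robots.txt when it finds one in the source root, so a copy kept in a subdirectory such as _pages both misses that check and still renders to /robots.txt, producing the conflict above. If you want to keep a copy under _pages anyway, excluding it in _config.yml should at least remove the duplicate output (the _pages path is taken from the report above):

```yaml
# _config.yml -- stop the subdirectory copy from being rendered to /robots.txt
exclude:
  - _pages/robots.txt
```

Note that with the subdirectory copy excluded, the plugin will emit its own minimal robots.txt, so if you need custom rules the root-copy approach above is the better fix.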
commented

Thanks for the reply. Mine is still acting strange. I have robots.txt in the root of my site, and it seems to work locally but not when I push it to my remote. Not sure what's going on; probably something configured wrong on my end.

Update: Just an FYI: I'm pretty sure my old robots.txt was being cached, since I use Cloudflare. So it was on my end; I just wasn't thinking. Force it to reload through the Cloudflare cache configuration, or just wait and it should update on its own within a day or so.

Same problem here. It would be very helpful to add an option to this plugin to disable robots.txt generation.

This issue has been automatically marked as stale because it has not been commented on for at least two months.

The resources of the Jekyll team are limited, and so we are asking for your help.

If this is a bug and you can still reproduce this error on the master/main branch, please reply with all of the information you have about it in order to keep the issue open.

If this is a feature request, please consider whether it can be accomplished in another way. If it cannot, please elaborate on why it is core to this project and why you feel more than 80% of users would find this beneficial.

This issue will automatically be closed in two months if no further activity occurs. Thank you for all your contributions.