[wp-trac] [WordPress Trac] #50456: Multisite robots.txt files should reference all network XML sitemaps
WordPress Trac
noreply at wordpress.org
Wed Jun 24 08:25:34 UTC 2020
#50456: Multisite robots.txt files should reference all network XML sitemaps
----------------------------+------------------------------
Reporter: jonoaldersonwp | Owner: (none)
Type: defect (bug) | Status: new
Priority: normal | Milestone: Awaiting Review
Component: Sitemaps | Version:
Severity: normal | Resolution:
Keywords: seo | Focuses: multisite
----------------------------+------------------------------
Comment (by jonoaldersonwp):
Hah, challenges galore!
In order:
- That long response time on very large networks is definitely going to
cause issues. I figure we can either cache the assembled sitemap
references in a non-expiring transient, or set a sensible cutoff (100
sites?) beyond which we don't run this process ''at all'' (on the premise
that a network that large/complex should really be running an SEO plugin
and/or caching plugin to manage this, and that this working on ''some''
small-to-mid-sized networks is better than none at all). See the sketch
after this list.
- When sitemaps are disabled on the main site but enabled on child sites,
that feels like an edge case with intentional non-default behaviour; in
that scenario, site owners should be using an SEO plugin and/or a custom
robots.txt file.
- Sitemaps disabled on child sites are problematic, but not the end of the
world; a robots.txt reference to a sitemap which doesn't exist shouldn't
cause much harm beyond some periodic crawling of that URL. As before,
this is behaviour which overrides the default, in which case folks should
be managing it with a plugin or a custom robots.txt file.
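For illustration, here's a minimal sketch of the transient-plus-cutoff
approach from the first point. The 100-site cap, the
`my_network_sitemaps` transient name, and the anonymous callback are all
placeholders, not proposed core API; `/wp-sitemap.xml` is core's default
sitemap index location.
{{{#!php
<?php
add_filter( 'robots_txt', function ( $output ) {
	if ( ! is_multisite() ) {
		return $output;
	}

	$lines = get_transient( 'my_network_sitemaps' );

	if ( false === $lines ) {
		// Fetch one more than the cap so we can tell when we're over it.
		$site_ids = get_sites( array( 'fields' => 'ids', 'number' => 101 ) );

		// Bail entirely on very large networks; an SEO and/or caching
		// plugin should handle those.
		if ( count( $site_ids ) > 100 ) {
			return $output;
		}

		$lines = '';
		foreach ( $site_ids as $site_id ) {
			// Core's sitemap index for each site in the network.
			$lines .= 'Sitemap: ' . get_home_url( $site_id, '/wp-sitemap.xml' ) . "\n";
		}

		// Non-expiring transient, per the suggestion above.
		set_transient( 'my_network_sitemaps', $lines );
	}

	return $output . "\n" . $lines;
} );
}}}
Since the transient never expires, it would also need invalidating when
the network changes, e.g. on the `wp_insert_site` and `wp_delete_site`
actions.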
--
Ticket URL: <https://core.trac.wordpress.org/ticket/50456#comment:9>
WordPress Trac <https://core.trac.wordpress.org/>
WordPress publishing platform