[wp-trac] [WordPress Trac] #50456: Multisite robots.txt files should reference all network XML sitemaps
WordPress Trac
noreply at wordpress.org
Tue Jun 23 21:55:33 UTC 2020
#50456: Multisite robots.txt files should reference all network XML sitemaps
----------------------------+------------------------------
Reporter: jonoaldersonwp | Owner: (none)
Type: defect (bug) | Status: new
Priority: normal | Milestone: Awaiting Review
Component: Sitemaps | Version:
Severity: normal | Resolution:
Keywords: seo | Focuses: multisite
----------------------------+------------------------------
Comment (by pbiron):
I've got a working patch, but this ''might'' be a problem for **large
networks**.
The largest network I have access to has a little more than 2000 sites.
With the patch applied, it takes an extra 14 seconds (on my local machine)
to generate `robots.txt` (because of the calls to
`switch_to_blog()/restore_current_blog()`, necessary to get the sitemap
index URL for each subsite). I can only imagine how much extra time it
would take for a truly **large network** (e.g., 10,000+ sites).
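
For illustration, here is a rough sketch of that general approach (''not''
the actual patch; the core APIs used are real, but the exact wiring is only
an example). It appends a `Sitemap:` line per site via the `robots_txt`
filter, and the per-site `switch_to_blog()` round trip in the loop is what
accounts for the extra generation time described above:

{{{#!php
<?php
/**
 * Illustrative sketch only: add a "Sitemap:" line for every site in the
 * network to robots.txt. Assumes the core sitemaps server (WP 5.5+).
 */
add_filter(
	'robots_txt',
	function ( $output, $public ) {
		if ( ! $public || ! is_multisite() ) {
			return $output;
		}

		// All site IDs in the current network. 'number' => 0 lifts the
		// default cap, which is itself costly on huge networks.
		$site_ids = get_sites(
			array(
				'network_id' => get_current_network_id(),
				'fields'     => 'ids',
				'number'     => 0,
			)
		);

		foreach ( $site_ids as $site_id ) {
			// The expensive part: switching blog context once per site
			// just to build that site's sitemap index URL.
			switch_to_blog( $site_id );
			$output .= "\nSitemap: " . wp_sitemaps_get_server()->index->get_index_url();
			restore_current_blog();
		}

		return $output . "\n";
	},
	10,
	2
);
}}}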
Does anyone have an idea whether that extra time would cause problems for
consumers of `robots.txt`? That is, would they time out trying to retrieve
it?
--
Ticket URL: <https://core.trac.wordpress.org/ticket/50456#comment:7>
WordPress Trac <https://core.trac.wordpress.org/>
WordPress publishing platform