[wp-meta] [Making WordPress.org] #5740: Add /?s= disallow rule to robots.txt

Making WordPress.org noreply at wordpress.org
Thu May 20 07:41:23 UTC 2021


#5740: Add /?s= disallow rule to robots.txt
-----------------------------+---------------------
 Reporter:  jonoaldersonwp   |       Owner:  (none)
     Type:  task             |      Status:  new
 Priority:  high             |   Milestone:
Component:  General          |  Resolution:
 Keywords:  seo performance  |
-----------------------------+---------------------

Comment (by jonoaldersonwp):

 https://core.trac.wordpress.org/ticket/52457 is related, but isn't a
 solution here, and isn't the same thing.

 Why would I create a ticket with no reason?

 Noindex tags prevent indexing of an already-crawled URL; robots.txt
 directives prevent crawling in the first place. They're different systems,
 with different effects.

 It's ''because'' our `?s` URLs redirect to `/search/` URLs that we have a
 problem - to the tune of ~500,000 spam URLs indexed in Google, damaging
 the WordPress brand and consuming crawl budget that we desperately need
 elsewhere.
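
 For reference, a minimal sketch of the kind of rule the ticket title
 proposes (the exact pattern would need confirming before deployment; the
 `/search/` line is an assumption on my part, not something stated in the
 ticket):

```
# Hypothetical robots.txt addition for wordpress.org (sketch, not final)
User-agent: *
# Block crawling of internal search result URLs
Disallow: /?s=
# Possibly also the pretty-permalink form that ?s= redirects to (assumption)
Disallow: /search/
```

 Note that `Disallow` only stops compliant crawlers from fetching these
 URLs; it doesn't remove URLs that Google has already indexed, which is the
 distinction drawn above between crawling and indexing controls.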

-- 
Ticket URL: <https://meta.trac.wordpress.org/ticket/5740#comment:7>
Making WordPress.org <https://meta.trac.wordpress.org/>
