| Message ID | 20240301133724.835666-1-mark@klomp.org |
|---|---|
| State | New |
| Series | [COMMITTED,htdocs] robots.txt: Disallow various wiki actions |
On Fri, 1 Mar 2024, Mark Wielaard wrote:

> It is fine for robots to crawl the wiki pages, but they should perform
> actions, generate huge diffs, search/highlight pages or generate
> calendars.

s/should/should not/ :-)

I see your patch does exactly that - thank you!

Gerald
```diff
diff --git a/htdocs/robots.txt b/htdocs/robots.txt
index 057c5899..36be4d13 100644
--- a/htdocs/robots.txt
+++ b/htdocs/robots.txt
@@ -14,4 +14,8 @@ Disallow: /bugzilla/show_bug.cgi*ctype=xml*
 Disallow: /bugzilla/attachment.cgi
 Disallow: /bugzilla/showdependencygraph.cgi
 Disallow: /bugzilla/showdependencytree.cgi
+Disallow: /wiki/*?action=*
+Disallow: /wiki/*?diffs=*
+Disallow: /wiki/*?highlight=*
+Disallow: /wiki/*?calparms=*
 Crawl-Delay: 60
```
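A note on the patterns above: the original robots.txt standard has no wildcards, so these rules rely on Google-style `*` matching, which major crawlers support but simpler ones may not. A minimal sketch (not part of the patch, just an illustration) of how such a matcher could interpret the added rules against a URL path plus query string:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    # Translate a robots.txt Disallow pattern using Google-style '*'
    # wildcards into a regex anchored at the start of the path.
    # '*' matches any run of characters; everything else is literal.
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    return re.compile("^" + pattern)

def is_disallowed(path: str, rules: list[str]) -> bool:
    # A URL is disallowed if any Disallow pattern matches its
    # path (including the '?query' part, as in the wiki rules).
    return any(rule_to_regex(rule).match(path) for rule in rules)

# The four rules added to htdocs/robots.txt by this patch.
rules = [
    "/wiki/*?action=*",
    "/wiki/*?diffs=*",
    "/wiki/*?highlight=*",
    "/wiki/*?calparms=*",
]

print(is_disallowed("/wiki/HomePage?action=edit", rules))  # True
print(is_disallowed("/wiki/HomePage", rules))              # False
```

Plain page views like `/wiki/HomePage` stay crawlable; only the action, diff, highlight, and calendar query variants are blocked.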