# $Id: robots.txt,v 1.9.2.1 2008/12/10 20:12:19 goba Exp $
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

User-agent: *
Crawl-delay: 10
Sitemap: https://card.romanticcollection.ru/sitemap.xml

# Directories
Disallow: /includes/
Disallow: /images/

# Files
Disallow: /advancedate.php
Disallow: /login.php
Disallow: /forum/viewtopic.php?p=*
Disallow: /pickup.php?id=*
Disallow: /done.php?card_id=*
Disallow: /toprated.php?page=*
Disallow: /member.php?*
Disallow: /newcards.php?page=*
Disallow: /print.php?id=*
Disallow: /gbrowse.php?cat_id=*
Disallow: /create.php?card_id=*
Disallow: /postme/gen.php*
Disallow: /?*

# Yandex-specific directives
Clean-param: id&cat /mobile/*.php
Clean-param: page&sortby /cat/
Host: https://card.romanticcollection.ru