# ========================================
# Optimized robots.txt for WikiMetz
# ========================================

# Googlebot
User-agent: Googlebot
Crawl-delay: 1

# Bingbot
User-agent: Bingbot
Crawl-delay: 3
Disallow: /index.php?title=Spécial
Disallow: /index.php?title=Special
Disallow: /index.php?title=Sp%C3%A9cial

# DuckDuckBot
User-agent: DuckDuckBot
Crawl-delay: 5
Disallow: /index.php?title=Spécial
Disallow: /index.php?title=Special
Disallow: /index.php?title=Sp%C3%A9cial

# AhrefsBot
User-agent: AhrefsBot
Crawl-delay: 10
Disallow: /index.php?title=Spécial
Disallow: /index.php?title=Special
Disallow: /index.php?title=Sp%C3%A9cial

# SemrushBot
User-agent: SemrushBot
Crawl-delay: 10
Disallow: /index.php?title=Spécial
Disallow: /index.php?title=Special
Disallow: /index.php?title=Sp%C3%A9cial

# Aggressive bots
User-agent: DotBot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: MJ12bot
Disallow: /

# All other bots
User-agent: *
Crawl-delay: 2
Disallow: /index.php?title=Spécial
Disallow: /index.php?title=Special
Disallow: /index.php?title=Sp%C3%A9cial
Disallow: /load.php
Disallow: /api.php