Robots.txt is a middleware plugin for Traefik that appends rules to your website's /robots.txt file, based on the ai.robots.txt list or on custom rules.
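
When the ai.robots.txt list is enabled, the middleware appends the AI-crawler blocklist published by the ai.robots.txt project to the served file. As a rough sketch of what the appended entries look like (the exact user agents come from the upstream list and change over time):

```
User-agent: GPTBot
User-agent: ClaudeBot
Disallow: /
```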
```yaml
# Static configuration
experimental:
  plugins:
    example:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.1.0
```
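
The same static configuration can also be passed as Traefik CLI flags instead of a file; a sketch assuming the plugin key `example` used above:

```
--experimental.plugins.example.modulename=github.com/solution-libre/traefik-plugin-robots-txt
--experimental.plugins.example.version=v0.1.0
```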
```yaml
# Dynamic configuration
http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    robots-txt:
      plugin:
        traefik-plugin-robots-txt:
          aiRobotsTxt: true
```

| Name | Description | Default value | Example |
|---|---|---|---|
| aiRobotsTxt | Enable the retrieval of the ai.robots.txt list | false | true |
| additionalRules | Add additional rules at the end of the file | | \nUser-agent: *\nDisallow: /private/\n |
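
Both options can be combined; a sketch of a middleware definition that fetches the ai.robots.txt list and appends the custom rule from the example column above:

```yaml
http:
  middlewares:
    robots-txt:
      plugin:
        traefik-plugin-robots-txt:
          aiRobotsTxt: true
          additionalRules: "\nUser-agent: *\nDisallow: /private/\n"
```

With the router from the dynamic configuration above, requesting http://localhost/robots.txt should return the upstream file with these entries appended.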
Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.
The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors