
Robots.txt Traefik plugin

Table of Contents

  1. Description
  2. Setup
  3. Usage
  4. Reference
  5. Development
  6. Contributors

Description

Robots.txt is a middleware plugin for Traefik that adds rules, taken from the ai.robots.txt list or from your own custom rules, to the /robots.txt file served by your website.

Setup

Configuration

# Static configuration

experimental:
  plugins:
    traefik-plugin-robots-txt:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.1.0
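
If you run Traefik with Docker Compose and pass the static configuration as command-line flags, the same setup should translate to something like the snippet below. The service name and image tag are only placeholders; the flag names follow Traefik's usual lowercase mapping of the YAML keys above.

# docker-compose.yml (illustrative)
services:
  traefik:
    image: traefik:v3.0
    command:
      - "--experimental.plugins.traefik-plugin-robots-txt.modulename=github.com/solution-libre/traefik-plugin-robots-txt"
      - "--experimental.plugins.traefik-plugin-robots-txt.version=v0.1.0"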

Usage

# Dynamic configuration

http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    robots-txt:
      plugin:
        traefik-plugin-robots-txt:
          aiRobotsTxt: true
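
With this configuration, a request to http://localhost/robots.txt is routed to service-foo and passed through the middleware, which should complete the returned file with the rules from the ai.robots.txt list. An abridged, purely illustrative response could look like this (both the backend rules and the listed crawlers are assumptions):

# Rules already served by service-foo (assumed)
User-agent: *
Disallow: /admin/

# Rules appended by the plugin from the ai.robots.txt list (abridged, illustrative)
User-agent: GPTBot
User-agent: CCBot
Disallow: /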

Reference

Name              Description                                       Default value   Example
aiRobotsTxt       Enable the retrieval of the ai.robots.txt list    false           true
additionalRules   Add additional rules at the end of the file                       \nUser-agent: *\nDisallow: /private/\n
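
Putting the two options together, a middleware that both pulls in the ai.robots.txt list and appends a custom rule could be declared as follows. The middleware name robots-txt matches the Usage example above; the escaped \n sequences stand for literal newlines in the generated file.

# Dynamic configuration
http:
  middlewares:
    robots-txt:
      plugin:
        traefik-plugin-robots-txt:
          aiRobotsTxt: true
          additionalRules: "\nUser-agent: *\nDisallow: /private/\n"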

Development

Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.

Fork this repo on GitHub

Contributors

The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors
