Reconr is a tool that automates pentesting tasks. The core concept revolves around workflow files, which define a list of tasks that can be applied to multiple targets. This approach simplifies repetitive pentesting processes and improves efficiency.
This tool has been developed by Yann Grossenbacher while working for Syret GmbH.
You will need to install Docker and Golang. I would also suggest installing jq for processing the output. If you're installing it on Ubuntu, just use this script:
```
# Clone the repository
git clone https://github.com/your-username/reconr.git

# Change directory
cd reconr

# Build the binary
go build -o reconr cmd/reconr/main.go
```

- Edit config.yaml
- Run the workflow

```
./reconr -config ./config.yaml -workflow ./workflow.yaml
```

- Check the results

```
cd out/[target.com]
cat subdomain.txt
cat nuclei.txt | jq "[.info.name,.info.severity,.host]"
```

Example configuration:

```yaml
---
workPath: "./out"
mountWork: "/mount"
logfile: "./log"
scopeFileName: "scope.txt"
target: "scanme.sh"
proxy: "http://127.0.0.1:8080"
configPath: "./config"
mountConfig: "/config"
# If you define no scope, there will be no scope validation
scope:
#- 192.168.10.0/24
#- 192.168.20.0-192.168.20.30
```

The workPath is the path where all the tools run. The configPath is a folder that can be used to store wordlists or tool configuration files.
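The scope entries accept both CIDR blocks and dash-separated IP ranges. As a rough illustration of what that means (this is not Reconr's actual code), here is how such an entry can be checked with Python's standard ipaddress module:

```python
import ipaddress

def scope_contains(entry: str, ip: str) -> bool:
    """Check whether an IP falls inside one scope entry.

    Handles the two shapes used in the config above: a CIDR block
    ("192.168.10.0/24") or a dash range ("192.168.20.0-192.168.20.30").
    Illustration only -- not Reconr's actual implementation.
    """
    addr = ipaddress.ip_address(ip)
    if "-" in entry:
        start, end = (ipaddress.ip_address(p) for p in entry.split("-", 1))
        return start <= addr <= end
    return addr in ipaddress.ip_network(entry, strict=False)

print(scope_contains("192.168.10.0/24", "192.168.10.42"))             # True
print(scope_contains("192.168.20.0-192.168.20.30", "192.168.20.99"))  # False
```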
Example of a simple workflow. You can run multiple commands for every task, and a new container will be created for each task.
```yaml
---
tasks:
  1:
    name: "subdomain"
    commands:
      - "subfinder -d {{ .Target }} -duc -cs -pc /config/subfinder-config.yaml -json -o subfinder.txt"
      - "cat subfinder.txt | jq -r .host > subdomain.txt"
      - "echo {{ .Target }} >> subdomain.txt"
  2:
    name: "validateSubdomain"
    commands:
      - "httpx -l subdomain.txt -duc -td -cdn -json -o httpx.txt"
```

All commands are executed by default in /mount, the folder that is mounted in the container. You can also use /config, which is the same for all targets, so it's a great place to store wordlists or configuration files.
This is how the default workflow works: the IPs or ranges you define in the config file get copied into scope.txt. Example of using scope validation:
```yaml
1:
  name: "validateScope"
  commands:
    - "cat domains.txt | grepcidr -f scope.txt > validDomains.txt"
```

For now, you can use two fields that will be processed: Target and Proxy. Both are defined in the configuration file.
```
subfinder -d {{ .Target }}
nuclei -l validPorts.txt -p {{ .Proxy }}
```
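The fields use Go-style template syntax ({{ .Target }}, {{ .Proxy }}). As a minimal sketch of the effect, assuming plain string substitution (Reconr is written in Go, and its real rendering may differ):

```python
# Hypothetical illustration of how template fields in a command line are
# filled in from the config. Reconr uses Go's {{ .Field }} syntax; this
# Python stand-in only mimics the visible effect.
config = {"Target": "scanme.sh", "Proxy": "http://127.0.0.1:8080"}

def render(command: str, cfg: dict) -> str:
    for field, value in cfg.items():
        command = command.replace("{{ ." + field + " }}", value)
    return command

print(render("subfinder -d {{ .Target }}", config))
print(render("nuclei -l validPorts.txt -p {{ .Proxy }}", config))
```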
You can find the logs under /log/targetname.log. If you want to see the logs live, you can use tail:

```
tail -n 10 -f target.com.log
```
If you want to get the logs of a currently running task, use docker logs:

```
docker logs reconrX -f
```

X being the task number.
A lot of the tools have very interesting options that are hidden in the documentation; this is especially true for the ProjectDiscovery tools.
With the default workflow, a gf folder is created, and it's worth a look. You will find some interesting endpoints to dig into further. You can also write your own patterns and add them to the config folder.
Nuclei is a great tool, but it's even better with your own templates. You can add templates to the config folder.
jq is the perfect tool for finding the interesting stuff in the JSON output. Looking for Windows servers?

```
cat httpx.txt | grep "Windows Server" | jq "[.tech,.host]"
```

Looking for the URLs that returned a 500 error?

```
cat httpx.txt | grep '"status_code":500' | jq .url
```
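These filters work because httpx writes one JSON object per line. For readers unfamiliar with jq, here is the same 500-error filter as a Python sketch, over two made-up sample lines (real httpx output carries many more fields):

```python
import json

# Two made-up lines in httpx's JSON-lines shape; only "url" and
# "status_code" are shown here.
httpx_output = '''{"url":"https://a.scanme.sh","status_code":200}
{"url":"https://b.scanme.sh","status_code":500}'''

records = [json.loads(line) for line in httpx_output.splitlines()]
errors = [r["url"] for r in records if r["status_code"] == 500]
print(errors)  # ['https://b.scanme.sh']
```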
See the jq manual for more: https://jqlang.github.io/jq/manual/

