
Cred Spider

Cred Spider is a tool designed to quickly scrape HTTP-hosted website content for secrets.

Table of Contents

  • Getting Started
  • Prerequisites
  • Configuration
  • Adjusting Ulimit
  • Obtaining IP Addresses
  • Running the Application

Getting Started

Follow the instructions below to set up your environment, configure the tool, and run Cred Spider.

Prerequisites

  • Masscan:
    Required to scan networks and generate the list of reachable hosts that Cred Spider will scrape (an example install command is shown below).
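
Masscan is packaged by most Linux distributions; on a Debian/Ubuntu-style system the package is typically named masscan, so installation could look like:

sudo apt install masscan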

Configuration

A default configuration file is available in the config directory. The file is named default.yml. You can modify it as needed to suit your environment.
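
To review the shipped defaults before changing anything, you can simply print the file:

cat config/default.yml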

Adjusting Ulimit

Before running Cred Spider, you must increase your system's ulimit (the maximum number of open file descriptors) to accommodate a large number of concurrent connections. For example, to set the limit to 5000, run:

ulimit -n 5000
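
Note that this setting only applies to the current shell session. You can verify the new limit before launching Cred Spider:

ulimit -n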

Obtaining IP Addresses

Use Masscan to generate a list of IP addresses and save them in a grepable format that Cred Spider can parse. Here is an example command:

sudo masscan 0.0.0.0/0 \
    -p80,8080,8000,8888,3000,4000,5000 \
    --excludefile config/massscan_exclude.conf \
    -oG ips.txt
  • Note: Adjust the IP range, ports, and exclusion file as needed for your network.
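
Because 0.0.0.0/0 sweeps the entire public Internet, a safer first run targets a range you control with a capped packet rate; the subnet and rate below are only illustrative:

sudo masscan 10.0.0.0/24 \
    -p80,8080,8000,8888,3000,4000,5000 \
    --rate 1000 \
    -oG ips.txt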

Running the Application

Cred Spider can be executed in two modes:

Development Mode

For development and testing, you can run the application using Cargo:

cargo run ips.txt
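
If you need to pass flags intended for the binary itself rather than for Cargo, place them after a -- separator, for example:

cargo run -- ips.txt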

Production Build (Recommended)

For improved performance, build and run the application in release mode:

  1. Build the release version:

    cargo build --release

  2. Run the executable:

    target/release/cred-spider ips.txt
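
Putting the steps together, an end-to-end session might look like the following; the subnet, rate, and file-descriptor limit are examples and should be adjusted for your environment:

ulimit -n 5000
sudo masscan 10.0.0.0/24 -p80,8080,8000,8888,3000,4000,5000 \
    --excludefile config/massscan_exclude.conf -oG ips.txt
cargo build --release
target/release/cred-spider ips.txt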
