Linked and JS Discovery

Another way to widen our scope is to examine all the links of our main target. We can do this using Burp Suite Pro.

We can visit a seed/root and recursively spider all of its links, matching each against our target keyword with a regex, then examine those links... and their links, and so on... until we have found all sites that could be in our scope.

This is a hybrid technique that will find both roots/seeds and subdomains.

In Burp:

  • Turn off passive scanning

  • Set forms to auto-submit (if you’re feeling frisky)

  • Set scope to advanced control and use a “keyword” of the target name (not a normal FQDN)

  • Walk and browse the main site, then spider all hosts recursively!

Use the advanced scope controls before starting to spider.


After the 1st spider run we’ve now discovered a ton of linked URLs that belong to our project. 

Not only subdomains, but NEW seeds/roots (twtchapp.net, ext-twitch.tv, twitchsvc.net). 

We can also now spider these new hosts and repeat until we have Burp Spider fatigue.


Now that we have this data, how do we export it? 

Clumsily =(

  • Select all hosts in the site tree
  • Right-click the selected hosts (Pro only)
  • Go to “Engagement Tools” -> “Analyze target”
  • Save the report as an HTML file
  • Copy the hosts from the “Target” section


Linked Discovery (with GoSpider or hakrawler)

Linked discovery really just relies on using a spider recursively.

One of the most extensible spiders for general automation is GoSpider, written by j3ssiejjj. It can be used for many things and supports parsing JavaScript very well.

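For example, a first recursive pass might look like this (a hedged sketch; the flags are taken from each tool's README and may differ between versions):

# GoSpider: crawl the main site, including subdomains, 2 levels deep
gospider -s "https://www.twitch.tv" -d 2 --subs -o gospider_out

# hakrawler reads URLs from stdin
echo "https://www.twitch.tv" | hakrawler -subs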

Subdomain Enumeration (with SubDomainizer)

SubDomainizer by Neeraj Edwards is a tool with three purposes in analyzing JavaScript. It will take a page and:

  • Find subdomains referenced in JS files
  • Find cloud services referenced in JS files
  • Use the Shannon entropy formula to find potentially sensitive items in JS files

It takes a single page and scans all the JS files it references.
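A minimal run looks something like this (hedged; the -o output flag is an assumption, check the repo's README for current options):

python3 SubDomainizer.py -u https://www.twitch.tv -o subdomainizer_out.txt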


Subdomain Scraping

The next set of tools scrape domain information from all sorts of projects that expose databases of URLs or domains. 

New sources are coming out all the time so the tools must evolve constantly.

This is only a small list of sources; many more exist. They fall into a few categories: infrastructure sources, search sources, certificate search, and security sources.

Subdomain Scraping Example (Google)

The idea is simple: search site:twitch.tv, note the subdomains that appear, then exclude them (site:twitch.tv -www.twitch.tv -blog.twitch.tv) and repeat until no new hosts show up.

Subdomain Scraping (Amass)

For scraping subdomain data there are two industry-leading tools at the moment: Amass and Subfinder. They parse all the sources referenced in the categories above, and more.

Amass has the most sources, extensible output, brute-forcing, permutation scanning, and a ton of other modes to do additional analysis of attack surfaces.

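A hedged example of a basic passive scrape (flags per the amass documentation; verify against your installed version):

# Scrape passive sources for subdomains of the target
amass enum -passive -d twitch.tv -o amass_twitch.txt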

Amass also correlates these scraped domains to ASNs and lists the network ranges they appeared in. If a new ASN is discovered, you can feed it back to amass intel.

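For example (a hedged sketch; AS46489 is used here illustratively as a Twitch ASN):

# Pull root domains announced under an ASN and feed them back into enumeration
amass intel -asn 46489 -o asn_roots.txt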

OSINT

The fastest way to obtain a lot of subdomains is to search external sources. The most used tools are the following (for better results, configure the API keys):

# subdomains
bbot -t tesla.com -f subdomain-enum

# subdomains (passive only)
bbot -t tesla.com -f subdomain-enum -rf passive

# subdomains + port scan + web screenshots
bbot -t tesla.com -f subdomain-enum -m naabu gowitness -n my_scan -o .

amass enum [-active] [-ip] -d tesla.com
amass enum -d tesla.com | grep tesla.com # To just list subdomains

# Subfinder, use -silent to only have subdomains in the output
./subfinder-linux-amd64 -d tesla.com [-silent]

# findomain, use --quiet to only have subdomains in the output
./findomain-linux -t tesla.com [--quiet]

python3 oneforall.py --target tesla.com [--dns False] [--req False] [--brute False] run

assetfinder --subs-only <domain>

# Sudomy
https://github.com/Screetsec/Sudomy

vita -d tesla.com

theHarvester -d tesla.com -b "anubis, baidu, bing, binaryedge, bingapi, bufferoverun, censys, certspotter, crtsh, dnsdumpster, duckduckgo, fullhunt, github-code, google, hackertarget, hunter, intelx, linkedin, linkedin_links, n45ht, omnisint, otx, pentesttools, projectdiscovery, qwant, rapiddns, rocketreach, securityTrails, spyse, sublist3r, threatcrowd, threatminer, trello, twitter, urlscan, virustotal, yahoo, zoomeye"

There are other interesting tools/APIs that, even if not directly specialised in finding subdomains, can be useful for finding them, like:

# Get list of subdomains from the API (this is the API the crobat tool uses)
curl https://sonar.omnisint.io/subdomains/tesla.com | jq -r ".[]"

curl https://jldc.me/anubis/subdomains/tesla.com | jq -r ".[]"

# Get domains from the rapiddns free API
rapiddns(){
  curl -s "https://rapiddns.io/subdomain/$1?full=1" \
    | grep -oE "[\.a-zA-Z0-9-]+\.$1" \
    | sort -u
}
rapiddns tesla.com

# Get domains from the crt.sh free API
crt(){
  curl -s "https://crt.sh/?q=%25.$1" \
    | grep -oE "[\.a-zA-Z0-9-]+\.$1" \
    | sort -u
}
crt tesla.com

  • gau: fetches known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl for any given domain.

# Get subdomains from gau's found URLs
gau --subs tesla.com | cut -d "/" -f 3 | sort -u

# Get only subdomains from SubDomainizer
python3 SubDomainizer.py -u https://tesla.com | grep tesla.com

# Get only subdomains from subscraper; this already performs recursion over the found results
python subscraper.py -u tesla.com | grep tesla.com | cut -d " " -f

# Get info about the domain
shodan domain <domain>
# Get other pages with links to subdomains
shodan search "http.html:help.domain.com"

export CENSYS_API_ID=...
export CENSYS_API_SECRET=...
python3 censys-subdomain-finder.py tesla.com

python3 DomainTrail.py -d example.com

The Chaos project offers for free all the subdomains related to bug-bounty programs. You can also access this data using chaospy, or even access the scope used by the project.
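A hedged sketch using ProjectDiscovery's chaos client (an API key is required; flag names per its README):

export CHAOS_KEY=<your-api-key>
chaos -d tesla.com -silent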

You can find a comparison of many of these tools here: https://blog.blacklanternsecurity.com/p/subdomain-enumeration-tool-face-off

Subdomain Brute Force (DNS brute force)

  • Dnsenum: a multi-threaded Perl script to enumerate DNS information of a domain and to discover non-contiguous IP blocks.
dnsenum --threads 10 --dnsserver 8.8.8.8 --enum -p -o output.txt -f subdomains.txt example.com

  • Fierce: a reconnaissance tool that helps locate non-contiguous IP space and hostnames for specified domains.
fierce --domain example.com --wordlist subdomains.txt --threads 10 --output output.txt

  • dnsrecon: a Python script for performing DNS enumeration, including standard record lookups, zone transfers, and brute-forcing subdomains.
dnsrecon -d example.com -D subdomains.txt -t brt -o output.txt

  • dnscan: a Python-based tool designed to brute-force subdomains using a wordlist.
dnscan.py -d example.com -w subdomains.txt -o output.txt -t 10

  • Sublist3r: a Python tool designed to enumerate subdomains using various search engines and brute-forcing (the -b brute mode uses its bundled subbrute wordlist).
sublist3r -d example.com -b -t 10 -o output.txt

  • amass: an advanced open-source tool for network mapping and attack surface discovery.
amass enum -d example.com -brute -w subdomains.txt -o output.txt

  • massdns: a high-performance DNS stub resolver for bulk resolution and subdomain enumeration.
./bin/massdns -r resolvers.txt -t A -w output.txt -o S -s 10 example_subdomains.txt

  • gobuster: a tool used to brute-force URLs and DNS subdomains.
gobuster dns -d example.com -w subdomains.txt -t 10 -o output.txt

  • enum4linux: a Linux enumeration tool for Windows/Samba environments that can also perform DNS enumeration.
enum4linux -D example.com | tee output.txt

  • theHarvester: a tool designed to gather emails, subdomains, hosts, employee names, and open ports from different public sources (search engines, PGP key servers, and more).
theHarvester -d example.com -b all -f output.txt

  • subfinder: a subdomain discovery tool that uses passive online sources and can also perform brute-forcing.
subfinder -d example.com -w subdomains.txt -o output.txt -t 10

  • crt.sh: query the Certificate Transparency logs for subdomains of a domain (crt.sh is a web service; see the crt() helper function in the OSINT section above).

  • altdns: a tool that discovers subdomains by generating permutations, alterations, and mutations.
altdns -i subdomains.txt -o data_output -w words.txt -r -s output.txt

  • findomain: a tool to find subdomains by querying multiple sources simultaneously.
findomain -t example.com -u output.txt

  • dnsbrute: a simple DNS brute-force tool written in Python.
dnsbrute -d example.com -w subdomains.txt -t 10 -o output.txt

These tools should help with DNS enumeration and brute-forcing using a wordlist, and the outputs are directed to output.txt for your convenience. Make sure to replace example.com and subdomains.txt with the actual domain and wordlist file you intend to use.

Which wordlist 📄 to use?

The whole effort of DNS brute force is wasted if you don't use a good subdomain wordlist. Wordlist selection is the most important aspect of brute forcing. Let's have a look at some great wordlists (a usage sketch follows the list):

1. Assetnote best-dns-wordlist.txt (9 million) ⭐ Assetnote wordlists are the best; this is without doubt the best subdomain brute-forcing wordlist. It is highly recommended to run it on a VPS: running it on a home system will take hours, and the results won't be as accurate. This wordlist will definitely give you those hidden subdomains.

2. n0kovo n0kovo_subdomains_huge.txt (3 million) n0kovo created this wordlist by scanning the whole IPv4 space and collecting subdomain names from TLS certificates. You can check out this blog to see how well this wordlist performs compared to other big wordlists. If your target contains a lot of wildcards, this is the best wordlist for brute forcing (considering the computation bottleneck of wildcard filtering).

3. Smaller wordlist (102k) Created by six2dez, this one is suitable for running on a personal computer over a home internet connection.

4. Jhaddix’s all.txt

5. Daniel Miessler’s DNS Discovery

6. Commonspeak2

7. rajesh6927/subdomain-bruteforce-wordlist
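Whichever list you pick, it plugs into the brute-forcers above. A minimal hedged sketch pairing a wordlist with massdns (covered earlier) would be:

# Expand the wordlist into candidate FQDNs for the target
sed 's/$/.example.com/' best-dns-wordlist.txt > candidates.txt
# Bulk-resolve the candidates against trusted resolvers
./bin/massdns -r resolvers.txt -t A -o S -w resolved.txt candidates.txt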

Subdomain Automation Scripts (includes DNS zone transfer and Google dorking)

1. Passive Enumeration

If you are a bug bounty hunter or a tester, you can use the script below to passively enumerate subdomains.

#!/bin/bash
# Enhanced Passive Enumeration Bash Script
# Target domain
read -p "Enter the target domain: " targetDomain
echo "Performing enhanced passive enumeration on $targetDomain"
# Using dig for DNS information gathering
echo "Gathering DNS records..."
dig "$targetDomain" ANY +noall +answer
# ENUMERATION SECTION
# Subdomain enumeration with Subfinder
echo "Enumerating subdomains with Subfinder..."
subfinder -d $targetDomain -o subfinder_subs.txt
echo "Subfinder results saved to subfinder_subs.txt"
# Subdomain enumeration with Amass
echo "Enumerating subdomains with Amass..."
amass enum -d $targetDomain -o amass_subs.txt
echo "Amass results saved to amass_subs.txt"
# Subdomain enumeration with Sublist3r
echo "Enumerating subdomains with Sublist3r..."
sublist3r -d $targetDomain -o sublist3r_subs.txt
echo "Sublist3r results saved to sublist3r_subs.txt"
# Subdomain enumeration with Knockpy
echo "Enumerating subdomains with Knockpy..."
knockpy $targetDomain -o knockpy_subs.csv
echo "Knockpy results saved to knockpy_subs.csv"
# WAYBACK MACHINE HISTORICAL DATA
echo "Searching Wayback Machine for historical data..."
wget -O wayback-data.txt "https://web.archive.org/cdx/search/cdx?url=*$targetDomain*&output=text"
echo "Historical data saved to wayback-data.txt"
# Display completion message
echo "Enhanced passive enumeration completed successfully."

2. Active Enumeration

Active enumeration methods involve direct interaction with the target's domain system. This could include sending DNS or HTTP requests to the target’s servers. Tools like Sublist3r, DNSMap, or Amass perform DNS queries to find subdomains actively. While this method can yield more comprehensive results compared to passive techniques, it also increases the risk of detection. In an era where organizations are constantly on the alert for cyber threats, using active enumeration requires careful timing and justification.

Let’s walk through a simple bash script that utilizes naabu for fast port discovery and Nmap, a powerful tool for network discovery and security auditing, for detailed service scanning. This script aims to discover open ports on a target and fingerprint the services behind them.

Note: This script is for educational purposes; always seek permission before testing on live environments.

#!/bin/bash
# Enhanced active enumeration script using naabu and nmap.
if [ "$#" -ne 1 ]; then
  echo "Usage: $0 <target_domain_or_IP>"
  exit 1
fi
TARGET=$1
OUTPUT_DIR=$(mktemp -d -t enum-XXXXXXXXXX) # Creating a temporary directory for scan outputs.
echo "Output files will be saved in $OUTPUT_DIR"
# Stage 1: Fast port scanning with naabu.
echo "[*] Performing fast port scan with naabu on $TARGET..."
naabu -host $TARGET -o $OUTPUT_DIR/naabu_output.txt
# Extracting open ports for further scanning (naabu output lines look like host:port).
OPEN_PORTS=$(cat $OUTPUT_DIR/naabu_output.txt | cut -d ":" -f2 | tr '\n' ',' | sed 's/,$//')
if [ -z "$OPEN_PORTS" ]; then
  echo "No open ports found. Exiting."
  exit 1
fi
echo "[*] Found open ports: $OPEN_PORTS"
# Stage 2: Detailed service/version scan with nmap on the discovered ports.
echo "[*] Performing detailed scan with nmap on discovered ports..."
nmap -p $OPEN_PORTS -sV $TARGET -oN $OUTPUT_DIR/nmap_detailed_scan.txt
echo "Enumeration completed. Check $OUTPUT_DIR for detailed scan results."
# Cleanup
# Optionally, remove the temporary directory if you don't need the scan results.
# rm -rf $OUTPUT_DIR

3. Brute-Force Enumeration

This technique involves systematically guessing subdomain names and checking if they exist. Utilizing wordlists containing common subdomain names, tools like fierce or Knock can automate the process. As this method generates a high volume of DNS queries, it's easily detectable and might not always be efficient against well-configured rate limiting and monitoring defences. However, with the growing sophistication of bot detection and blocking mechanisms, brute-force enumeration tools have also evolved, now incorporating delay tactics and user-agent spoofing to mimic legitimate traffic.

Check out more payloads here.

Save the bash script as name.sh, then give it execute rights: chmod +x name.sh

#!/bin/bash
# Script for brute-force subdomain enumeration using fierce
# Make sure the user provides a domain name to scan
if [ "$#" -ne 1 ]; then
  echo "Usage: $0 <domain>"
  exit 1
fi
DOMAIN=$1
# Specify a wordlist. Adjust the path according to your setup.
# You can find numerous wordlists in tools like SecLists (https://github.com/danielmiessler/SecLists)
WORDLIST="/path/to/your/subdomains/wordlist.txt"
echo "Starting brute-force subdomain enumeration for $DOMAIN using fierce"
echo "This might take a while..."
# Run fierce with a delay option to mimic legitimate traffic, reducing the chance of blocking.
# Remove the '--delay' option or adjust it per the rate-limiting policies of the target domain.
fierce --domain $DOMAIN --wordlist $WORDLIST --delay 3
echo "Enumeration completed."

4. DNS Zone Transfer

DNS Zone Transfer (AXFR) is a type of active enumeration where the tester attempts to request a copy of the entire DNS zone for a domain. This is only possible if the DNS servers are misconfigured to allow such transfers to unauthorized users. Although this vulnerability is less common now due to better default configurations and awareness, it remains a critical check during a VAPT exercise. In today's environment, where misconfigurations are a leading cause of data breaches, verifying DNS zone transfer settings is integral.

The script first ensures you’ve provided a domain and its DNS server, then sets up a timestamped output directory to avoid overwriting past results. It performs DNS Zone Transfer checks using three separate functions for dig, host, and nslookup, with each function’s output saved in a unique text file within the designated directory for easy identification and analysis.

1. Prepare the Script:

  • Save the script to a file, for example as dns_zone_transfer_check.sh
  • Make the script executable by running chmod +x dns_zone_transfer_check.sh in your terminal.

2. Execute the Script:

  • Run the script with the domain and DNS server as arguments:
  • ./dns_zone_transfer_check.sh example.com ns.example.com
  • Replace example.com and ns.example.com with your target domain and its authoritative DNS server, respectively.
#!/bin/bash
# Comprehensive DNS Zone Transfer check using multiple tools.
if [ "$#" -ne 2 ]; then
  echo "Usage: $0 <Domain> <DNS Server>"
  exit 1
fi
DOMAIN=$1
DNSSERVER=$2
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
OUTPUT_DIR="DNS_Zone_Transfer_$TIMESTAMP"
mkdir -p $OUTPUT_DIR
# Function to perform DNS Zone Transfer using dig
function check_dig {
  echo "1. Using dig for DNS Zone Transfer..."
  dig @$DNSSERVER $DOMAIN AXFR > "$OUTPUT_DIR/dig_$DOMAIN.txt"
  echo "Done. Output saved to $OUTPUT_DIR/dig_$DOMAIN.txt"
  echo "---------------------------------------------------------------------"
}
# Function to perform DNS Zone Transfer using host
function check_host {
  echo "2. Using host for DNS Zone Transfer..."
  host -l $DOMAIN $DNSSERVER > "$OUTPUT_DIR/host_$DOMAIN.txt"
  echo "Done. Output saved to $OUTPUT_DIR/host_$DOMAIN.txt"
  echo "---------------------------------------------------------------------"
}
# Function to perform DNS Zone Transfer using nslookup
function check_nslookup {
  echo "3. Using nslookup for DNS Zone Transfer..."
  echo -e "server $DNSSERVER\nls -d $DOMAIN" | nslookup > "$OUTPUT_DIR/nslookup_$DOMAIN.txt"
  echo "Done. Output saved to $OUTPUT_DIR/nslookup_$DOMAIN.txt"
  echo "---------------------------------------------------------------------"
}
# Running all checks
echo "DNS Zone Transfer checks for $DOMAIN using server $DNSSERVER..."
echo "====================================================================="
check_dig
check_host
check_nslookup
echo "All checks complete. Review the outputs in the $OUTPUT_DIR directory."

5. Certificate Transparency Logs

Certificate Transparency logs are an increasingly popular source for subdomain enumeration. These publicly available logs are meant to prevent SSL certificates from being issued for a domain without the domain owner’s knowledge. Tools that search these logs, such as crt.sh, can reveal subdomains that have had SSL certificates issued, offering an updated view of an organization's internet-facing assets. With the rising enforcement of HTTPS and the proliferation of SSL certificates, leveraging CT logs for enumeration aligns with current security best practices.

Explanation:

    1. Checking Input: The script starts by checking if you’ve provided a domain as an argument. If not, it prints the correct usage format and exits.
    2. Setting Variables: The DOMAIN variable holds the domain to query. The USER_AGENT string is set to identify the script as a legitimate browser session while making requests. This helps in case certain services block non-browser user agents.
    3. Enumeration Process:
  • It utilizes curl to query crt.sh's JSON output for certificates related to the domain. The -s flag silences curl's progress output, and -A specifies the user agent.

  • The output is filtered through jq (a lightweight and flexible command-line JSON processor) to extract the domain names from each certificate entry.

  • The sed command strips the wildcard prefix (e.g., the *. in *.example.com), since wildcards are not specific subdomains.

  • The sort -u command sorts the output and removes any duplicate entries.

Requirements:

  • You must have jq installed on your system to process JSON output (sudo apt-get install jq on Debian/Ubuntu or brew install jq on MacOS).

  • Ensure curl is installed (usually pre-installed on most UNIX systems).

Usage: To use the script, save it to a file, make it executable with chmod +x scriptname.sh, and run it with a domain as an argument:

./scriptname.sh example.com

#!/bin/bash
# Check if a domain is provided
if [ -z "$1" ]; then
  echo "Usage: $0 <domain>"
  exit 1
fi
DOMAIN=$1
# Define the user agent curl will use when querying crt.sh
USER_AGENT="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3"
# Query crt.sh for subdomains & clean up the output
echo "Enumerating subdomains for $DOMAIN from crt.sh..."
curl -s -A "$USER_AGENT" "https://crt.sh/?q=%.$DOMAIN&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
echo "Enumeration complete."

6. Google Dorking

Google Dorking leverages advanced search operators in Google to unearth information that might not be readily available through regular searches. It's like using special keywords and commands to unlock hidden corners of the internet. The technique can be used for good: security professionals use it to find subdomains, as well as hidden files and directories.

This script scans a domain for potential subdomains. It checks common prefixes (www, mail, ftp, blog, shop) and uses a Google search (be mindful of Google's terms of service!). It avoids duplicate entries and prints the findings.

To run it:

  • Save the code as a file (e.g., scan.sh).
  • Open a terminal and navigate to the file location (use cd).
  • Make the script executable with chmod +x scan.sh.
  • Run it with ./scan.sh. Enter your target domain (e.g., example.com) when prompted.
#!/bin/bash
# Prompt user for target domain
read -p "Enter target domain (e.g., example.com): " target
# Temporary file for collected subdomains
tmpfile=$(mktemp)
# Loop through common subdomain prefixes (add more as needed)
for prefix in www mail ftp blog shop; do
  potential_subdomain="$prefix.$target"
  # Check if the subdomain resolves to an IP (basic validation)
  if host -t A "$potential_subdomain" &> /dev/null; then
    echo "$potential_subdomain" >> "$tmpfile"
  fi
done
# Use Google search for additional discovery (limited accuracy)
# Be aware of Google's terms of service; heavy automated querying will be blocked
search_url="https://www.google.com/search?q=site:$target"
user_agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
# Download search results silently and filter for hostnames on the target domain
curl -s -A "$user_agent" "$search_url" \
  | grep -Eo "(http|https)://[^/]+" \
  | grep -i "$target" \
  | cut -d '/' -f 3 >> "$tmpfile"
# Print discovered subdomains, sorted and de-duplicated
sort -u "$tmpfile"
# Remove the temporary file
rm -f "$tmpfile"

The Final Work!

This script incorporates functions for each enumeration technique: passive, active, brute-force, DNS zone transfer, and Google Dorking. It accepts the target domain and wordlist as inputs and then sequentially executes each enumeration technique.

Ensure that you have separate scripts for each technique (e.g., certificate_transparency_logs.sh, active_enumeration.sh, brute_force_enumeration.sh, dns_zone_transfer_check.sh, and google_dorking.sh), and they are executable and located in the same directory as the main script.

To use the script:

  • Save the main script and each individual enumeration script in separate files.
  • Make all the scripts executable (e.g., chmod +x script_name.sh).
  • Run the main script (./main_script.sh).
  • Follow the prompts to enter the target domain and wordlist path.
#!/bin/bash
# Function for passive enumeration
passive_enum() {
  targetDomain=$1
  echo "Performing passive enumeration for $targetDomain..."
  # Passive enumeration using certificate transparency logs
  ./certificate_transparency_logs.sh "$targetDomain"
  echo "Passive enumeration completed."
}
# Function for active enumeration
active_enum() {
  target=$1
  echo "Performing active enumeration for $target..."
  # Active enumeration using naabu and nmap
  ./active_enumeration.sh "$target"
  echo "Active enumeration completed."
}
# Function for brute-force enumeration
brute_force_enum() {
  target=$1
  wordlist=$2
  echo "Performing brute-force enumeration for $target using $wordlist..."
  # Brute-force enumeration using fierce
  ./brute_force_enumeration.sh "$target" "$wordlist"
  echo "Brute-force enumeration completed."
}
# Function for DNS zone transfer
dns_zone_transfer() {
  domain=$1
  dnsServer=$2
  echo "Performing DNS zone transfer check for $domain using DNS server $dnsServer..."
  # DNS zone transfer check
  ./dns_zone_transfer_check.sh "$domain" "$dnsServer"
  echo "DNS zone transfer check completed."
}
# Function for Google Dorking
google_dorking() {
  target=$1
  echo "Performing Google Dorking for $target..."
  # Google Dorking
  ./google_dorking.sh "$target"
  echo "Google Dorking completed."
}
# Main function
main() {
  # Accept target domain and wordlist as inputs
  read -p "Enter the target domain: " targetDomain
  read -p "Enter the wordlist path: " wordlist
  # Perform enumeration techniques
  passive_enum "$targetDomain"
  active_enum "$targetDomain"
  brute_force_enum "$targetDomain" "$wordlist"
  dns_zone_transfer "$targetDomain" "8.8.8.8" # Change the DNS server as needed
  google_dorking "$targetDomain"
}
# Execute main function
main

VHosts / Virtual Hosts

If you found an IP address hosting one or several web pages belonging to subdomains, you can try to find other subdomains served from that IP by looking in OSINT sources for domains resolving to the IP, or by brute-forcing VHost domain names against that IP.

OSINT

You can find some VHosts in IPs using HostHunter or other APIs.
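For instance, a hedged HostHunter run over a file of target IPs (invocation per its README; verify locally):

# Extract hostnames/VHosts for a list of IP addresses
python3 hosthunter.py targets.txt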

Brute Force

If you suspect that a subdomain may be hidden on a web server, you can try to brute force it:

ffuf -c -w /path/to/wordlist -u http://victim.com -H "Host: FUZZ.victim.com"

gobuster vhost -u https://mysite.com -t 50 -w subdomains.txt

wfuzz -c -w /usr/share/wordlists/SecLists/Discovery/DNS/subdomains-top1million-20000.txt --hc 400,404,403 -H "Host: FUZZ.example.com" -u http://example.com -t 100

# From https://github.com/allyshka/vhostbrute
vhostbrute.py --url="example.com" --remoteip="10.1.1.15" --base="www.example.com" --vhosts="vhosts_full.list"

# https://github.com/codingo/VHostScan
VHostScan -t example.com

Subdomain Discovery Workflow (using Trickest)

Check this blog post I wrote about how to automate subdomain discovery from a domain using Trickest workflows, so I don't need to manually launch a bunch of tools on my computer:

Monitoring

You can monitor whether new subdomains of a domain are created by watching the Certificate Transparency logs, as sublert does. To configure the entire workflow with notifications in Slack or Telegram, see this writeup.
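A hedged sketch of sublert's basic usage (per its README; the notification webhook is configured separately):

# Add a domain to the monitored list, then schedule sublert.py from cron
python sublert.py -u example.com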

Aquatone for screenshots

Aquatone was one of the more popular HTTP screenshot tools in previous years, originally created in Ruby and later ported to Go. Similar to gowitness, Aquatone's most notable feature is the ability to cluster similar assets based on their image differentials. Aquatone also attempts to fingerprint hosts' technology stacks using Wappalyzer. https://github.com/michenriksen/aquatone
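Usage is stdin-driven; a hedged example:

# Screenshot a list of live hosts and generate an HTML report
cat hosts.txt | aquatone -out aquatone_report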

There are also other tools: EyeWitness, HttpScreenshot, Shutter, gowitness, or webscreenshot (https://github.com/maaaaz/webscreenshot).

Public Cloud Assets

  • Identifying cloud assets can be done by looking at IP addresses and the certificates they return.
  • Filtering down potential targets to bounty targets is important.
  • Scan diffs and fingerprinting can be helpful for finding new targets and identifying specific services.

There are many tools available on GitHub for discovering the S3 buckets associated with a website (a minimal manual check is sketched after the list):

S3Scanner: https://github.com/sa7mon/S3Scanner

Mass3: https://github.com/smiegles/mass3

slurp: https://github.com/0xbharath/slurp

Lazy S3: https://github.com/nahamsec/lazys3

bucket_finder: https://github.com/mattweidner/bucket_finder

AWSBucketDump: https://github.com/netgusto/awsbucketdump

sandcastle: https://github.com/0xSearches/sandcastle

Dumpster Diver: https://github.com/securing/DumpsterDiver

S3 Bucket Finder: https://github.com/gwen001/s3-buckets-finder
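All of these automate roughly the same underlying check, which you can sketch by hand with curl (the bucket names below are hypothetical):

# Probe candidate bucket names: 404 = unclaimed, 403 = exists but private, 200 = listable
for name in halcorp halcorp-backups halcorp-assets; do
  code=$(curl -s -o /dev/null -w "%{http_code}" "https://$name.s3.amazonaws.com")
  echo "$name -> $code"
done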

Credential Leaks

The best tool on the market is, without a doubt, TruffleHog by Truffle Security.

Searching for company.com probably won't provide useful results: many companies release audited open-source projects that aren't likely to contain secrets. Less-used domains and subdomains are more interesting. This includes specific hosts like jira.company.com as well as more general second-level and lower-level domains. It's more efficient to find a pattern than a single domain: corp.somecompany.com, somecompany.net, or companycorp.com are more likely to appear only in an employee's configuration files.
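For example, hedged GitHub dork sketches following that pattern (the names are hypothetical):

"corp.somecompany.com" password
"jira.company.com" filename:config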

Leaks usually fall into one of these categories (ranked from most to least impactful):

  1. SaaS API keys - Companies rarely impose IP restrictions on APIs. AWS, Slack, Google, and other API keys are liquid gold. These are usually found in config files, bash history files, and scripts.

  2. Server/database credentials - These are usually behind a firewall, so they're less impactful. Usually found in config files, bash history files, and scripts.

  3. Customer/employee information - These hide in XLSX, CSV, and XML files and range from emails all the way to billing information and employee performance reviews.

  4. Data science scripts - SQL queries, R scripts, and Jupyter projects can reveal sensitive information. These repos also tend to have "test data" files hanging around.

  5. Hostnames/metadata - The most common result. Most companies don't consider this a vulnerability, but these results can help refine future searches.

Dorks can also be created to target specific API providers and their endpoints. This is especially useful for companies creating automated checks for their users' API keys. With knowledge of an API key's context and syntax, the search space can be significantly reduced.

With knowledge of the specific API provider, we can obtain all of the keys that match the API provider's regex and are in an API call context and then we can check them for validity using an internal database or an API endpoint.


A workflow for finding secrets for a single API provider

For example, suppose a company (HalCorp) provides an API for users to read and write to their account. By making our own HalCorp account, we discover that API keys are in the form [a-f]{4}-[a-f]{4}-[a-f]{4}.

# Python
import halapi
api = halapi.API()
api.authenticate_by_key('REDACTED')

# REST API with curl
curl -X POST -H "HALCorp-Key: REDACTED" https://api.halcorp.biz/userinfo

Armed with this information, we can compose our own GitHub dorks for HalCorp API responses:

# Python "authenticate_by_key" "halapi" language:python # REST API "HALCorp-Key"

With a tool like GitHound, we can use regex matching to find strings that match the API key's regex and output them to a file:

echo "HALCorp-Key" | git-hound --dig-files --dig-commits --many-results --regex-file halcorp-api-keys.txt --results-only > api_tokens.txt

Now that we have a file containing potential API tokens, we can check them for validity against a database or an API endpoint (do not do this if you don't have written permission from the API provider).

In the case of HalCorp, we can write a bash script that reads from stdin, checks the api.halcorp.biz/userinfo endpoint, and outputs the response.
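A hedged sketch of what checktoken.bash might look like (HalCorp and its endpoint are this article's hypothetical example; never test keys without written permission):

#!/bin/bash
# checktoken.bash - read candidate tokens from stdin and test each one
while read -r token; do
  # Print the API's response for each candidate key
  curl -s -X POST -H "HALCorp-Key: $token" "https://api.halcorp.biz/userinfo"
  echo
done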

cat api_tokens.txt | bash checktoken.bash

Full Recon Automatic Tools

There are several tools out there that will perform part of the proposed actions against a given scope.

References

All the free courses by @Jhaddix, like The Bug Hunter's Methodology v4.0 - Recon Edition

https://trufflesecurity.com/

https://tillsongalloway.com/finding-sensitive-information-on-github/index.html

https://book.hacktricks.xyz/