# usage: open Burp, navigate to proxy history, ctrl-a to select all records,
#        right click and "Save Items" as an .xml file, then:
#   python burplist.py burprequests.xml
# output is saved to wordlist.txt
import xml.etree.ElementTree as ET
import urllib
import base64
import math
import sys
import re
@gwen001
gwen001 / wordgrab.sh
Last active March 11, 2022 03:01
Create a wordlist from the target itself
# using cewl: -u sets the user agent, -d 0 disables spidering, -m 3 is the minimum
# word length; the final grep drops CeWL's author banner
wordgrab() {
  url=$1
  cewl.rb -u "Mozilla/5.0 (X11; Linux; rv:74.0) Gecko/20100101 Firefox/74.0" -d 0 -m 3 "https://www.$url" | tr '[:upper:]' '[:lower:]' | sort -fu | grep -v "robin wood"
}
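Example usage (assumes cewl.rb is on PATH; the function prepends https://www. to the argument):

wordgrab example.com > words.txt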
# added min length 3
wordgrab() {
  url=$1
  tmpfile="$(date "+%s")"
@gwen001
gwen001 / ejs.sh
Last active July 7, 2024 07:33
One-liner to extract endpoints from the JS files of a given host
# plain curl + grep version:
curl -L -k -s https://www.example.com | tac | sed "s#\\\/#\/#g" | egrep -o "src['\"]?\s*[=:]\s*['\"]?[^'\"]+.js[^'\"> ]*" | awk -F '//' '{if(length($2))print "https://"$2}' | sort -fu | xargs -I '%' sh -c "curl -k -s \"%\" | sed \"s/[;}\)>]/\n/g\" | grep -Po \"(['\\\"](https?:)?[/]{1,2}[^'\\\"> ]{5,})|(\.(get|post|ajax|load)\s*\(\s*['\\\"](https?:)?[/]{1,2}[^'\\\"> ]{5,})\"" | awk -F "['\"]" '{print $2}' | sort -fu
# using linkfinder
function ejs() {
  URL=$1;
  curl -Lks $URL | tac | sed "s#\\\/#\/#g" | egrep -o "src['\"]?\s*[=:]\s*['\"]?[^'\"]+.js[^'\"> ]*" | sed -r "s/^src['\"]?[=:]['\"]//g" | awk -v url=$URL '{if(length($1)) if($1 ~/^http/) print $1; else if($1 ~/^\/\//) print "https:"$1; else print url"/"$1}' | sort -fu | xargs -I '%' sh -c "echo \"\n##### %\";wget --no-check-certificate --quiet \"%\"; basename \"%\" | xargs -I \"#\" sh -c 'linkfinder.py -o cli -i #'"
}
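Example usage (assumes linkfinder.py and wget are available):

ejs https://www.example.com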
# with file download (the newest version):
# note: there is a bug if you don't provide a root URL
@screetsec
screetsec / gist:6ee948503960f1b9d4b7b8465aea2d73
Last active May 25, 2023 16:16
One-liner to get hidden URL parameters from passive scanning of the Web Archive. The regex uses a DFA engine; it collects URLs with multiple parameters for fuzzing and removes duplicates.
curl -s "http://web.archive.org/cdx/search/cdx?url=*.bugcrowd.com/*&output=text&fl=original&collapse=urlkey" | grep -P "=" | sed "/\b\(jpg\|png\|js\|svg\|css\|gif\|jpeg\|woff\|woff2\)\b/d" > Output.txt ; for i in $(cat Output.txt);do URL="${i}"; LIST=(${URL//[=&]/=FUZZ&}); echo ${LIST} | awk -F'=' -vOFS='=' '{$NF="FUZZ"}1;' >> Passive_Collecting_URLParamter.txt ; done ; rm Output.txt ; sort -u Passive_Collecting_URLParamter.txt > Passive_Collecting_URLParamter_Uniq.txt
@hackerscrolls
hackerscrolls / extensions_temp_backup.txt
Created April 11, 2020 15:11
Common temp and backup extensions for files and directories by twitter.com/hackerscrolls
.0
.1
.2
.3
.tar
.tgz
.zip
.tar.gz
.rar
.cache
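A minimal sketch of putting the list to use (assumes the full list is saved as extensions_temp_backup.txt; the target host and path are hypothetical):

# try each temp/backup extension against a known file (hypothetical host and path)
while read -r ext; do
  code=$(curl -k -s -o /dev/null -w "%{http_code}" "https://www.example.com/index.php$ext")
  [ "$code" = "200" ] && echo "found: index.php$ext"
done < extensions_temp_backup.txt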
@si9int
si9int / c99-nl.py
Last active April 20, 2024 21:28
Automates https://subdomainfinder.c99.nl | Usage: python3 c99-nl.py <domain.com> | Requirements: pip3 install bs4
#!/usr/bin/env python3
# v.0.3 | twitter.com/si9int
import requests, sys
from bs4 import BeautifulSoup as bs
domain = sys.argv[1]
subdomains = []
def get_csrf_params():
csrf_params = {}
@nikallass
nikallass / check-smb-v3.11.sh
Created March 11, 2020 04:57
CVE-2020-0796. Scan HOST/CIDR with nmap script smb-protocols.nse and grep SMB version 3.11.
#!/bin/bash
if [ $# -eq 0 ]; then
  echo $'Usage:\n\tcheck-smb-v3.11.sh TARGET_IP_or_CIDR'
  exit 1
fi
echo "Checking if there's SMB v3.11 in $1 ..."
# note: 'replace' ships with MySQL/MariaDB; if it is not installed,
# sed 's/Nmap scan report for/@/' does the same job
nmap -p445 --script smb-protocols -Pn -n $1 | grep -P '\d+\.\d+\.\d+\.\d+|^\|.\s+3.11' | tr '\n' ' ' | replace 'Nmap scan report for' '@' | tr "@" "\n" | grep 3.11 | tr '|' ' ' | tr '_' ' ' | grep -oP '\d+\.\d+\.\d+\.\d+'
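Example run against a /24 (nmap required):

./check-smb-v3.11.sh 192.168.1.0/24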
@dwisiswant0
dwisiswant0 / st8out.sh
Last active December 18, 2025 10:06
St8out - Extra one-liner for reconnaissance
#!/bin/bash
#####
#
# St8out - Extra one-liner for reconnaissance
#
# Usage: ./st8out.sh target.com
#
# Resources:
# - https://github.com/j3ssie/metabigor
@jhaddix
jhaddix / Github bash generated search links (from hunter.sh)
Created January 12, 2020 19:55
Github bash generated search links (from hunter.sh)
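The gist body isn't shown in this preview; as a rough bash sketch of the idea (the keyword list here is illustrative, not hunter.sh's actual set):

# emit clickable GitHub code-search URLs pairing the target with common secret keywords
target="example.com"
for dork in password api_key apikey secret token ftp ldap pass; do
  echo "https://github.com/search?q=%22${target}%22+${dork}&type=Code"
done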
@gwen001
gwen001 / oparam.sh
Created December 5, 2019 14:15
One-liner to extract params from a URL
echo "https://www.example.com/?aaa=bbb&ccc=ddd" | tr '?' '&' | awk -F '&' '{for(i=2;i<=NF;i++){split($i,t,"=");print t[1]}}'
while read u; do echo $u | tr '?' '&' | awk -F '&' '{for(i=2;i<=NF;i++){split($i,t,"=");print t[1]}}'; done < plainurls.txt | sort -fu
From wayback json file:
cat waybackurls.json|jq -r '.[]'|grep 'http'|cut -d '"' -f 2 | while read u; do echo $u | tr '?' '&' | awk -F '&' '{for(i=2;i<=NF;i++){split($i,t,"=");print t[1]}}'; done | sort -fu
function oparam {
echo $1 | tr '?' '&' | awk -F '&' '{for(i=2;i<=NF;i++){split($i,t,"=");print t[1]}}'
}
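Example usage (prints one parameter name per line):

oparam "https://www.example.com/?aaa=bbb&ccc=ddd"
# prints:
# aaa
# ccc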