Merge pull request #487 from six2dez/dev
Hotfixes for amass, dns zone transfer and url filtering
six2dez authored Mar 29, 2022
2 parents ca999a2 + 3d89bd8 commit 395adf6
Showing 2 changed files with 29 additions and 23 deletions.
25 changes: 18 additions & 7 deletions README.md
@@ -8,8 +8,8 @@


<p align="center">
<a href="https://github.com/six2dez/reconftw/releases/tag/v2.2.1">
<img src="https://img.shields.io/badge/release-v2.2.1-green">
<a href="https://github.com/six2dez/reconftw/releases/tag/v2.2.1.1">
<img src="https://img.shields.io/badge/release-v2.2.1.1-green">
</a>
</a>
<a href="https://www.gnu.org/licenses/gpl-3.0.en.html">
@@ -69,7 +69,10 @@ So, what are you waiting for Go! Go! Go! :boom:
- [Main commands:](#main-commands)
- [How to contribute:](#how-to-contribute)
- [Need help? :information_source:](#need-help-information_source)
- [You can support this work buying me a coffee:](#you-can-support-this-work-buying-me-a-coffee)
- [Support this project](#support-this-project)
- [Buymeacoffee](#buymeacoffee)
- [DigitalOcean referral link](#digitalocean-referral-link)
- [GitHub sponsorship](#github-sponsorship)
- [Sponsors ❤️](#sponsors-️)
- [Thanks :pray:](#thanks-pray)
- [Disclaimer](#disclaimer)
@@ -450,6 +453,7 @@ reset='\033[0m'
- Metadata finder ([MetaFinder](https://github.com/Josue87/MetaFinder))
- Google Dorks ([degoogle_hunter](https://github.com/six2dez/degoogle_hunter))
- Github Dorks ([GitDorker](https://github.com/obheda12/GitDorker))

## Subdomains
- Passive ([amass](https://github.com/OWASP/Amass), [waybackurls](https://github.com/tomnomnom/waybackurls), [github-subdomains](https://github.com/gwen001/github-subdomains), [gau](https://github.com/lc/gau))
- Certificate transparency ([ctfr](https://github.com/UnaPibaGeek/ctfr))
@@ -544,16 +548,23 @@ If you want to contribute to this project you can do it in multiple ways:
- Check [FAQ](https://github.com/six2dez/reconftw/wiki/7.-FAQs) for commonly asked questions.
- Ask for help in the [Telegram group](https://t.me/joinchat/TO_R8NYFhhbmI5co)

## You can support this work buying me a coffee:

## Support this project

### Buymeacoffee
[<img src="https://cdn.buymeacoffee.com/buttons/v2/default-green.png">](https://www.buymeacoffee.com/six2dez)

### DigitalOcean referral link
<a href="https://www.digitalocean.com/?refcode=f362a6e193a1&utm_campaign=Referral_Invite&utm_medium=Referral_Program&utm_source=badge"><img src="https://web-platforms.sfo2.cdn.digitaloceanspaces.com/WWW/Badge%201.svg" alt="DigitalOcean Referral Badge" /></a>

### GitHub sponsorship
[Sponsor](https://github.com/sponsors/six2dez)

# Sponsors ❤️
**This section shows the current financial sponsors of this project**



[<img src="https://pbs.twimg.com/profile_images/1360304248534282240/MomOFi40_400x400.jpg" width="100" height=auto>](https://github.com/0xtavian)
[<img src="https://pbs.twimg.com/profile_images/1296513249702285312/fpHFDhyc_400x400.jpg" width="100" height=auto>](https://github.com/reconmap)
[<img src="https://pbs.twimg.com/profile_images/1221701173864017922/Wg_Q7HoD_400x400.jpg" width="100" height=auto>](https://github.com/r1p)

# Thanks :pray:
* Thank you for lending a helping hand towards the development of the project!
27 changes: 11 additions & 16 deletions reconftw.sh
@@ -340,10 +340,11 @@ function sub_passive(){
start_subfunc ${FUNCNAME[0]} "Running : Passive Subdomain Enumeration"
if [ ! "$AXIOM" = true ]; then
amass enum -passive -d $domain -config $AMASS_CONFIG -json .tmp/amass_json.json 2>>"$LOGFILE" &>/dev/null
[ -s ".tmp/amass_json.json" ] && cat .tmp/amass_json.json | jq -r '.name' | anew -q .tmp/amass_psub.txt
else
axiom-scan $list -m amass -passive -json -o .tmp/amass_json.json $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
axiom-scan $list -m amass -passive -o .tmp/amass_axiom.txt $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
[ -s ".tmp/amass_axiom.txt" ] && cat .tmp/amass_axiom.txt | anew -q .tmp/amass_psub.txt
fi
[ -s ".tmp/amass_json.json" ] && cat .tmp/amass_json.json | jq -r '.name' | anew -q .tmp/amass_psub.txt
if [ -s "${GITHUB_TOKENS}" ]; then
if [ "$DEEP" = true ]; then
github-subdomains -d $domain -t $GITHUB_TOKENS -o .tmp/github_subdomains_psub.txt 2>>"$LOGFILE" &>/dev/null
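
Note on the amass hunk above: the jq post-processing now lives inside the non-Axiom branch, and the Axiom branch collects plain-text output instead of JSON. A minimal sketch of the resulting flow, with options trimmed and variables assumed to be set as in the script (not the exact invocation used by reconftw):

```bash
# Sketch only: simplified from the hunk above.
if [ "$AXIOM" != true ]; then
    # Local run: amass writes JSON lines; pull the hostname field out with jq.
    amass enum -passive -d "$domain" -json .tmp/amass_json.json
    jq -r '.name' .tmp/amass_json.json | anew -q .tmp/amass_psub.txt
else
    # Axiom run: the amass module emits plain hostnames, one per line.
    axiom-scan "$list" -m amass -passive -o .tmp/amass_axiom.txt
    anew -q .tmp/amass_psub.txt < .tmp/amass_axiom.txt
fi
```
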
@@ -584,19 +585,13 @@ function sub_recursive(){
start_subfunc ${FUNCNAME[0]} "Running : Subdomains recursive search"
# Passive recursive
if [ "$SUB_RECURSIVE_PASSIVE" = true ]; then
[ -s "subdomains/subdomains.txt" ] && ( cat subdomains/subdomains.txt | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat subdomains/subdomains.txt | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2 > .tmp/subdomains_recurs_amass.txt
if [ ! "$AXIOM" = true ]; then
for sub in $( ( cat subdomains/subdomains.txt | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat subdomains/subdomains.txt | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2);do
amass enum -passive -d $sub -config $AMASS_CONFIG 2>>"$LOGFILE" | anew -q .tmp/passive_recursive.txt
done
[ -s ".tmp/subdomains_recurs_amass.txt" ] && amass enum -passive -df .tmp/subdomains_recurs_amass.txt -config $AMASS_CONFIG 2>>"$LOGFILE" | anew -q .tmp/passive_recursive.txt
[ -s ".tmp/passive_recursive.txt" ] && puredns resolve .tmp/passive_recursive.txt -w .tmp/passive_recurs_tmp.txt -r $resolvers --resolvers-trusted $resolvers_trusted -l $PUREDNS_PUBLIC_LIMIT --rate-limit-trusted $PUREDNS_TRUSTED_LIMIT --wildcard-tests $PUREDNS_WILDCARDTEST_LIMIT --wildcard-batch $PUREDNS_WILDCARDBATCH_LIMIT 2>>"$LOGFILE" &>/dev/null
else
for sub in $( ( cat subdomains/subdomains.txt | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat subdomains/subdomains.txt | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2);do
echo $sub | anew -q .tmp/sub_pass_recur_target.com
done
if [ -s ".tmp/sub_pass_recur_target.com" ]; then
axiom-scan .tmp/sub_pass_recur_target.com -m amass -passive -o .tmp/amass_prec.txt $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
fi
find .tmp -type f -iname "*_prec.txt" -exec cat {} + | anew -q .tmp/passive_recursive.txt
[ -s ".tmp/subdomains_recurs_amass.txt" ] && axiom-scan .tmp/subdomains_recurs_amass.txt -m amass -passive -o .tmp/amass_prec.txt $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
[ -s ".tmp/amass_prec.txt" ] && cat .tmp/amass_prec.txt | anew -q .tmp/passive_recursive.txt
[ -s ".tmp/passive_recursive.txt" ] && axiom-scan .tmp/passive_recursive.txt -m puredns-resolve -r /home/op/lists/resolvers.txt -o .tmp/passive_recurs_tmp.txt $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
fi
fi
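
The recursive-search hunk collapses the duplicated per-subdomain loops: the list of most common parent names is now built once into .tmp/subdomains_recurs_amass.txt and passed to a single amass run via -df (or one axiom-scan call). A sketch of the candidate-list pipeline, using only commands already present in the hunk; the four-label variant is appended the same way:

```bash
# Sketch: build the recursive-enumeration candidate list once.
# rev|cut|rev keeps the last three labels of each name; uniq -c / sort -nr /
# grep -v '1 ' / head keeps the ten most frequent parents seen more than once.
cat subdomains/subdomains.txt \
  | rev | cut -d '.' -f 3,2,1 | rev \
  | sort | uniq -c | sort -nr \
  | grep -v '1 ' | head -n 10 \
  | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2 > .tmp/subdomains_recurs_amass.txt

# One amass run over the whole list, instead of one process per candidate.
amass enum -passive -df .tmp/subdomains_recurs_amass.txt -config "$AMASS_CONFIG" \
  | anew -q .tmp/passive_recursive.txt
```
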
@@ -678,8 +673,8 @@ function subtakeover(){
function zonetransfer(){
if { [ ! -f "$called_fn_dir/.${FUNCNAME[0]}" ] || [ "$DIFF" = true ]; } && [ "$ZONETRANSFER" = true ] && ! [[ $domain =~ ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9] ]]; then
start_func ${FUNCNAME[0]} "Zone transfer check"
dig axfr $domain @8.8.8.8 > subdomains/zonetransfer.txt
if [ -s ".tmp/zone_transfer.txt" ]; then
for ns in $(dig +short ns "$domain"); do dig axfr "$domain" @"$ns" >> subdomains/zonetransfer.txt; done
if [ -s "subdomains/zonetransfer.txt" ]; then
if ! grep -q "Transfer failed" subdomains/zonetransfer.txt ; then notification "Zone transfer found on ${domain}!" info; fi
fi
end_func "Results are saved in $domain/subdomains/zonetransfer.txt" ${FUNCNAME[0]}
@@ -1144,7 +1139,7 @@ function urlchecks(){
fi
fi
[ -s ".tmp/gospider.txt" ] && sed -i '/^.\{2048\}./d' .tmp/gospider.txt
[ -s ".tmp/gospider.txt" ] && cat .tmp/gospider.txt | grep -aEo 'https?://[^ ]+' | sed 's/]$//' | grep "$domain" | anew -q .tmp/url_extract_tmp.txt
[ -s ".tmp/gospider.txt" ] && cat .tmp/gospider.txt | grep -aEo 'https?://[^ ]+' | sed 's/]$//' | grep -E "^(http|https):[\/]{2}([a-zA-Z0-9\-\.]+\.$domain)" | anew -q .tmp/url_extract_tmp.txt
else
axiom-scan .tmp/webs_all.txt -m waybackurls -o .tmp/url_extract_way_tmp.txt $AXIOM_EXTRA_ARGS 2>>"$LOGFILE" &>/dev/null
[ -s ".tmp/url_extract_way_tmp.txt" ] && cat .tmp/url_extract_way_tmp.txt | anew -q .tmp/url_extract_tmp.txt
@@ -1160,7 +1155,7 @@
[[ -d .tmp/gospider/ ]] && find .tmp/gospider -type f -exec cat {} + | sed '/^.\{2048\}./d' | anew -q .tmp/gospider.txt
fi
[[ -d .tmp/gospider/ ]] && NUMFILES=$(find .tmp/gospider/ -type f | wc -l)
[[ $NUMFILES -gt 0 ]] && cat .tmp/gospider.txt | grep -aEo 'https?://[^ ]+' | sed 's/]$//' | grep ".$domain" | anew -q .tmp/url_extract_tmp.txt
[[ $NUMFILES -gt 0 ]] && cat .tmp/gospider.txt | grep -aEo 'https?://[^ ]+' | sed 's/]$//' | grep -E "^(http|https):[\/]{2}([a-zA-Z0-9\-\.]+\.$domain)" | anew -q .tmp/url_extract_tmp.txt
fi
if [ -s "${GITHUB_TOKENS}" ]; then
github-endpoints -q -k -d $domain -t ${GITHUB_TOKENS} -o .tmp/github-endpoints.txt 2>>"$LOGFILE" &>/dev/null
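
Both urlchecks hunks replace the loose `grep "$domain"` with an anchored extended regex, so a URL is kept only when its host ends in a subdomain of the target, not whenever the domain string appears anywhere in the line (for example inside a redirect parameter). A quick illustration of the difference; example.com and the sample URLs are made up:

```bash
domain="example.com"
printf '%s\n' \
  "https://app.example.com/login" \
  "https://evil.tld/?next=https://example.com" \
  | grep -E "^(http|https):[\/]{2}([a-zA-Z0-9\-\.]+\.$domain)"
# Only https://app.example.com/login survives; the old unanchored grep
# would have kept both lines.
```

Note that the pattern requires at least one label before .$domain, so a bare apex URL such as https://example.com/ would not match it either.
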
