

DNSenum is a pentesting tool that enumerates as much DNS information about domains as possible.

The program currently performs the following operations:

  • Get the host's address (A record)
  • Get the nameservers (threaded)
  • Get the MX records (threaded)
  • Perform AXFR queries on nameservers (threaded)
  • Get extra names and subdomains via Google scraping (Google query = "allinurl: -www site:domain")
  • Brute force subdomains from a file; can also recurse on subdomains that have NS records (all threaded)
  • Calculate C-class domain network ranges and perform whois queries on them (threaded)
  • Perform reverse lookups on netranges (C-class and/or whois netranges) (threaded)
  • Write discovered IP blocks to the file domain_ips.txt
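The C-class range calculation mentioned above amounts to truncating a host address to its first three octets. A minimal sketch in shell (192.0.2.54 is a placeholder documentation address, not a real target):

```shell
# Derive the C-class (/24) network range for a host IP, as dnsenum does
# before running whois queries and reverse lookups on the range:
ip=192.0.2.54
cclass=$(echo "$ip" | cut -d. -f1-3)   # keep the first three octets
echo "${cclass}.0/24"                  # -> 192.0.2.0/24
```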



Install Perl

$ sudo apt-get install perl

Then install dependencies via CPAN:

$ sudo cpan
cpan[1]> install Getopt::Long
cpan[2]> install IO::File
cpan[3]> install Thread::Queue
cpan[4]> install Net::IP
cpan[5]> install Net::DNS
cpan[6]> install Net::Netmask
cpan[7]> install Net::Whois::IP
cpan[8]> install HTML::Parser
cpan[9]> install WWW::Mechanize


$ cd /data/src/
$ wget http://dnsenum.googlecode.com/files/dnsenum1.2.tar.gz
$ mkdir -p /pentest/enumeration/
$ tar xzvf dnsenum1.2.tar.gz -C /pentest/enumeration/
$ mv /pentest/enumeration/dnsenum1.2/ /pentest/enumeration/dnsenum/
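Assuming the paths used above, the script can then be made executable and its help printed to confirm the install:

```shell
# Make the script executable and check that it runs (it will print its
# usage summary if all Perl dependencies are in place):
cd /pentest/enumeration/dnsenum/
chmod +x dnsenum.pl
./dnsenum.pl -h
```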



dnsenum.pl [Options] <domain>
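For instance (example.com is a placeholder; only run dnsenum against domains you are authorized to test):

```shell
# Basic enumeration with default settings:
./dnsenum.pl example.com

# A fuller run: custom resolver, 5 threads, Google scraping and whois
# queries on the discovered C-class ranges:
./dnsenum.pl --dnsserver 8.8.8.8 --threads 5 -p 5 -s 20 -w example.com
```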


General options

--dnsserver <server>
Use this DNS server for A, NS and MX queries.
--enum
Shortcut option equivalent to --threads 5 -s 20 -w.
-h, --help
Print this help message.
--noreverse
Skip the reverse lookup operations.
--private
Show and save private IPs at the end of the file domain_ips.txt.
--subfile <file>
Write all valid subdomains to this file.
-t, --timeout <value>
The TCP and UDP timeout value in seconds (default: 10s).
--threads <value>
The number of threads that will perform different queries.
-v, --verbose
Be verbose: show all the progress and all the error messages.

Google scraping options

-p, --pages <value>
The number of Google search pages to process when scraping names (default: 20). The -s switch must also be specified.
-s, --scrap <value>
The maximum number of subdomains that will be scraped from Google.
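A hedged example combining both switches (example.com is a placeholder target):

```shell
# Scrape up to 50 subdomains from the first 10 Google result pages:
./dnsenum.pl -p 10 -s 50 example.com
```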

Brute force options

-f, --file <file>
Read subdomains from this file to perform brute force.
-u, --update <a|g|r|z>
Update the file specified with the -f switch with valid subdomains.
  • a: Update using all results.
  • g: Update using only google scraping results.
  • r: Update using only reverse lookup results.
  • z: Update using only zonetransfer results.
-r, --recursion
Recurse on subdomains: brute force all discovered subdomains that have an NS record.
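As an illustration, a minimal wordlist can be built by hand (the dnsenum tarball also ships its own list) and used with recursion, writing all confirmed names back to the file:

```shell
# A tiny hand-made wordlist; real runs would use a much larger list:
printf '%s\n' www mail ns1 ns2 ftp dev > subs.txt

# Brute force from the file, recurse on subdomains with NS records,
# and update subs.txt with all results (-u a):
./dnsenum.pl -f subs.txt -u a -r example.com
```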

Whois netrange options

-d, --delay <value>
The maximum number of seconds to wait between whois queries; the actual delay is chosen randomly
(default: 3s).
-w, --whois
Perform whois queries on C-class network ranges.
Warning: this can generate very large netranges, and reverse lookups on them will take a long time.
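For example, enabling whois lookups with up to a 10-second random delay between queries (example.com is a placeholder):

```shell
# Whois queries on the discovered C-class ranges, with a randomized
# pause of at most 10 seconds between queries:
./dnsenum.pl -w -d 10 example.com
```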

Reverse lookup options

-e, --exclude <regexp>
Exclude PTR records that match the regular expression from reverse lookup results; useful for filtering out invalid or generic hostnames.
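The effect of an exclude pattern can be previewed with grep -vE on sample names before handing it to dnsenum (the hostnames below are made up for illustration):

```shell
# Generic ISP rDNS names such as host-192-0-2-10.… are usually noise;
# this pattern drops them and keeps the interesting records:
printf '%s\n' mail.example.com \
              host-192-0-2-10.isp.example.net \
              www.example.com \
  | grep -vE 'host-[0-9-]+'   # keeps mail.example.com and www.example.com

# The equivalent dnsenum run:
./dnsenum.pl -e 'host-[0-9-]+' example.com
```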



