From aldeid
Directory listing


A web application or web site often contains directories and subdirectories. Some of them are linked directly from the application's pages, enabling easy navigation; these files and directories are "public". Others must be kept secret because they are only meant for a restricted category of users (e.g. administrators); these are "private". If no specific protection is applied to these directories, an attacker can discover them and browse them directly through the URL. This discovery process is known as "forced browsing" or directory brute-forcing, and an unprotected directory that exposes its contents is said to allow "directory listing".
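The discovery process described above can be sketched in a few lines of Python: request a series of candidate paths and report every one that does not return 404. The base URL and wordlist below are illustrative placeholders, and this is a minimal sketch of what tools like dirsearch or gobuster do at scale, not a replacement for them.

```python
from urllib import request, error

def is_interesting(status):
    """Treat any status other than 404 as worth reporting:
    200 = accessible, 301/302 = redirect, 403 = exists but forbidden."""
    return status != 404

def probe(base_url, words):
    """Request each candidate directory and collect those that do not 404."""
    found = []
    for word in words:
        url = f"{base_url.rstrip('/')}/{word}/"
        try:
            with request.urlopen(url, timeout=5) as resp:
                status = resp.status
        except error.HTTPError as e:
            status = e.code  # 403, 401, etc. still reveal the directory exists
        except error.URLError:
            continue  # host unreachable; skip this candidate
        if is_interesting(status):
            found.append((url, status))
    return found

# Hypothetical usage:
# probe("http://example.com", ["admin", "backup", "private"])
```

Note that a 403 response is just as interesting as a 200: it confirms the directory exists even though it is protected.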


The robots.txt file controls which pages search engines may index. Be careful not to list hidden directories in it: the file is publicly readable, so it would hand an attacker a map of exactly what you want to hide. Prefer server-side protection such as .htaccess instead.
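The contrast can be illustrated with two short fragments. The directory name below is a hypothetical example; the .htaccess directives are standard Apache basic-authentication configuration.

```
# robots.txt anti-pattern: this line ADVERTISES the hidden directory
# to anyone who fetches /robots.txt
User-agent: *
Disallow: /secret-admin/

# Better: leave the directory out of robots.txt entirely and protect it
# with an .htaccess file placed inside the directory:
#   AuthType Basic
#   AuthName "Restricted area"
#   AuthUserFile /path/to/.htpasswd
#   Require valid-user
```

With the .htaccess approach, the directory is never advertised, and even a brute-forcer that guesses the name only gets a 401/403 response.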


Some web sites provide a file named sitemap.xml, listing all the pages that should be indexed by robots. Be equally careful about what you include in it: like robots.txt, it is publicly readable.
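A safe sitemap lists only public pages. The URLs below are hypothetical; the schema namespace is the standard one from sitemaps.org.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only public pages; never include /admin/ or other private paths -->
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/contact</loc></url>
</urlset>
```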




  • dirsearch (recommended)
  • DirBuster is a brute-forcer that automates the discovery of hidden directories.
  • Wikto automates the discovery of hidden directories, based on a database of default locations, brute-force methods, and the Google Hacking Database.
  • nmap with the http-enum NSE script
  • gobuster
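Typical invocations of these tools look like the following. The target URL and wordlist path are placeholders; adapt them to your environment, and only scan systems you are authorized to test.

```shell
# dirsearch: scan for common paths, probing a few extensions
python3 dirsearch.py -u http://example.com -e php,html

# gobuster in directory mode with a wordlist
gobuster dir -u http://example.com -w /path/to/wordlist.txt

# nmap's http-enum script against port 80
nmap --script http-enum -p 80 example.com
```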

