I thought I'd post this in web development...
Is there a way to find out all the links that a website has? What I mean is: is there, for example, a tool that scans a website and produces a .txt file with a tree-like diagram of all its links to other websites, and so on?
Let me know if this doesn't make sense; I just don't know the official name for what I want to do.
Hierarchy of links
- floodhound2
- sternbildchen
I think the English name for it is "enumeration".
There are a lot of tools that crawl webpages.

#Free:
1) Burp Spider (just collects links)
http://portswigger.net/

#Not-Free:
2) Black Widow (collects links, downloads files, makes the page usable offline)
http://www.softbytelabs.com/us/bw/index.html
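If you'd rather roll your own than use one of those tools, the link-collecting part can be sketched in a few lines of Python using only the standard library. This is just a minimal sketch (the example URLs and HTML below are made up); a real crawler would also fetch each discovered page with urllib.request and recurse to build the tree:

```python
# Minimal link-extraction sketch using only the Python standard library.
# extract_links() pulls href targets out of one HTML page; a full crawler
# would fetch each extracted URL and feed its HTML back in, recursively.
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkParser(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of every <a> tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkParser(base_url)
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    page = '<a href="/about">About</a> <a href="http://example.org/">Ext</a>'
    print(extract_links(page, "http://example.com/"))
```

Writing the results out indented by crawl depth would give you roughly the tree-like .txt file the original post asks about.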