Website admins need to validate all the URLs on their site from time to time. As it is nearly impossible to do this manually even for a medium-sized website, specialized tools come in handy. Here are a few such tools which do the job well. Each of them crawls the links of a website and generates a report on every link.
- webcheck
Linux only. webcheck seemed to give better results in my tests, but it doesn't support as many report formats as the next tool. The report is generated in a file named webcheck.dat, which can be opened in any text editor; ccze can improve its readability when searching for link errors, as shown below. webcheck consumes considerable resources and bandwidth. Man page. Common usage:
$ webcheck http://www.yoursite.com/
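Once the crawl finishes, the webcheck.dat report mentioned above can be skimmed from the terminal. A minimal sketch, assuming the report sits in the current directory (the grep pattern is only an illustration, adjust it to the errors you care about):
$ ccze -A < webcheck.dat | less -R      # colorized view of the report
$ grep -iE 'error|404' webcheck.dat     # quick filter for obvious failures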
- LinkChecker
Linux only. LinkChecker can output the report in many formats. However, some results were not accurate: a few links it flagged as broken were actually healthy. It also has a nice GUI. Man page. Common usage:
$ linkchecker http://www.yoursite.com/ -F html/linkcheck.html
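A couple of other invocations that may be useful; these assume a reasonably recent LinkChecker, and the GUI command ships as a separate package on some distributions, so check the man page for your version:
$ linkchecker --check-extern http://www.yoursite.com/ -F csv   # also test external links, write a CSV report
$ linkchecker-gui                                              # launch the GUI front end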
- Dead Link Checker
Probably the best on the list. An online service. Shows the anchor text of linked URLs, which makes it easier to trace broken links.
- W3C Link Checker
Hosted by the W3C. Web-based and easy to use. It doesn't consume your own bandwidth to validate the links.
- Xenu (suggested by Jester Raiin)
For Windows; works on Linux under Wine. Lightweight, no-frills GUI. Free but not open source.
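A rough sketch of getting it going under Wine; the installer and executable names here are assumptions based on a typical Xenu download, so adjust them to what you actually get:
$ wine xenusetup.exe                                    # run the Windows installer
$ wine ~/.wine/drive_c/Program\ Files/Xenu/Xenu.exe     # launch the installed GUI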