getting a list of URLs from a website

Use lynx -dump -listonly -nonumbers URL | grep -e barfoo > foobar.txt, where URL is the web page's address, barfoo is a string to filter the URLs by (for example a file extension), and foobar.txt is the file the resulting list of URLs is written to.
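As a sketch, here is what the pipeline looks like with hypothetical values filled in (the URL, pattern, and output filename are placeholders, not from the original note). The grep stage is also demonstrated on sample dump-style output, since that part runs without lynx or a network connection:

```shell
# Full pipeline (requires lynx; URL and pattern are placeholders):
#   lynx -dump -listonly -nonumbers "https://example.com/" | grep -e '\.pdf' > foobar.txt
#
# The grep filter alone, on sample lynx -dump -listonly -nonumbers style output:
printf '%s\n' \
  'https://example.com/report.pdf' \
  'https://example.com/index.html' \
  'https://example.com/data.pdf' \
  | grep -e '\.pdf'
# prints only the two .pdf URLs
```

-dump writes the rendered page to standard output, -listonly restricts that output to the page's links, and -nonumbers drops the numeric prefixes lynx would otherwise put before each link, leaving bare URLs that grep can filter.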


No rights reserved: CC0 1.0.

prior work

I learned of Lynx's -dump and -listonly options from an answer on Stack Exchange by michas.