mrkidd85 Posted September 14, 2011

Is it possible to take all the pages from a website and make a spreadsheet out of the URLs and titles, instead of having to manually go through the site and do it?
Synook Posted September 14, 2011

Not with HTML alone, but it is perfectly possible to write a script in a programming language that crawls a website: use an HTML/SGML parser (or, less robustly, regular expressions) to extract the URLs of the links on each page, visit each linked page and grab its title in the same fashion, and repeat until there are no more links to process. Then compile it all into a CSV or similar file. Are you familiar with any programming languages?
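To make that concrete, here is a minimal sketch of the idea in Python, using only the standard library. The breadth-first crawl, the `max_pages` limit, and the `pages.csv` filename are illustrative choices, not anything prescribed in the thread; a real crawler would also want politeness delays and robots.txt handling.

```python
import csv
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the <title> text and every href from <a> tags on one page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def crawl(start_url, max_pages=100):
    """Breadth-first crawl of one site; returns a list of (url, title) pairs."""
    domain = urlparse(start_url).netloc
    queue, seen, rows = [start_url], {start_url}, []
    while queue and len(rows) < max_pages:
        url = queue.pop(0)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        parser = PageParser()
        parser.feed(html)
        rows.append((url, parser.title.strip()))
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same site and never queue a page twice
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return rows


def write_csv(rows, path="pages.csv"):
    """Write the (url, title) pairs to a CSV that opens in any spreadsheet."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Title"])
        writer.writerows(rows)
```

Usage would be something like `write_csv(crawl("http://example.com/"))`, which produces a two-column spreadsheet of every reachable same-site URL and its page title.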
mrkidd85 Posted September 14, 2011 (Author)

Afraid not mate, I'm just a humble SEO.
Archived
This topic is now archived and is closed to further replies.