Thread:Benboy755/@comment-30992579-20170118230848/@comment-5238714-20170119000104

My knowledge in this field is extremely limited; however, if you want to generate a list of URLs for every page in a topic, you can do that quite easily with the following site:

http://textmechanic.com/text-tools/numeration-tools/generate-list-numbers/

It's pretty straightforward to use, so I don't think I need to explain it in depth; you should be able to get what you need from it quite easily.
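If you'd rather not rely on that site, the same thing can be done with a few lines of code. This is just a sketch: the URL pattern below is a made-up example, so you'd substitute the real topic URL and page-number format.

```python
# Generate one URL per page number, like the numeration tool does.
# The base URL here is a hypothetical placeholder, not a real pattern.
def generate_page_urls(base, first, last):
    """Return a list of URLs with page numbers first..last inclusive."""
    return [f"{base}{n}" for n in range(first, last + 1)]

urls = generate_page_urls("https://example.com/topic?page=", 1, 5)
print("\n".join(urls))
```

You can then paste the resulting list into whatever tool you're using to fetch the pages.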

I don't know anything about making a script to extract the exact data you want without you having to do anything manually, but I'm sure you can find someone who can help you with that.

Also, I'd like a bit more information about what program you're using and how it works. For the past few months, I've been looking into making a script to extract data from all 2,435,697 old MB profile pages (going through that many pages manually is impossible). It seems unlikely that I'll get a script made in time to extract the data I'm looking for, so I'm considering a mass archival of all MB profile pages instead. Could your program handle archiving a few million profile pages?
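For what it's worth, the archival side doesn't need anything fancy. Here's a rough sketch of the idea: fetch each page and save its HTML to disk, pausing between requests so the server isn't hammered. The URL pattern and file naming are assumptions on my part, and a real run over millions of pages would also need resumability and retry logic.

```python
import time
import urllib.request
from pathlib import Path

def archive_pages(urls, out_dir, delay=1.0):
    """Download each URL and save its raw bytes under out_dir.

    This is a hedged sketch, not the real archiver: file names are
    just sequential, and failures are only printed, not retried.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, url in enumerate(urls):
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                (out / f"page_{i:07d}.html").write_bytes(resp.read())
        except OSError as err:
            print(f"failed: {url}: {err}")
        time.sleep(delay)  # be polite: one request per `delay` seconds
```

At one request per second, a few million pages would take over a month of continuous running, which is why I'm curious whether your program does anything smarter.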