I have many links of the form

http://example.com/file001.xhtml http://example.com/file002.xhtml 

and so on.

How would you render them in a browser (for example, in Chrome) and save each one to PDF?
Printing each page by hand through the "Print" dialog is not an option, since there are quite a few links. It seems like a very primitive task, but I can't figure out the best way to handle it.

    1 answer

    Is it important for you to get them as PDFs, or do you just need the data from them for further processing?

    If the latter, then Linux has the wget utility, which can download sites. If you write a short script and give it a list of URLs, it will download them all for you:

     wget -r -k -l 7 -p -E -nc $1 

    and pass it a file with the list of URLs.
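A minimal sketch of such a script: it reads a list file line by line and runs the wget command from the answer on each URL. The list file name (`urls.txt`) is only an example.

```shell
#!/bin/sh
# download_all FILE: mirror every URL listed in FILE (one per line)
# with the wget flags suggested in the answer.
download_all() {
    while IFS= read -r url; do
        # -r recurse, -k rewrite links for local viewing, -l 7 limit
        # recursion depth, -p fetch page requisites (CSS, images),
        # -E save pages with an .html extension, -nc skip files
        # that already exist
        wget -r -k -l 7 -p -E -nc "$url"
    done < "$1"
}

# usage (hypothetical list file):
# download_all urls.txt
```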

    If PDF is what matters, then send the already downloaded index.html files to print.

    • Well, downloading via wget is clear. But I need to convert the result to pdf ... - Nikolay
    • Well, send the index.html files to print on a virtual printer. What's wrong with that approach? - Dejsving
    • So, merge everything into one file and then print it on a virtual printer? Hmm, not a bad idea, thanks, I'll try it. - Nikolay
    • You could also print each file separately and then join the resulting PDFs - that might look better - Dejsving
    • There are more than a hundred of them; printing each one by hand through the menu is unrealistic. I just don't know how to do it automatically. That's the whole question ... - Nikolay
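Since the question mentions Chrome, the "print each file, then join the PDFs" route from the comments can be automated with headless Chrome's `--print-to-pdf` flag and `pdfunite` from poppler-utils. This is only a sketch: it assumes `google-chrome` and `pdfunite` are installed (both command names are configurable below, since the binary may be called `chromium` or similar on your system).

```shell
#!/bin/sh
# Commands are parameters so they can be swapped for chromium,
# chromium-browser, etc.
CHROME=${CHROME:-google-chrome}
PDFUNITE=${PDFUNITE:-pdfunite}

# convert_to_pdf DIR: render every .html file in DIR to a PDF with
# headless Chrome, then merge all PDFs into DIR/combined.pdf.
convert_to_pdf() {
    for f in "$1"/*.html; do
        # --headless renders without opening a window;
        # --print-to-pdf writes the rendered page to the given file
        "$CHROME" --headless --disable-gpu \
            --print-to-pdf="${f%.html}.pdf" "$f"
    done
    "$PDFUNITE" "$1"/*.pdf "$1"/combined.pdf
}

# usage (hypothetical directory produced by wget):
# convert_to_pdf ./example.com
```

One advantage of merging after printing, as suggested in the comments, is that each page keeps its own layout instead of being reflowed into one long HTML file.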