I'm writing a small downloader in Qt: it splits a file into parts and downloads them, writing filename_part_1, filename_part_2, and so on to disk.

What is the fastest and least resource-intensive way to merge them back together?

    1 answer

    Here is a sketch of a simple method that combines the part files into one result file:

    bool mergeParts(const QString& resultFileName, const QStringList& partsFileNames) {
        // Pre-compute the total size so the result file can be pre-allocated.
        qint64 totalSize = 0;
        QStringListIterator it(partsFileNames);
        while (it.hasNext()) {
            totalSize += QFile(it.next()).size();
        }

        QFile result(resultFileName);
        if (!result.open(QFile::WriteOnly)) {
            return false;
        }
        result.resize(totalSize);

        // Append each part to the result in list order.
        it.toFront();
        while (it.hasNext()) {
            QFile part(it.next());
            if (!part.open(QFile::ReadOnly)) {
                return false;
            }
            result.write(part.readAll());
            result.flush();
        }
        return true;
    }

    What's left for you to do:

    • ensure the parts are passed in the correct order;
    • check that there is enough free space on the disk you write the result to;
    • handle errors when opening files;
    • make sure the number of bytes read from each part file equals the number of bytes written to the result file;
    • ...
    • PROFIT
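
    The disk-space check from the list above can be sketched like this. Standard C++17 std::filesystem is used here so the sketch is self-contained; in Qt, QStorageInfo(dir).bytesAvailable() gives the same information. The names enoughSpace and result.bin are illustrative, not part of the original code:

    ```cpp
    #include <cstdint>
    #include <filesystem>
    #include <iostream>

    // Returns true if the volume holding `resultPath` reports at least
    // `needed` bytes available.
    bool enoughSpace(const std::filesystem::path& resultPath, std::uintmax_t needed) {
        std::error_code ec;
        // space() needs an existing directory, so query the parent (or cwd).
        const auto dir = resultPath.has_parent_path() ? resultPath.parent_path()
                                                      : std::filesystem::current_path();
        const auto info = std::filesystem::space(dir, ec);
        if (ec) return false;             // filesystem query failed
        return info.available >= needed;
    }

    int main() {
        // Example: is there room for two 500 MB parts?
        const std::uintmax_t needed = 2ull * 500 * 1024 * 1024;
        std::cout << (enoughSpace("result.bin", needed) ? "enough" : "not enough") << '\n';
    }
    ```

    Checking before the merge starts avoids discovering a full disk halfway through writing a multi-gigabyte result file.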

    Check whether you're satisfied with the resulting speed. Perhaps it's worth avoiding the repeated QFile constructor calls?

    • I don't have any problem writing a method that merges files; I'm looking for the most performant one. At this stage you load everything into memory (take.ms/52M68), which I expect is not good... - avengerweb
    • "Everything" here is just the full contents of one part at a time. If you expect the parts to be very large, nothing stops you from reading into a fixed-size buffer instead. You could even read/write byte by byte. - aleks.andr
    • The parts are 500 MB each. The question is which way is correct, not how to read them. - avengerweb
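
    The fixed-size-buffer approach suggested in the comments can be sketched as below. Standard C++ streams are used so the sketch is self-contained and testable; in Qt the same loop works with QFile::read(char*, qint64) and QFile::write(const char*, qint64). The 1 MiB buffer size is an arbitrary choice worth profiling:

    ```cpp
    #include <fstream>
    #include <iterator>
    #include <string>
    #include <vector>

    // Merge part files into one result file using a fixed-size buffer,
    // so memory use stays constant no matter how large the parts are.
    bool mergeParts(const std::string& resultName,
                    const std::vector<std::string>& partNames) {
        std::ofstream result(resultName, std::ios::binary);
        if (!result) return false;

        std::vector<char> buf(1 << 20);   // 1 MiB; tune after profiling
        for (const auto& name : partNames) {
            std::ifstream part(name, std::ios::binary);
            if (!part) return false;      // a part is missing or unreadable
            while (part) {
                part.read(buf.data(), static_cast<std::streamsize>(buf.size()));
                const std::streamsize got = part.gcount();
                // write() failing here means bytes written != bytes read
                if (got > 0 && !result.write(buf.data(), got)) return false;
            }
        }
        return result.good();
    }

    int main() {
        // Small demo: create two parts, merge, verify the result.
        { std::ofstream("p1.bin", std::ios::binary) << "hello "; }
        { std::ofstream("p2.bin", std::ios::binary) << "world"; }
        const bool ok = mergeParts("merged.bin", {"p1.bin", "p2.bin"});
        std::ifstream in("merged.bin", std::ios::binary);
        const std::string content((std::istreambuf_iterator<char>(in)),
                                  std::istreambuf_iterator<char>());
        return (ok && content == "hello world") ? 0 : 1;
    }
    ```

    With 500 MB parts this keeps peak memory at the buffer size instead of 500 MB per readAll() call, at essentially the same throughput, since disk I/O dominates either way.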