I have the following method:

    public boolean saveZip(File zipFile) {
        Log.d(LOG_TAG, zipFile.getAbsolutePath());
        boolean result = false;
        if (!zipFile.exists()) zipFile.getParentFile().mkdirs();
        try {
            ZipOutputStream zipOutputStream = new ZipOutputStream(new BufferedOutputStream(new FileOutputStream(zipFile)));
            JsonWriter writer = new JsonWriter(new OutputStreamWriter(zipOutputStream, "UTF-8"));

            // Write the DB_TABLE_NAME table into the archive
            Log.d(LOG_TAG, "WRITING: " + DB_TABLE_NAME + ".json");
            Cursor cursor = getAllData(); // main data table
            {
                ZipEntry entry = new ZipEntry(DB_TABLE_NAME + ".json");
                zipOutputStream.putNextEntry(entry);
                writer.beginArray();
                while (cursor.moveToNext()) {
                    writer.beginObject();
                    writer.name(_ID)
                            .value(cursor.getLong(cursor.getColumnIndex(_ID)));
                    writer.name(COLUMN_PARENT)
                            .value(cursor.getLong(cursor.getColumnIndex(COLUMN_PARENT)));
                    writer.name(COLUMN_THING_NAME)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_THING_NAME)));
                    writer.name(COLUMN_DESCRIPTION)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_DESCRIPTION)));
                    writer.name(COLUMN_DATE)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_DATE)));
                    writer.name(COLUMN_ISBOX)
                            .value(cursor.getInt(cursor.getColumnIndex(COLUMN_ISBOX)));
                    writer.endObject();
                }
                writer.endArray();
                cursor.close();
                writer.close();
                zipOutputStream.closeEntry();
            }

            // Write the DB_TABLE_NAME_FOTOS table into the archive
            Log.d(LOG_TAG, "WRITING: " + DB_TABLE_NAME_FOTOS + ".json");
            cursor = getAllDataNameFotos();
            List<String> fotos = new ArrayList<>();
            {
                ZipEntry entry = new ZipEntry(DB_TABLE_NAME_FOTOS + ".json");
                zipOutputStream.putNextEntry(entry);
                writer.beginArray();
                while (cursor.moveToNext()) {
                    writer.beginObject();
                    writer.name(_ID)
                            .value(cursor.getLong(cursor.getColumnIndex(_ID)));
                    writer.name(COLUMN_ID_THING)
                            .value(cursor.getLong(cursor.getColumnIndex(COLUMN_ID_THING)));
                    writer.name(COLUMN_NAME_FOTO)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_NAME_FOTO)));
                    fotos.add(cursor.getString(cursor.getColumnIndex(COLUMN_NAME_FOTO)));
                    writer.name(COLUMN_DESCRIPTION_FOTO)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_DESCRIPTION_FOTO)));
                    writer.name(COLUMN_DATE)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_DATE)));
                    writer.endObject();
                }
                cursor.close();
                writer.endArray();
                zipOutputStream.closeEntry();
            }

            // Save the photos into the archive
            {
                Log.d(LOG_TAG, "Saving photos into the archive");
                for (String fotoName : fotos) {
                    File fileFoto = new File(fotoDir, fotoName);
                    if (fileFoto.isFile()) {
                        ZipEntry entry = new ZipEntry(fileFoto.getName());
                        zipOutputStream.putNextEntry(entry);
                        byte[] buffer = new byte[1024];
                        int length;
                        InputStream inputStream = new FileInputStream(fileFoto);
                        while ((length = inputStream.read(buffer)) > -1) {
                            zipOutputStream.write(buffer, 0, length);
                        }
                        inputStream.close();
                        zipOutputStream.closeEntry();
                    }
                }
            }

            // Write the DB_TABLE_NAME_ATTR table
            Log.d(LOG_TAG, "WRITING: " + DB_TABLE_NAME_ATTR + ".json");
            cursor = getAllDataNameAttr();
            {
                ZipEntry entry = new ZipEntry(DB_TABLE_NAME_ATTR + ".json");
                zipOutputStream.putNextEntry(entry);
                writer.beginArray();
                while (cursor.moveToNext()) {
                    writer.beginObject();
                    writer.name(_ID)
                            .value(cursor.getLong(cursor.getColumnIndex(_ID)));
                    writer.name(COLUMN_NAME_ATTR)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_NAME_ATTR)));
                    writer.endObject();
                }
                writer.endArray();
                cursor.close();
                zipOutputStream.closeEntry();
            }

            // Write the DB_TABLE_NAME_LIST_ATTR table
            cursor = getAllDataNameListAttr();
            {
                ZipEntry entry = new ZipEntry(DB_TABLE_NAME_LIST_ATTR + ".json");
                zipOutputStream.putNextEntry(entry);
                writer.beginArray();
                while (cursor.moveToNext()) {
                    writer.beginObject();
                    writer.name(_ID)
                            .value(cursor.getLong(cursor.getColumnIndex(_ID)));
                    writer.name(COLUMN_ID_THING)
                            .value(cursor.getLong(cursor.getColumnIndex(COLUMN_ID_THING)));
                    writer.name(COLUMN_ID_ATTR)
                            .value(cursor.getLong(cursor.getColumnIndex(COLUMN_ID_ATTR)));
                    writer.name(COLUMN_DESCRIPTION_ATTR)
                            .value(cursor.getString(cursor.getColumnIndex(COLUMN_DESCRIPTION_ATTR)));
                    writer.endObject();
                }
                writer.endArray();
                cursor.close();
                zipOutputStream.closeEntry();
            }

            zipOutputStream.close();
        } catch (IOException e) {
            Log.d(LOG_TAG, e.toString());
            return result;
        }
        result = true;
        return result;
    }

The problem is that writer.close() also closes zipOutputStream completely, so the subsequent zipOutputStream.closeEntry() call no longer works. If I remove writer.close(), the method runs without errors, but the files inside the archive end up empty.
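As I understand it, this comes from the way the streams are wrapped; shortened, the fragment above boils down to this:

    JsonWriter writer = new JsonWriter(new OutputStreamWriter(zipOutputStream, "UTF-8"));
    // ... entries are written ...
    writer.close();               // also flushes and closes OutputStreamWriter, which closes zipOutputStream
    zipOutputStream.closeEntry(); // too late, the stream is already closed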

  • It may make sense to do writer.flush() before each zipEntry, and writer.close() just before zipOutputStream.close(). – a.chugunov

1 answer

As a result, I found two ways to solve the problem:

1) Create a new writer object for each new JSON file, call writer.flush() after finishing each file (as suggested in the comments), and never call writer.close(); see the sketch below.
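A minimal sketch of that pattern, using the first table from the question as an example (the row-writing part is abbreviated; the rest is assumed to stay as in the original method):

    zipOutputStream.putNextEntry(new ZipEntry(DB_TABLE_NAME + ".json"));
    // A separate writer per entry; it is only flushed, never closed
    JsonWriter writer = new JsonWriter(new OutputStreamWriter(zipOutputStream, "UTF-8"));
    writer.beginArray();
    // ... write the table rows exactly as in the original method ...
    writer.endArray();
    writer.flush();               // push the buffered JSON into the current zip entry
    zipOutputStream.closeEntry(); // the zip stream itself stays open for the next entry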

2) Use my own class derived from ZipOutputStream. Here is the class I ended up with:

    private class MyZipOutputStream extends ZipOutputStream {

        public MyZipOutputStream(OutputStream os) {
            super(os);
        }

        @Override
        public void close() throws IOException {
            // Called when the wrapping writer is closed: only finish the current entry
            super.closeEntry();
        }

        public void closeclose() throws IOException {
            // Really closes the whole zip file
            super.close();
        }
    }

With this class, calling close() on the object only closes the current entry, so when the writer is closed, zipOutputStream itself is not fully closed. I added a separate closeclose() method that performs the actual, complete closing of the zip file.
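For example, with this class the write loop from the question could look roughly like this (a sketch only; entry names come from the original method, the rest is assumed):

    MyZipOutputStream zipOutputStream =
            new MyZipOutputStream(new BufferedOutputStream(new FileOutputStream(zipFile)));

    zipOutputStream.putNextEntry(new ZipEntry(DB_TABLE_NAME + ".json"));
    // A new JsonWriter is needed for every entry, because close() ends the writer chain
    JsonWriter writer = new JsonWriter(new OutputStreamWriter(zipOutputStream, "UTF-8"));
    writer.beginArray();
    // ... write the table rows as in the original method ...
    writer.endArray();
    writer.close();               // flushes, then delegates to closeEntry(); the zip stream stays open

    // ... the remaining tables and the photo files are added the same way ...

    zipOutputStream.closeclose(); // actually closes the zip file at the very end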