Very Large Duplicity Cache

by Bahadir Tasdemir   Last Updated June 26, 2018 15:00

I am currently using a duplicity script to back up my CentOS server (110G used of 2T) to a 2T SFTP server.

After four days, duplicity has backed up just 90G. That is not the problem. The main problem is that duplicity has generated nearly 600G of cache at "/user/.cache/duplicity". This size is not normal, so what should I do? Will duplicity shrink or remove these cache files and folders when it finishes the task? Will duplicity back up its own cache too (I did not exclude the /user folder)?

Additional info: I am on a Hetzner server, and this is the backup script I am using: https://wiki.hetzner.de/index.php/Duplicity_Script/en

In addition, I excluded only the directories proc, sys, and dev from the root (and backed up everything else starting from the root, because I wanted a full server backup).
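For clarity, the setup described above can be sketched as a single duplicity invocation. This is a hypothetical sketch, not the actual Hetzner script; the SFTP target is a placeholder, and GPG/passphrase options are omitted:

```shell
#!/bin/sh
# Hypothetical sketch of a full-server backup matching the question's setup.
# SFTP_TARGET is a placeholder, not the asker's real destination.
SFTP_TARGET="sftp://user@backup.example.com/backup"

# Exclude the pseudo-filesystems mentioned in the question and back up
# everything else starting from the root.
CMD="duplicity --exclude /proc --exclude /sys --exclude /dev / $SFTP_TARGET"

# Print the command instead of running it, so the sketch is safe to execute.
echo "$CMD"
```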



Answers 1


According to the mailing list:

you will have to manually exclude it ..

it holds your backup chains' index files (the table of contents of the backup repository). Caching them locally accelerates operations like status, incremental backup, and others. These operations need to know what is already backed up in order to work. If the index files are cached, they do not need to be transferred and decrypted again and again.

.. ede
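Following ede's advice, the cache directory can simply be added to the exclude list. A sketch, assuming the cache path reported in the question (on many systems it is ~/.cache/duplicity instead) and a placeholder target:

```shell
#!/bin/sh
# Sketch: exclude duplicity's own cache so the backup does not archive it.
# CACHE_DIR is the path from the question; adjust it for your system.
CACHE_DIR="/user/.cache/duplicity"
SFTP_TARGET="sftp://user@backup.example.com/backup"

CMD="duplicity --exclude $CACHE_DIR --exclude /proc --exclude /sys --exclude /dev / $SFTP_TARGET"

# Print rather than run, so the sketch can be reviewed first.
echo "$CMD"
```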

As for the cache growth itself, it seems to be a long-standing bug.

On the Debian bug tracker, they recommend running:

duplicity cleanup --extra-clean --force ....

Warning: the suggested --extra-clean option is dangerous and can bite you very hard: it can make backups unrestorable by the usual means.
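A less risky starting point is to run cleanup without --extra-clean, which removes only extraneous files from the remote repository. This is a sketch with a placeholder target, printed rather than executed:

```shell
#!/bin/sh
# Sketch: plain cleanup deletes extraneous files from the target repository
# without the aggressive local/remote signature removal of --extra-clean.
# TARGET is a placeholder; substitute your real backup URL.
TARGET="sftp://user@backup.example.com/backup"

echo "duplicity cleanup --force $TARGET"
```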

Marki
April 06, 2016 22:54
