I have a problem! I have a backup script in Python that backs up every folder in /var/www/ into a separate .tar.7z archive per folder.
The problem is that compression is very slow, and for 4 GB folders it sometimes stops compressing at 1 GB, sometimes at 1.5 GB.
This is the line that does the compression:
os.system("tar cf - -C %s . 2>/dev/null 3>/dev/null | 7za a -p%s -si %s 1>/dev/null 2>/dev/null 3>/dev/null" % (cf, self.config.get(jn, "archpass"), filename))
When I tar the 4 GB folder directly with tar -cf compress-dir.tar /var/www/bigsite.com/
it creates the .tar extremely quickly; it is ready in a few minutes.
However, within the Python script the temporary file that is created as soon as the .tar starts to be generated grows very slowly. After about 10 minutes it reaches about 1 GB... and soon it stops growing, without showing any error in the console.
Is there a way I can reproduce exactly what is happening here: tar cf - -C %s . 2>/dev/null 3>/dev/null
directly in bash?
Because it is clearly not the same as tar -cf compress-dir.tar /var/www/bigsite.com/,
which runs much faster.
Maybe if I run the tar directly in bash, an error will appear. Of course, if you have any other ideas, please let me know.
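For reference, here is a minimal sketch of running the tar half of that pipeline by hand, with stderr left visible so any errors actually reach the console. The /tmp/wwwtest directory is a hypothetical stand-in for /var/www/bigsite.com/:

```shell
# Hypothetical test data standing in for /var/www/bigsite.com/
mkdir -p /tmp/wwwtest
echo "hello" > /tmp/wwwtest/index.html

# Same invocation the script builds, but WITHOUT the 2>/dev/null
# redirection, so tar's error messages stay visible on the console:
tar cf - -C /tmp/wwwtest . > /tmp/site.tar

# Verify the archive was written and lists the expected file:
tar tf /tmp/site.tar
```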
You could speed things up by using gzip via tar's z option. Gzip is not as efficient as 7zip in terms of compressed size, but it may be a little faster. As for the 7zip problem, I would suspect an old or defective version of 7zip, since 7zip should have no problems with large files. You also don't have to stick with 7zip; other compressors such as bzip2 can be used instead. And with certain parameters you can trade compression efficiency for speed to accelerate the backup process.
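As a sketch of those alternatives, tar can compress directly with gzip (z) or bzip2 (j); note neither of these encrypts, so they only apply if the archive password is optional. (For 7zip itself, a lower compression level such as `7za a -mx=1` is the kind of parameter that trades ratio for speed.) The paths below are hypothetical test data:

```shell
# Hypothetical source directory standing in for one /var/www/ site
mkdir -p /tmp/wwwtest2
echo "hello" > /tmp/wwwtest2/index.html

# gzip: fast, moderate compression (tar's z option)
tar czf /tmp/site.tar.gz -C /tmp/wwwtest2 .

# bzip2: slower than gzip, usually smaller output (tar's j option)
tar cjf /tmp/site.tar.bz2 -C /tmp/wwwtest2 .

# integrity check of the gzip archive
gzip -t /tmp/site.tar.gz
```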
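Separately, since os.system with everything redirected to /dev/null is exactly why no error reaches the console, one way to make failures visible is to build the pipeline with subprocess and check each exit code. This is only a sketch under assumed paths; gzip stands in for 7za (which would be wired up the same way, e.g. ["7za", "a", "-p" + password, "-si", archive]):

```python
import os
import subprocess

# Hypothetical stand-ins; in the real script these come from cf,
# self.config.get(jn, "archpass"), and filename.
src_dir = "/tmp/wwwtest3"
archive = "/tmp/site.tar.gz"

os.makedirs(src_dir, exist_ok=True)
with open(os.path.join(src_dir, "index.html"), "w") as f:
    f.write("hello\n")

# First stage: tar writes the archive stream to stdout.
tar = subprocess.Popen(
    ["tar", "cf", "-", "-C", src_dir, "."],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

# Second stage: compressor reads tar's stdout, writes to the archive file.
with open(archive, "wb") as out:
    gz = subprocess.Popen(
        ["gzip", "-c"],
        stdin=tar.stdout,
        stdout=out,
        stderr=subprocess.PIPE,
    )

tar.stdout.close()  # let tar receive SIGPIPE if the compressor exits early
_, gz_err = gz.communicate()
tar_err = tar.stderr.read()
tar.wait()

# Unlike os.system(... 2>/dev/null), failures are now visible:
print("tar exit:", tar.returncode, tar_err.decode())
print("gzip exit:", gz.returncode, gz_err.decode())
```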