
For example, I have an Ubuntu Server 20.04 machine with a 200 GB disk, and 150 GB of that storage is used.

I want to back up the data on this system to a remote system.

  • Is there a way to push all the data to S3 asynchronously, without writing a local copy first? (See the sketch after this list.)
  • If I want to back up a database and files that live inside a Docker container, what should I do?
  • I already have a backup system, but as in this example, when local storage is not enough the process fails partway through, because it first tries to save the backup files to the local file system.
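
A minimal sketch of what streaming straight to S3 could look like, assuming the AWS CLI is installed and credentials are configured; the bucket name, paths, container name and database names below are placeholders, not values from the question:

    # Archive the data and pipe it to S3 without staging a copy on the local disk.
    tar -czf - /srv/user-media \
      | aws s3 cp - s3://my-backup-bucket/$(hostname)/media-$(date +%F).tar.gz

    # Dump a PostgreSQL database running inside a Docker container and stream
    # the compressed dump to S3 the same way.
    docker exec my-postgres pg_dump -U postgres mydb \
      | gzip \
      | aws s3 cp - s3://my-backup-bucket/$(hostname)/mydb-$(date +%F).sql.gz

Both commands read from a pipe, so the only extra local space needed is for buffering, not for a full copy of the data.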

What options do I have for backing up these systems to a remote system? I have more than 100 servers like this; some of them have around 50 GB used out of 200 GB, and some have 1.3 TB used out of 1.6 TB. Because the servers vary this much, I am asking here.
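
One approach that avoids local staging entirely is to rsync the data over SSH to whichever remote system holds the backups. A rough sketch, with the host name, user and paths as placeholders:

    # Copy the data directly to a remote backup host over SSH; nothing extra is
    # written to the local disk. -a preserves permissions and ownership, -H keeps
    # hard links, -z compresses over the wire, --partial lets an interrupted
    # transfer resume.
    rsync -aHz --partial /srv/user-media/ backup@backup-host:/backups/$(hostname)/media/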

  • I'd try to free up space first: docker system prune --all --force --volumes, apt-get clean, etc., then worry about syncing files.
    – alexus
    Aug 3 at 14:07
  • @alexus My issue is that I can't clean up any space. Those files are our users' media files, so there is no way I can delete them.
    – Yusuf gndz
    Aug 3 at 14:48
  • First you determine how much total space you need for a backup of all your machines. Then you create a system which has that amount of total space free. Then you back up all files to that system. How hard can it be? I also don't see why local free space should be a limiting factor when doing a remote backup.
    – paladin
    Aug 3 at 14:52
  • Hm, you could use Duplicacy to push from inside, or Veeam can do the job from outside.
    – djdomi
    Aug 3 at 15:26
  • 1
    @paladin many backup techniques rely on making a local copy first, compressing in place, and then moving the compressed files because they use gzip. @Yusuf gndz a simple option is to mount an external drive with NFS and then rsync to it.
    – tsc_chazz
    Aug 3 at 15:28
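
Following up on tsc_chazz's NFS suggestion, a rough sketch of the mount-then-rsync approach; the server name, export path and mount point are placeholders:

    # Mount a share exported by the backup server, then rsync into it; the data
    # goes straight to the remote file system rather than to the local disk.
    sudo mkdir -p /mnt/backup
    sudo mount -t nfs backup-host:/exports/backups /mnt/backup
    rsync -aH --progress /srv/user-media/ /mnt/backup/$(hostname)/media/
    sudo umount /mnt/backup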
