
What backup software?


Guest asrequested


CBers

What do you mean by housekeep?

Delete stuff that you don't want, or need.


CharleyVarrick

Ah, yes of course. But I don't want my backup to delete the copy automatically, if that's why you're asking.

Scenario: I'm not home, or sleeping, or drunk, or all of the above (this has happened already).

Suddenly, drive X at home goes bye-bye. Automatic backup kicks in and says: Cool! I don't see drive X, I'll just delete "copy of Drive X" then.

Hence I want exclusive control over what gets deleted on the backup.


CBers

Agreed, which is why you need to fully understand how the backup solution works from end to end and how it reacts when problems arise :)


CharleyVarrick

Regarding all the software I listed earlier (page 3), I'm finding dealbreakers left, right, and center.

It's MUCH harder than I thought to find a great backup client that's compatible with Amazon Drive.

 

As much as Amazon's unlimited cloud seemed like a good idea, it's just storage space and still requires a backup brain with all the right features for my specific needs.

 

The main contenders for online backup with unlimited storage are:

 

1) BackBlaze

2) Carbonite

3) CrashPlan (but its slowness all but eliminates it)   

 

EDIT: Found other unlimited online backup services:

4) OpenDrive

5) LiveDrive

6) Jungle Disk

7) Jottacloud

 

 

I'll go through this comparison and read all the fine print. :blink:

http://www.pcmag.com/article2/0,2817,2288745,00.asp


CharleyVarrick

Backblaze deletes after 30 days but sends a warning beforehand.

Thanks for the heads up


aptalca

Duplicati  is also a relatively new app for backups. Pretty versatile. You can send encrypted backups to other computers or to a number of cloud storage providers.

 

Some people I know switched to it from Crashplan. I'm currently testing it.

 

Here's our docker image for it if you want to try it out in a sand-boxed environment: https://hub.docker.com/r/linuxserver/duplicati/


CharleyVarrick

I tried Duplicati for real in the last few days; the speed I was getting was pretty bad.

Whether the bottleneck was their software or Amazon Cloud, I'll never know...

 

Depending on physical location, one person might get decent speed while another doesn't.

 

Things it has going for it: it's free and open source, and the web interface is modern.

I was getting significantly better upload speed to Amazon with ARQ5, but their Windows GUI felt dated and clunky, plus it's $50.


casminkey

Good thread!

 

I saw someone mention Backblaze with a Windows 2012 server... I don't think Backblaze lets you run it on a server OS, unless they changed that recently. I signed up for Backblaze about 2 years ago and tried installing it on Windows 2012, and it told me it wouldn't install on a server OS - and I got the same message when I opened up a support ticket. So I had to install Windows 10.

 

Other than that, I'm very happy with Backblaze. I mainly wanted something to back up family pictures and videos. They are replicated to a backup drive, but I worried about losing them if the house caught fire... or alien abduction, as previously mentioned. I eventually added my Emby library to the mix and let it run.

 

I did think it was odd that it took me like 8 months to back up my library (about 3.5 TB at the time) with a 25 Mbps upload connection running 24/7... seems awfully long. But now that I'm backed up, it stays in sync and I don't even notice it anymore. It just works.


CharleyVarrick

I did think it was odd that it took me like 8 months to back up my library (about 3.5 TB at the time) with a 25 Mbps upload connection running 24/7... seems awfully long. But now that I'm backed up, it stays in sync and I don't even notice it anymore. It just works.

Initial backup upload time is a universal issue, it seems; some fare a little better than others. Even a gigabit connection won't do any magic when the server throttles the connection on their end (needs of the many...).

 

In a perfect world, the data center would be located nearby and you could bring your hard drive(s) in to be copied onto theirs.


CharleyVarrick

I am testing a backup service right now; they report 6 Mbps upload, with a 67-day ETA for a 500 GB backup.

It is frustrating to watch the Task Manager network usage graph hovering around 0.6%.

I then sent a 1 GB file from my main PC to the server, and network usage jumped to 67% (but only for the 15 seconds the file operation lasted).

 

EDIT: Sorry, my bad, the 67-day ETA was for 4.5 TB of data from a 6 TB drive; everything else stands.

Without any throttling, it would take 5 or 6 days.
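For what it's worth, the quoted ETA does line up with a sustained ~6 Mbps upload. A quick back-of-the-envelope check in Python, using only the numbers above:

```python
# Rough ETA check: how long does 4.5 TB take at a sustained 6 Mbps upload?
data_bits = 4.5e12 * 8          # 4.5 TB (decimal) expressed in bits
rate_bps = 6e6                  # 6 Mbps sustained upload
seconds = data_bits / rate_bps
print(f"{seconds / 86400:.0f} days")   # ~69 days, in the same ballpark as the 67-day ETA
```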


MSattler

So, a couple of things: I can still usually get through 200-600 GB of upload a day using ARQ5 and Amazon Cloud Drive. Now, keep in mind that when you are scanning through 6, 10, or 50 TB of data, it has to scan through those files and check the blocks to ensure they are backed up. So if you restart the services often, it's going to take even longer.

 

One thing I have noticed is that CPU speed, memory, and disk I/O appear to have a direct effect on how fast you can upload. Case in point: I was running my backup VM on an ESXi server with 3 x 3TB drives set up in RAID 5. Since moving the VM to a 4 x 2TB RAID 10 volume, with 2 RAID 1 SSDs used for read and write cache on a QNAP TS-1635, I can tell you my backup uploads have definitely improved.

 

The other thing: I can almost guarantee that some of the storage pods Amazon uses will be slower than others depending on load. I'm guessing that you get routed to the storage pod closest to you, to keep latency down.


MSattler

There is stuff on my server that is personal and irreplaceable, and that is in the cloud. As for most of the other media I have watched, if I lose it, I will try to get it back when I remember or when I need it; 8 years of retention on Usenet and 6 years of retention on my favorite P2P trackers will do the rest to restore what I want back. The other media that I have yet to watch can't be all that important.

 

In some of the forums where I hang out, people have lost HDDs and friends in those forums posted some of the content, password protected, on Usenet, so either way I think I am covered for catastrophic failures.

 

 

You are forgetting the first rule of Usenet......


MSattler

Seeing as the ARQ5/Amazon combo doesn't cut it right now, I went to the Backblaze website, and they have a speed test that provides an ETA.

My ISP plan being 120/20, I was expecting similar results: download is similar (and irrelevant), but upload barely reaches 5 or 6 Mbps, with 80 ms latency. That would give me about 65 GB/day. With maybe 27 TB to do, I'm looking at at least 18 months.

I then went to speedtest.net for a second opinion and got 130 down and 22 up (my ISP does give about 10% headroom). That leads me to believe that the backup servers I'm uploading to are too far away and/or throttling.

 

EDIT: I stopped (if I can call it that) ARQ5/Amazon, then set up CrashPlan and started a trial backup (just one 6 TB drive with about 4.5 TB of stuff on it) to their cloud: I'm getting 3 Mbps with a 73-day ETA. About as bad as years ago, and back then I had a 30/10 plan. I'll let it run overnight and look at the ETA tomorrow morning.

 

What good are unlimited backup and a decent internet plan when data trickles into their data centers like this? :unsure:

 

Look at my most recent post. What are the specs on the box doing the backup? The disk speeds of the Unraid servers don't matter so much. I am willing to bet you are using single 5400/7200 RPM disks for your backup server. Out of curiosity, do you have a machine with an SSD you can run it on?

 

ARQ reads the blocks and then encrypts them; I am betting it writes them to disk rather than keeping the encrypted blocks in memory. Then ARQ reads the blocks back and uploads them as bandwidth becomes available. Now, if your writes and reads are slow, or CPU/memory is tight for the encryption step... I'm guessing it would slow things down.
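To make that guess concrete, here is a minimal Python sketch of block staging (purely illustrative, not ARQ's actual implementation; the 4 MiB block size and the SHA-256 digest standing in for per-block processing are assumptions). Staging every block to disk adds an extra write plus a later read per block, which is exactly where a slow single spinner would hurt compared with keeping blocks in memory:

```python
# Hypothetical block-based staging pipeline: split a file into blocks and
# stage each block either to disk or in memory before upload. Encryption is
# omitted; the SHA-256 digest just stands in for per-block processing.
import hashlib
import os

BLOCK_SIZE = 4 * 1024 * 1024  # assumed 4 MiB blocks

def stage_blocks(path, stage_dir=None):
    """Return a list of (digest, location). If stage_dir is given, each block
    is written to disk (extra I/O per block); otherwise it stays in memory."""
    staged = []
    with open(path, "rb") as f:
        index = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()  # identifies the block
            if stage_dir:
                block_path = os.path.join(stage_dir, f"{index}.blk")
                with open(block_path, "wb") as out:
                    out.write(block)          # disk-backed staging
                staged.append((digest, block_path))
            else:
                staged.append((digest, block))  # in-memory staging
            index += 1
    return staged
```

Timing the same large file through both paths (with and without a stage_dir) would roughly show how much the extra disk round-trip costs on a given box.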


MSattler

Oh, sorry, I haven't had my second cup of joe yet. :P

That's a very good question you're asking me, which I have given little thought to yet.

 

I would venture it would be a mix of full and latest:

Media files could be just the latest version, as they're (usually) stable, at least the main file. The .nfo files seem to be updated all the time (judging by my own experience with Emby).

I'm not sure I need 15 .nfo versions in the backup, not that they take up much space.

 

Not sure what this feature is called, but if I delete (voluntarily or not) the original file, I don't want the backup copy to be automatically deleted as well.

Otherwise, what's the point in backup, right?

 

Other file types could/should be full (versioning).

 

 

I think it depends on what you are using. ARQ5, I believe, does do versioning, so if you had multiple copies of the movie while backups were running, it would back them both up.


CharleyVarrick

I think it depends on what you are using. ARQ5, I believe, does do versioning, so if you had multiple copies of the movie while backups were running, it would back them both up.

AFAIK, versioning means that if one of the source files that has already been backed up is modified in some way, the backup makes a new copy instead of overwriting the first backup (even when it's the exact same filename in the same path). While not necessarily a must-have feature for typical stable video files, it's a great feature for general documents and files that get edited more or less regularly. So if Emby edits your .nfo file 12 times, you'd get 12 versions of the .nfo in your backup.
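A minimal sketch of the idea in Python (purely illustrative, not how ARQ5 or any particular product stores its versions; the timestamped-suffix naming is just an assumption to show the concept): instead of overwriting the existing backup copy, a versioning backup writes a new copy whenever the source has changed.

```python
# Illustrative versioned copy: keep every revision of a changed source file
# instead of overwriting the previous backup (not any specific product's format).
import hashlib
import shutil
import time
from pathlib import Path

def backup_versioned(src: Path, dest_dir: Path):
    """Copy src into dest_dir under a timestamped name, but only if its
    content differs from the most recent version already stored there."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    versions = sorted(dest_dir.glob(f"{src.name}.*"))  # names sort chronologically
    if versions:
        latest = versions[-1]
        if hashlib.sha256(latest.read_bytes()).hexdigest() == digest:
            return None  # unchanged since last backup, nothing to do
    stamp = time.strftime("%Y%m%d-%H%M%S")
    target = dest_dir / f"{src.name}.{stamp}"
    shutil.copy2(src, target)  # the new version sits alongside the older ones
    return target
```

So 12 edits to the same .nfo would leave 12 timestamped copies side by side, and deleting the source never touches what is already in the destination folder.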


CharleyVarrick

I am willing to bet you are using single 5400/7200 RPM disks for your backup server. Out of curiosity, do you have a machine with an SSD you can run it on?

My media disks are all WD Greens; granted, they're no SSDs, but then again I don't know anyone who stores multiple TB of media files on SSDs.

Even if a test backup off the OS drive (which is an SSD) resulted in a faster upload, I wonder how I could use that in a real-life situation.

 

EDIT: Just thought of a valuable test I could do: try ARQ5 backing up a test folder to another PC on my LAN. Same software (minus Amazon Cloud), same hardware (minus the internet). If it's fast this way, the problem would be with Amazon and/or my physical distance from them.

I will set this up later on and report back.
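As a side note, a quick way to get a raw LAN baseline that is independent of any backup client is simply to time a large plain copy to the other machine's share. A rough Python sketch, where the share path and file are placeholder assumptions:

```python
# Rough LAN throughput baseline: time a plain file copy to a network share.
# Both paths below are placeholders; point them at a real local file and a
# real share on the LAN test box.
import shutil
import time
from pathlib import Path

src = Path("C:/temp/testfile.bin")            # any large local test file
dst = Path("//LANBOX/backup/testfile.bin")    # hypothetical SMB share on the other PC

start = time.perf_counter()
shutil.copyfile(src, dst)
elapsed = time.perf_counter() - start
mb = src.stat().st_size / 1e6
print(f"{mb:.0f} MB in {elapsed:.1f} s = {mb / elapsed:.1f} MB/s")
```

If the raw copy is fast but the ARQ5-to-LAN backup is slow, the client is the bottleneck; if both are slow, look at the disks or the network first.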


  • 2 months later...
Guest asrequested

I've just started testing Syncback Pro, and I'm really liking it. I'm still learning, but so far, my experience has been very favorable.


Tur0k

I try to ensure that I am protecting my network from all angles. For me this means:

 

A. Backup processes

B. Recovery processes (must be in line with backup methods)

C. Fault tolerance (of systems and storage)

D. Security

     1. Network traffic A/V

     2. System-installed A/V

     3. Web filtering

     4. IDS

     5. Deny inbound/outbound traffic to/from malicious sources on the public Internet

 

I work in the IT field, so I look at backups with an eye for:
A. What happens if a system fails?
B. What happens if my network falls victim to a malware or viral outbreak?
C. What happens if my network falls victim to a direct attack?
D. What happens if my house burns down?
E. What happens if my world region is hit by an asteroid?

I try to account for A-D. If E happens I am likely not alive, and if I am, I need to focus on surviving famine, infectious outbreaks, lizard alien invaders, fighting Skynet, etc.

 

I designed my backups to: 

A. Supply me with an OS image I can restore to. 

B. Supply me with a backup of my personal files.

C. Provide fault tolerance for non-critical media files.

 

Currently, I don't design for 24x7 uptime. I expect that recoveries and restores will take me some time to complete.

 

I use Clonezilla, Acronis True Image, Win32 Disk Imager, dd, and Windows 7 Backup as the software platforms in my backup plan.

 

I split my backup methodology into 4 focus system types:

1. High delta computer systems (computer systems where my family creates our personal data)

2. Static computer systems (systems that offer a service only)

3. NAS Shares (places where we save personal data that we want to share with each other) 

4. Smart devices (these often fit somewhere between static and high delta systems; most of the data here isn't high value, though, as it is often copied from our other storage systems).

 

All my high delta computer systems (my desktop, my wife's desktop, HTPC-server, and laptop) have base images from when I first built them. These would restore a system to its last known good configuration. If I add software, I have to re-image the system.

Then for personal files, I perform weekly full backups and daily incremental backups to the NAS.  
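A weekly-full plus daily-incremental scheme boils down to "copy everything on the full run, and only what changed since the last run otherwise." A toy Python sketch of just the selection logic (the paths, the Sunday-full schedule, and the mtime-based change test are assumptions for illustration; the actual tools above handle this internally):

```python
# Toy selection logic for full vs. incremental runs: a full run takes every
# file, an incremental run takes only files modified since the last run.
import time
from pathlib import Path

def files_to_back_up(root, last_run_epoch):
    """Yield files under root: all of them if last_run_epoch is None (full run),
    otherwise only those modified since last_run_epoch (incremental run)."""
    for path in Path(root).rglob("*"):
        if path.is_file():
            if last_run_epoch is None or path.stat().st_mtime > last_run_epoch:
                yield path

# Example: weekly full on Sunday, incremental (last 24 h of changes) otherwise.
root = "C:/Users/family"                      # hypothetical personal-files root
last_run = None if time.localtime().tm_wday == 6 else time.time() - 86400
for f in files_to_back_up(root, last_run):
    pass  # copy f to the NAS share here
```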

 

The other systems that remain static (WiFi controller, HTPC-2, HTPC-3, firewall, HA server) have images made once they're built. They don't hold personal files and thus don't receive file backups.

 

Smartphones back up to the main HTPC-server and are then backed up to the NAS with the personal file backups.

 

I run a 4-bay NAS in a RAID config. This system provides backup, fault tolerance, and network-accessible shares to my systems. I split the space into:

1. User share volume (user folders, shared media)

2. Server backup shares volume

3. Secure backup storage volume

Currently, my HTPC-server is set up with a media volume consisting of a pair of 4 TB 7200 RPM spinner drives configured in RAID 1. Videos I wouldn't want to lose (home videos, etc.) are hosted on my NAS.

 

The current design allows me to:

A. Rebuild a high delta system from scratch in the event of hardware failure or irreversible malware infection and restore my personal files to it. The only at-risk objects would be files not backed up since the last incremental/full backup.

B. Restore a corrupt file if it was captured at the last scheduled backup point.

C. Rebuild a static computer system from scratch in the event of hardware failure or irreversible malware infection. The only loss would be log information.

D. Not lose my media in the event of a single drive failure. The remaining risk is losing both drives in the RAID 1 group.

 

I have a secure backup storage volume on my NAS that has no shares available to the network and is really only accessible to the local system account; I move backup content over to it at regular intervals. I want to automate this process, but I need to finish hacking the OS to the point where I can run scripts from the shell on a schedule, and I would probably also need to expand the available memory. The way I would automate the secure backup process (a rough sketch of the rotation logic follows the list) would be to:

1. Zip the user shares and save them to the secure backup volume nightly. I plan on keeping backups for the last 7 days, plus 3 weekly backups from Mondays (the least active day).

2. Copy each server's full backup and each incremental backup to the secure backup volume.

3. Script automatic deletion of the oldest backups of a given server's data in order to keep only a month of backups, as this design is pretty storage-intensive.
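Here is a minimal sketch of how that nightly zip-and-prune could look (purely illustrative; the paths, archive naming, and the 7-daily/3-weekly retention counts are assumptions taken from the plan above, and it would still need to be wired into whatever scheduler the NAS ends up supporting):

```python
# Nightly job: zip the user shares into the secure volume, then prune old
# archives down to the last 7 dailies plus the last 3 Monday "weeklies".
import shutil
import time
from pathlib import Path

USER_SHARES = Path("/volume1/users")           # hypothetical user share root
SECURE_DIR = Path("/volume3/secure_backups")   # hypothetical secure backup volume

def nightly_backup():
    SECURE_DIR.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d")
    shutil.make_archive(str(SECURE_DIR / f"users-{stamp}"), "zip", USER_SHARES)
    prune()

def prune(keep_daily=7, keep_weekly=3):
    archives = sorted(SECURE_DIR.glob("users-*.zip"), reverse=True)  # newest first
    keep = set(archives[:keep_daily])                                # last 7 nights
    mondays = [a for a in archives
               if time.strptime(a.stem.split("-")[1], "%Y%m%d").tm_wday == 0]
    keep.update(mondays[:keep_weekly])                               # last 3 Mondays
    for old in archives:
        if old not in keep:
            old.unlink()

if __name__ == "__main__":
    nightly_backup()
```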

 

This design change will help me automate the way I address the risks a cryptolocker attack would pose to my backup volumes.

 

The last component I need to worry about as it relates to backup is an automated offsite backup routine.  Currently, a buddy of mine and I are working to host FTPS sites and then upload encrypted compressed backup data to each other's storage solutions.
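For what it's worth, pushing an already-encrypted archive over FTPS can be scripted with nothing but Python's standard library; a minimal sketch, with the host, credentials, and paths as placeholder assumptions:

```python
# Upload an encrypted backup archive to a buddy's FTPS (FTP over TLS) server.
from ftplib import FTP_TLS
from pathlib import Path

archive = Path("/volume3/secure_backups/backup-20240101.7z")  # placeholder path

ftps = FTP_TLS("buddy.example.net")   # hypothetical host
ftps.login("backupuser", "secret")    # placeholder credentials
ftps.prot_p()                         # switch the data channel to TLS as well
ftps.cwd("/incoming")
with archive.open("rb") as f:
    ftps.storbinary(f"STOR {archive.name}", f)
ftps.quit()
```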

 



Guest asrequested

Lol... And damn! Back to lol

 

I mean.....good lord! All my stuff is just fun, for me. If I have to start over and rebuild, that's just even more fun. I'm just taking care of the basics. But who knows, the way I'm going, it may take on a life of its own, lol


BAlGaInTl

Wow.

 

I do something similar, but haven't put quite that much thought into it. Great info.

 

A question about this:

 

The last component I need to worry about as it relates to backup is an automated offsite backup routine. Currently, a buddy of mine and I are working to host FTPS sites and then upload encrypted compressed backup data to each other's storage solutions.

 


Are you using any software to automate this? I just started playing with Duplicati, and really like it.

 

Right now, I'm only using it for critical user files, and syncing the encrypted files to various cloud storage services.

 

I'd love to hear what you're using.

 



mastrmind11


Yeah, I second this. I do manual imaging of my server OS drives using tar, but I would love to find something that can do it for me automatically.


Tur0k

I am classically trained, so I'm a command line cowboy. I am using p7zip to compress and encrypt the files; in Linux I generally build scripts. Currently, I am working on pushing the compressed and encrypted files using FileZilla.
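As a rough illustration of that kind of script (not the poster's actual setup; the archive name, source path, and password handling are placeholder assumptions), p7zip's 7z can compress and encrypt in one pass, and the call can be driven from Python just as easily as from a shell script:

```python
# Compress and encrypt a directory with p7zip in one pass, driven from Python.
# The archive name, source path, and the way the password is supplied are placeholders.
import subprocess

source = "/home/user/backups/week-01"       # hypothetical directory to archive
archive = "/tmp/week-01.7z"                 # hypothetical output archive
password = "use-a-real-secret-here"         # in practice, read this from a secure store

subprocess.run(
    ["7z", "a", f"-p{password}", "-mhe=on", archive, source],
    check=True,  # -mhe=on also encrypts the file names inside the .7z
)
```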

 

Duplicati looks amazing. How easy is it to schedule tasks? It says it supports FTP but does it support FTP over SSL?

 

 



BAlGaInTl

I am classically trained, so I'm a command line cowboy. I am using p7zip to compress and encrypt the files; in Linux I generally build scripts. Currently, I am working on pushing the compressed and encrypted files using FileZilla.

 

Duplicati looks amazing. How easy is it to schedule tasks? It says it supports FTP but does it support FTP over SSL?

 

 


 

I believe it does support it.

 

You might have to play around with getting it to accept the certificate.  I did for Google Drive.

 

It's extremely easy to build and schedule tasks. There is essentially a web-based wizard that walks you through the process. It's FOSS, so I highly recommend installing it and trying it out.

