Backup Secrets

Best practices for Windows administrators

How to back up large data sets

May 18, 2009, 11:08 pm

Summary: The problems of backing up large data sets can be overcome by using a backup system based on “intelligent differentials”, where each backup produces a new full backup at the speed of a differential backup.

No matter what an organization’s business processes are, most SMBs are plagued by the same problems when it comes to backing up their data: the backup device runs out of capacity as the data grows, and full backups take longer than the available backup window allows.

Thankfully, there are strategies to overcome these issues, which we outline in the following article. In fact, if you follow these suggestions, you will be able to back up terabytes of data with daily backup times of less than an hour!

Fixing the backup device storage-capacity problem

The storage capacity of the backup device can prove to be an issue, because many backup devices cannot grow to keep up with the expansion of a company’s data storage requirements. Tapes, for instance, can severely restrict the amount of data that can be backed up, and even though tape drives may have data compression enabled, data files that are already at their maximum compression cannot be compressed any further.

The best solution to this dilemma is simply to choose a disk-based device as your backup destination. Disk-based backup devices such as eSATA and USB hard drives and NAS units store anywhere from 250GB to 1.5TB and can be purchased at affordable prices. Using a disk-based backup device therefore means organizations can back up large amounts of data and eliminates the hassle of splitting backup sets into smaller portions.
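
A quick way to verify this in practice is to check, before each run, that the destination disk actually has enough free space for the data being backed up. The following is a minimal Python sketch of such a check; the paths and the 10% safety margin are purely illustrative assumptions, not part of any particular backup product:

    import shutil
    from pathlib import Path

    def total_size(source: Path) -> int:
        """Sum the sizes of every file under the backup source."""
        return sum(p.stat().st_size for p in source.rglob("*") if p.is_file())

    def destination_has_room(source: Path, destination: Path, margin: float = 0.10) -> bool:
        """Return True if the destination's free space covers the source data plus a safety margin."""
        needed = total_size(source) * (1 + margin)
        free = shutil.disk_usage(destination).free
        return free >= needed

    # Hypothetical example:
    # print(destination_has_room(Path(r"D:\CompanyData"), Path(r"E:\Backups")))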

Improving backup speeds

The more data an organization has, the longer a simple full backup will take. In tests performed in our labs, the fastest USB hard-drive device we could back up to sustained about 30 MB/sec, or roughly 108 GB/hr. Extrapolating from these results, a full backup of 1TB of data would take up to 10 hours!
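
The arithmetic behind that estimate is easy to reproduce for your own data sizes and devices. A small sketch, using the throughput figure measured above:

    def backup_hours(data_gb: float, throughput_mb_per_s: float = 30.0) -> float:
        """Estimate how long a full backup will take, in hours."""
        gb_per_hour = throughput_mb_per_s * 3600 / 1000   # 30 MB/s is roughly 108 GB/hr
        return data_gb / gb_per_hour

    print(round(backup_hours(1000), 1))   # 1 TB at 30 MB/s -> about 9.3 hours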

One way to get around this issue is to perform incremental or differential backups every day instead of full backups. There are now “intelligent differential” technologies that can store large amounts of backup history on a single device. These “intelligent differential” backups work by taking the existing full backup on the backup device and merging into it any changes from the backup source, thus creating a new full backup that reflects the current state of the volume being backed up. Any data replaced during the merge is stored as past versions on the backup device and deleted as necessary.
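
To make the idea concrete, here is a minimal sketch of that merge step, assuming a simple block-level model. The function name, data structures and block granularity are illustrative only and are not BackupAssist’s actual implementation:

    def merge_intelligent_differential(full_backup: dict, changed_blocks: dict, history: list) -> None:
        """Merge the blocks that changed on the source into the full backup on the device.

        full_backup    -- block number -> block data: the current full backup on the device
        changed_blocks -- block number -> block data: only the blocks changed since the last run
        history        -- list of past versions; each entry holds the blocks that were overwritten
        """
        replaced = {}
        for block_no, new_data in changed_blocks.items():
            if block_no in full_backup:
                replaced[block_no] = full_backup[block_no]   # keep the old block as a past version
            full_backup[block_no] = new_data                 # the backup now reflects the current state
        history.append(replaced)
        # Older entries in `history` can be pruned ("deleted as necessary") to reclaim space.

Restoring the latest state then only needs the full backup itself; restoring an older point in time means applying the saved past versions in reverse, which is how one device can hold a long backup history.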

With this technology, every backup is a full backup, even though it is performed at the speed of a differential backup. In addition, a full backup can be restored in one restore operation, instead of having to restore a full backup and then the subsequent differential and incremental backups, as with traditional differential backup technology.

How can I use this new “intelligent differential” method?

Currently, there are three backup technologies that use the “intelligent differential” method: Windows drive imaging, rsync backups and file replication.

BackupAssist version 5 is a one-stop backup solution that integrates Windows imaging, rsync backups and file replication at an affordable price.

User Comments:
  • June 30, 2009, 2:12 am
    truly amazing article...
    posted by: the maestro
  • July 26, 2013, 1:31 pm
    Getting along with AppDevs is generally quite easy on the social level. We're all nerdy, mostly all like technology and building software. However, the applications and the process don't necessarily get along with my job duties. For example, the AppDev team has app design discussions that don't include the DBA and you get an operational impedance mismatch. Or an AppDev team wants to use an ORM (no biggie in itself) but doesn't want to learn how to use the ORM, or thinks that the simple examples they see online will translate well to our existing system that isn't meant for an ORM (worst case scenario). Another relationship killer is the seemingly omnipresent scope creep that comes with poor planning. They say they want it one way, you and the AppDevs build it that way, then the request comes down to make it another way. At that point you and the AppDevs start the Technology Rock-Paper-Scissors match for who is going to compromise on this new request, which leaves both the AppDevs and the DBA frustrated. There are also the days when someone is just phoning it in, but I classify that as a people/workplace thing and not directly related to the general AppDev vs DBA relationship.
    posted by: Andrea
  • July 28, 2013, 1:14 pm
    I said it’s “mostly good” now. I am well respected for my skill set. However, at the place I used to work, it was hostile. That’s where I started to become a DBA. I was a .NET software engineer at the time and, due to the hostility between the software engineers and the DBA group, my team lead decided that we needed a covert DBA so that we could expose our data to different groups under the radar so those groups could do business. That’s where I came in. I showed a lot of promise in the ways of SQL Server and ETLs and I had a generally dismissive attitude toward authority. Luckily, the DB change scripts were run by a separate team and the DBAs didn’t believe in monitoring. I loved it so much I went from covert to overt. As I remember the complaints from my old team lead, I try to keep the servers clean, fast, and secure without being arbitrary or obtuse. That seems to serve me well in my current position.
    posted by: Marmar
  • September 25, 2015, 6:32 am
    I've recently started at a new company as their DBA. Prior to that, I was a .NET developer and BSA for 13 years. I am currently reporting to the App Dev manager and have a great relationship with the dev teams. However, I am pushing to move my position to report to Operations, as that's a better fit. I find that while I have great information and work with the dev team when they are coding and designing, I have little feedback and input with the operations team on architecture and planning. It's a toss-up where a DBA should fit, but I feel that for the larger picture, a DBA needs to have a very tight relationship with operations and planning of the overall infrastructure first, and with application development teams second.
    posted by: Hazel
  • October 8, 2015, 2:18 pm
    A disclaimer: I used to work in a software shop across the street from them. In college, I actually used the stuff. Soap up your back, back up your soap. I like step #3, especially before step #4, where one would check that there are actually some backups. How do I restore my backup? (FAQ1273)
    1. Launch OpenBaseManager in the OpenBase folder of your Applications folder. Click Local.
    2. Stop the xsilva_db.db database by clicking the stop icon, the first of several to the right of the database. Quit OpenBaseManager.
    3. Delete the actual database you're replacing from the location Macintosh HD>Users>Shared>OpenBase>Databases.
    4. Replace it with the latest backup found in the location Macintosh HD>Library>OpenBase>Backup. You must first unzip the archived backup, and then move the resulting xsilva_db file to the location Macintosh HD>Users>Shared>OpenBase>Databases.
    5. Wait one minute, and then double-click the new xsilva_db file, which will launch OpenBaseManager. Click Local and start the database by clicking the first icon in the row.
    6. Once the database has been started, you should be able to log into LightSpeed.
    posted by: Dimitris
  • October 8, 2015, 8:14 pm
    Some years ago, I was reading a data warehousing book by Gary Dodge and Tim Gorman and came across a simple sentence that changed how I looked at backups. I don't recall the exact wording, but the basic idea is: 'The responsibility of the DBA is not to back up the database, but to restore it.' It was an epiphany. The shame is that there are so many opportunities for an organization and DBA to practice recovery. Refreshes and new server installs are two that are rather common. And you can do these recoveries without the pressure of a down production system and with a decent amount of sleep. And for my personal machine... I synchronize my laptop with an external drive. And that external drive is backed up to another external drive. Since I seem to reinstall Windows every 6 months and I often pull old files from my external drives, I know they are good... at least for now. Tomorrow is always another story.
    posted by: Delfi
  • October 9, 2015, 2:33 am
    I'm wondering what the deal is with zip files. Specifically, the 10gR2 HP-UX Itanium zip download. I unzipped (WinZip) it on a PC, then uploaded it to HP-UX with a normally rock-solid GUI FTP program in binary mode. The FTP blew up consistently in the same place. On closer investigation I noticed why: there was a zip inside the zip (Aurora Java something), and rather than uploading it as a file, it made the zip file a directory and barfed. At the beginning of the files it also had a zero-length css file that seemed to hang it for a long time. Wassup? FTPing the original zip and using the Oracle-provided unzip worked just fine. Which just goes to show, you can have all the tested recovery procedures with rock-solid software, and a slight change of procedure can still mess you up.
    posted by: Anna
  • October 9, 2015, 10:48 am
    Zenith Infotech has a pretty nifty solution for onsite and remote backups. It is also capable of virtualizing a failed server that is being backed up. We just started rolling them out to some clients and things have been really smooth so far.
    posted by: Meme