I need to back up my LINUX data, what is the best solution?
This is a "re-hosting" of the thread in the T Series forum just for us Linux folks.
I've been tweaking my Ubuntu install for a week or so now, and have got most stuff working to my satisfaction. I'm getting ready to try some more potentially damaging experiments (i.e. trying to get some Windows stuff running under WINE), but before I do that I'd like to back up my current install.
Under Windows I use a combination of IBM R&R (to do "pseudo-image" backups) and MS Backup (to do file and settings backups). Both of them save the backups to a Buffalo Tech LinkStation NAS - a small box with a 160 GB HDD running embedded Linux that sits on my home network.
The first task was to get the Linux install talking to the LinkStation, and that turned out to be considerably harder than I expected. While I could see and browse the LinkStation in Nautilus (as smb://name) it stubbornly refused to mount as a Samba share from the command line using mount -t smbfs.
I finally found (in a dim corner of a site devoted to hacking LinkStations) a brief note that Ubuntu users need to apt-get install smbfs first, because Ubuntu for some unholy reason does not include Samba file system support in the default install. The fact that Nautilus is able to browse Samba shares is simply a red herring thrown in to confuse newbies such as myself, because Nautilus doesn't actually mount the devices?!
So after installing smbfs I can now mount the Linkstation. BTW - if anyone has one of these and is feeling the need to turn it into a full fledged Linux server there is quite an active hacking community devoted to installing everything from client-server backup software to video streaming tools on them. Google for Linkstation Hacking.
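For anyone else fighting the same battle, here's roughly the recipe - a sketch only, with a hypothetical share name and mount point (substitute your own LinkStation's), and it needs root:

```shell
# Ubuntu doesn't ship smbfs by default, so install it first:
sudo apt-get install smbfs

# Hypothetical names - use your LinkStation's actual share
# and a mount point of your choosing:
sudo mkdir -p /mnt/linkstation
sudo mount -t smbfs //linkstation/share /mnt/linkstation -o username=guest

# Or, for a persistent (but not auto-mounted) entry in /etc/fstab:
# //linkstation/share  /mnt/linkstation  smbfs  username=guest,noauto  0  0
```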
I've been searching and reading and playing around with Linux backup tools, and I haven't really found anything I'm entirely happy with yet.
For file and settings backup I found the simplebashbu.sh script in the Universe repository, which is intended to run nightly as a cron job. I don't leave my laptop on overnight, but you can invoke it from the command line. It basically tars a bunch of directories to a device - you can configure the directories and the backup location in the script, and it accepts the mount point for the LinkStation with no problem. I tried it out last night and it is S.L.O.W. I killed it after about 5 minutes, by which point it had done about 200 MB worth of data. I've got around 20 GB to back up. At that rate it will take about 500 minutes, or roughly 8 hours, for a full backup.
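Incidentally, the core of such a script is small enough to show. Here's a stripped-down sketch of the tar-to-a-mount-point idea (directory names are invented for the demo; in practice you'd point BACKUP_DIRS at your real data and DEST at the LinkStation mount):

```shell
#!/bin/sh
# Minimal sketch of a tar-to-mount-point backup, in the spirit of the
# script described above. Demo paths are made up for illustration.
BACKUP_DIRS="/tmp/tarbu_demo/docs /tmp/tarbu_demo/etc"
DEST=/tmp/tarbu_demo/backup

# Demo data so the sketch is self-contained:
mkdir -p /tmp/tarbu_demo/docs /tmp/tarbu_demo/etc "$DEST"
echo "hello" > /tmp/tarbu_demo/docs/note.txt

# Compressing with gzip (-z) trades CPU time for less data over the
# wire, which can matter a lot on a 100 Mbit link to a NAS.
# (tar's "Removing leading /" warning goes to stderr.)
tar -czf "$DEST/full-$(date +%Y%m%d).tar.gz" $BACKUP_DIRS 2>/dev/null
ls "$DEST"
```

The slowness you saw is usually gzip plus the network, so it's worth timing a run without -z to see which one is the bottleneck.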
For image backup I found a description in "Linux Desktop Hacks" of how to boot into a command line interface, load some tools, set the system partition as Read Only, and then use Dump to image it. I haven't tried it yet but it seems like a lot of work and cumbersome to use on a regular basis - plus it has no provision for incrementals. I could boot from the Live CD and do the Dump from there, but my Linux drive is currently residing in my Ultrabay so I don't have a CD-ROM drive available without doing drive swaps.
Some of the other tools out there look neat but are really much more than I need or can implement. Amanda looks cool and it's in the Ubuntu repository, but it's client-server and I'd rather not have to hack my Linkstation and try to figure out how to get the Amanda server software running on it.
So I thought I'd throw it out to those with more experience than I. What backup system are you using, and how satisfied are you with it?
Ed Gibbs
I would suggest the following: go to the Gentoo forums, and search the forums for "backup script". There are some excellent tar / dar scripts created by users (nothing to do with Gentoo specifically - they're simple bash scripts) that you can use. I use one I found there, which creates a full backup once a week and incremental backups whenever I want, but does it in a way so that the last backup is always the full one. Very nice script. If you don't want to use scripts, dar and Bacula are both good choices.
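For anyone curious what the full-plus-incremental pattern looks like with plain GNU tar, here's a minimal sketch using a snapshot file (demo paths are invented; the scripts mentioned above dress this same idea up with rotation and housekeeping):

```shell
#!/bin/sh
# Full + incremental backups via GNU tar's --listed-incremental (-g)
# snapshot file. All paths here are demo values.
SRC=/tmp/incr_demo/data
DEST=/tmp/incr_demo/backup
SNAP=$DEST/state.snar

mkdir -p "$SRC" "$DEST"
echo one > "$SRC/a.txt"

# Level 0 (full): tar records the file state in $SNAP.
tar -czg "$SNAP" -f "$DEST/level0.tar.gz" -C /tmp/incr_demo data

# A change after the full backup...
echo two > "$SRC/b.txt"

# Level 1 (incremental): only files changed since the snapshot go in.
tar -czg "$SNAP" -f "$DEST/level1.tar.gz" -C /tmp/incr_demo data
```

Restoring means extracting the full archive first, then each incremental in order.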
Stavros
revolutionary_one
I'm not quite sure what you're looking for, but there are a couple of tools I use that are quite versatile.
-Mondo Archive - It's essentially a tool that makes compressed *.iso images of your partitions and allows you to save them to a local NFS or SMB server. Google: "Mondo Archive"
-BUKF - This is a little perl proggy that someone I know in NC made that *uses* Mondo Archive to automate and customize your backup process, and it works on remote and local servers. I'm not quite sure where the file is hosted, but I have an original copy. I'll try to get it hosted on the apache server at home real quick on request.
Alternatively, this might be useful in creating backups of builds for distro ISOs that might be used on common ThinkPad models [e.g. 2378fvu], assuming we can get momentum on a project going soon.
Thanks.
I scanned the documentation for Mondo and it looks interesting. I'm a bit concerned about the CDs it wants you to make, as at the moment Linux is installed on a HDD in my Ultrabay. I can't have a CD and Linux mounted simultaneously. But it looks like they aren't really needed to back up to a network device, and if I had to restore I could always do a base Linux install, install Mondo on that, and then restore over the network.
I'm also a bit put off by some of the concerns about imaging an active system volume. It sounds like volume shadowing is not as advanced (or is harder to do) in Linux than in Windows, so some start-to-end differences on the volume being imaged are inevitable but probably will not be fatal - am I understanding that correctly?
If that is the case, I would be more comfortable imaging with the system unmounted. It occurs to me that I've got plenty of space on my drive to make another system partition, install Knoppix or a minimal Ubuntu install, load it with the backup/restore tools I settle on, and then modify GRUB to give me a choice of booting into the main system or the backup/restore system. Then I could safely image my main system to a network drive with it unloaded.
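If it helps picture it, the second "rescue" install would just get its own stanza in /boot/grub/menu.lst, something along these lines (partition, device name, and kernel version are examples only, not anything from my actual setup):

```
title  Backup/Restore system (minimal Ubuntu)
root   (hd0,2)
kernel /boot/vmlinuz-2.6.12-10-386 root=/dev/hda3 ro
initrd /boot/initrd.img-2.6.12-10-386
```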
What are the pros and cons of this approach? Obviously if I have a total disk failure I would need to recreate the restore system on a new drive before I could restore the main system, but other than that are there any downsides?
Ed Gibbs
revolutionary_one
What's the purpose of having a backup system that doesn't do backups of active, mounted systems? LOL.
You could do well with Backup-kungfu, especially if you have dedicated NFS space for daily / weekly / monthly backups, which I find more convenient than CDs or DVDs any day. And yes, modifying GRUB to specify the particular kernel image and root partition you want to boot would work well, although I usually just use the backups for disaster recovery, not accessible data backups.
PM me if you're interested in BUKF.
Also, anyone with a 2378-fvu wanna create me an image of / with Mondo or BUKF? I'd be much obliged - distro doesn't matter. Primarily for testing...
backups
I'm using unison to back up my files to remote computers. It minimizes bandwidth by rsync'ing only changed files, so it is pretty quick even over slow links. One can configure it to keep a few generations around. And it keeps the documents on my laptop and my desktop at work in sync, too.
But it is not a real (tape/cd/dvd) backup solution. Neither is your NAS box.
Volker
Ok - here's what I've settled on for now.
I installed DAR and whipped up a couple of scripts (actually stole and hacked them) - one for full backups and one for incrementals. Works pretty well - did a full this morning, 15.7 GB in just over 2 hours.
The only thing I don't like is that I can't make it one big file - it blows up at 2 GB. I'm guessing that's a Samba limitation, since I've got WinDohs backups on the NAS box that are over 20 GB in one file, and DAR allows you to specify slices up to zettabytes (are they kidding with that?). So I set the -s slice parameter to 1 GB and it worked fine.
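For the record, dar's -s option is essentially the classic split-the-archive trick, and the same 2 GB dodge works with plain tar piped through split. A sketch with tiny sizes so it runs quickly (you'd use -b 1G, or dar's -s 1G, for real backups):

```shell
#!/bin/sh
# Split a tar stream into fixed-size slices to stay under a
# filesystem/SMB file-size limit. Demo paths and sizes are invented.
SRC=/tmp/slice_demo/data
DEST=/tmp/slice_demo/backup
mkdir -p "$SRC" "$DEST"
dd if=/dev/zero of="$SRC/big.bin" bs=1024 count=64 2>/dev/null

# tar writes to stdout (-f -); split caps each output file at 32 KB
# here, producing backup.tar.aa, backup.tar.ab, ...
tar -cf - -C /tmp/slice_demo data | split -b 32k - "$DEST/backup.tar."
ls "$DEST"

# Restore side: cat the slices back together in order:
# cat "$DEST"/backup.tar.* | tar -xf - -C /some/restore/dir
```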
Still interested in what others are using, but I feel better now that I've saved my game.
Ed Gibbs
Acronis makes a linux version of Acronis True Image Server, but it's $700 (!). However, I wonder if you could buy the Windows version ($50), create a boot CD on a Windows box, and use that to back up and restore your Linux box. (Once the machine has booted, you don't need the CD in the drive, as I recall)
I'm not a Linux user, but thought I'd throw it out as a suggestion.
560, 560x, T23, T61
Thanks.
Nolonemo wrote: Acronis makes a linux version of Acronis True Image Server, but it's $700 (!)
Actually I have a copy of Drive Image lying around, and I had considered loading it on my WinDohs drive and imaging the Linux drive from there. But I'm trying to learn to use the Linux toolset.
BTW - for anyone using GNOME and the Nautilus file manager under Ubuntu: if you use the Places menu to connect to a network server, it puts a link to the server on your desktop. That link is actually ON your desktop - not in /mnt like you'd think. So even if you exclude /mnt from your backup, it will still see the link on your desktop and try to back up the whole server (including the backup itself) as part of your filesystem...
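The fix generalizes to any tar-based script: exclude the offending link. A self-contained sketch (all paths invented for the demo; you'd exclude your actual desktop link and /mnt):

```shell
#!/bin/sh
# Demonstrates tar's --exclude as a guard against the desktop-link
# trap described above. Everything here is demo data.
HOME_DIR=/tmp/excl_demo/home
mkdir -p "$HOME_DIR/Desktop" "$HOME_DIR/docs"
echo doc > "$HOME_DIR/docs/file.txt"
echo link > "$HOME_DIR/Desktop/share on linkstation"

# The pattern matches the desktop link, so tar never descends into
# (or, here, archives) it; real files still go in.
tar -czf /tmp/excl_demo/backup.tar.gz \
    --exclude='*linkstation*' \
    -C /tmp/excl_demo home
```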
Every day I learn something new.
Ed Gibbs
I'm not a linux user, but I distinctly remember that one of the selling features of True Image back when I got it was that it would image ANY OS.
d lehmann wrote: I'm not a linux user, but I distinctly remember that one of the selling features of True Image back when I got it was that it would image ANY OS.
I think pretty much any imaging software out there today can handle Linux, at least the most popular file systems. EXT2 and EXT3 are ubiquitous and I can't imagine someone not supporting them. I know Drive Image does, as do BootItNG, Acronis, etc.
Going the other way - imaging a Windows disk from Linux is really easy (in fact, I did it a time or two by accident).
Given the cost of commercial imaging tools for Windows, this is a good route to go if you're looking to save $50-$60. And the big bonus is that the file is in a standard format that any Linux system can read and put back for you, rather than the proprietary formats most of the commercial imaging tools use.
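The "accident" was just dd, which is also the deliberate way to do it - dd reads raw blocks and doesn't care what filesystem is on them. A sketch with a small file standing in for a device like /dev/hda1 (never dd a mounted partition for real):

```shell
#!/bin/sh
# Filesystem-agnostic imaging with dd + gzip. A small random file
# stands in for the partition device; all paths are demo values.
FAKE_DEV=/tmp/dd_demo/fake_partition
mkdir -p /tmp/dd_demo
dd if=/dev/urandom of="$FAKE_DEV" bs=1024 count=16 2>/dev/null

# dd copies raw blocks; on a real partition, gzip keeps the image
# from wasting space on runs of empty blocks.
dd if="$FAKE_DEV" bs=4k 2>/dev/null | gzip > /tmp/dd_demo/partition.img.gz

# Restore is the mirror image (to a device, you'd redirect into it):
gzip -dc /tmp/dd_demo/partition.img.gz > /tmp/dd_demo/restored
cmp "$FAKE_DEV" /tmp/dd_demo/restored && echo "identical"
```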
Ed Gibbs
Go to http://www.thefreecountry.com/utilities ... mage.shtml and check out SystemRescueCD. It has everything you need to save a partition and repartition/adjust partition sizes on your HD, as well as many recovery utils - all on a bootable CD (which boots into Linux). I'm not a Linux user, but I have used this. It took a little research, but I figured out how to mount a USB hard drive (my first time using Linux, but at one time I could work a bit in OS-9, which is similar, so at least I understand how Linux works!).

Get one of the 2.5" USB drive cases for your second HD and use that. They run $20-$30 and are powered off the USB port. Then you can boot with this CD, mount the drive, and image it to your other HD. The only limitation is that you need a second HD with enough room for the drive image. If you're imaging a single drive you can save the image to another partition.

I shouldn't say it images a drive - it images partitions (okay, virtual drives). The image is compressed and blank areas aren't written to it, so you need roughly half as much free space as the original partition's used space in order to write the image.
Frank Swygert (USAF - retired)