Please forgive me, but I'm a little new to Red Hat (RHEL 5). I'm using rsync to back up critical data to a second disk; here is what I'm typing at the command line: rsync -rvgal /data/disk1/share /data/backup/share. It appears that the symlinks are not transferred to the backup drive, and some of the links point to data not located in the source folder (/data/disk1/share). After reading the rsync man page I was a little confused about the -L option (vs. the -l option). To ensure that the linked files are copied, should I type the following:
rsync -rvgaL /data/disk1/share /data/backup/share
A million thanks,
Johnny Mac
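For anyone puzzling over the same man page: -l (which -a already implies) recreates symlinks as symlinks, so links whose targets live outside the copied tree end up dangling on the backup, while -L dereferences every link and copies the file it points to. A sketch of both, using the same paths as above:
Code:
# -L dereferences every symlink, so the linked files themselves land on
# the backup, even when their targets live outside /data/disk1/share
rsync -rvgaL /data/disk1/share /data/backup/share

# --copy-unsafe-links is a middle ground: only links pointing outside
# the copied tree are dereferenced; internal links stay as links
rsync -rvga --copy-unsafe-links /data/disk1/share /data/backup/share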
What I did in Windows was create images of my drive and restore them.
In Linux I am running
Code:
rsync -aAXv --exclude={"/home/*","/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} /* /path/to/backup/folder
and this creates a folder for me with all my files, and apparently saves metadata like permissions and paths...
Since I'm using Arch and things break sometimes, I'm booted into a CLI with errors and cannot figure my way out, since I'm a noob... would I be able to just delete my entire root and replace it with the rsync backup without a problem?
gold finger was kind enough to share this with me a while ago:
Quote:
Do backups to either another HDD, partition, or a USB stick (if big enough to hold your data). You can use the program to make an initial backup of /home/gregory; then use it to periodically update that backup by having it sync between your installed Xubuntu /home/gregory and the backup copy. The sync function will just copy over things that are new or changed, rather than copying everything all over again.
Assuming your Xubuntu filesystem is Ext4, an example of doing the initial backup would be something like this:
* Spare USB with large partition formatted as Ext4 and labeled "BACKUPS"
* Open luckybackup and choose "Backup" function
* "Source" = /home/gregory
* "Destination" = /media/gregory/BACKUPS (might be under /media/BACKUPS)
* Check box to not create new directories (it will just do exact copy of source)
After initial backup, either make a new task for syncing, or modify the backup task to turn it into a syncing task instead. Then use that periodically to update the backed-up /home/gregory.
I've downloaded luckybackup and have been experimenting with it, but I'm still not sure of the best way to use it as a backup. As in gold finger's advice, why would I check the box to not create new directories? It seems to me that doing it without checking the box re-creates things just the way they are on my computer, while checking the box just takes everything out of the folders. That seems confusing (and unnecessary). I also have a really hard time finding the errors after a run, and when I do find them, I don't know what they mean. So if I back up the source to the destination, it makes an exact copy on my destination drive (with folders if I don't check the box, without if I do). If I then do that as an ongoing thing, I will be backing up all my data with each run (which I'm assuming would be much more time consuming), whereas if I choose 'synchronize source and destination' it will only back up the changes between my source and my USB drive (which would be my destination drive)?
Is that the idea?
And I noticed that luckybackup did not want to transfer things with colons in their names. Googling around, somebody said that problem would be taken care of by formatting the destination drive as ext3 or ext4 (as gold finger suggested). Is this a good idea? (I've always felt comfortable with FAT, because if I needed to plug my flash drive into a Windows machine it would work, as well as with Linux.)
So the first time I use luckybackup I choose "Backup source inside destination", and of course the source and destination. Should I check the "Do NOT create extra directory" box? (Again, that seems off, as 95% of what I'll be backing up is in folders.)
Then after I've done that, I choose the sync option?
A lot of stuff. I know. Thanks.
PS. As a slight complication I have the data (basically the "home" folder) of my two computers (work and home) synced via Copy.com.
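One thing that may demystify the checkbox: luckybackup is a front-end to rsync, and "Do NOT create extra directory" corresponds to rsync's trailing-slash rule. Unchecked, the source directory itself is recreated inside the destination; checked, only its contents are copied. A sketch of roughly what the tasks run underneath, using gold finger's paths:
Code:
# box unchecked: creates /media/gregory/BACKUPS/gregory/ and fills it
rsync -av /home/gregory /media/gregory/BACKUPS/

# box checked (note the trailing slash on the source): contents go
# straight into the destination, with no extra gregory/ directory
rsync -av /home/gregory/ /media/gregory/BACKUPS/

# a sync task additionally deletes destination files you removed from
# the source, and only transfers what changed since last run
rsync -av --delete /home/gregory/ /media/gregory/BACKUPS/gregory/
As for the colons: FAT filesystems forbid colons in filenames while ext3/ext4 allow them, so reformatting the destination drive as ext4 does fix that, at the cost of the stick no longer being readable on a Windows machine without extra drivers.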
I'm not sure if this should be in the newbie section, but I am somewhat of a newbie, so here goes:
In a home network, I have an Xubuntu file server with a Samba share that has me as the owner and authorizes me to access the share.
On another computer, I have Mint running and providing various services, including WebDAV on Apache with SSL. In the /var/www/webdav directory of the Mint computer, I have the Xubuntu Samba share mounted. This is supposed to allow me to access the Samba share from the public internet.
Everything works fine except for one big problem: Apache requires the owner of the webdav directory to be user "www-data," and I can't figure out how to give www-data access to the Samba share, since www-data is not a user on the Xubuntu computer, and moreover I don't know the password for user www-data.
Can anyone figure out how to get around this problem? In particular, is there a way to configure the Samba share on the Xubuntu computer so that user www-data on the Mint computer can have access to it?
(Incidentally, I have my reasons for using two computers, one as a file server and one as a web server. Also, I am thinking about switching to NFS instead of Samba, but I'm not sure if even that would solve my problem.)
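A note on one possible way around this: www-data does not have to exist on the Xubuntu box at all. mount.cifs can authenticate as the Samba user you already have while presenting the files locally as owned by any uid/gid you choose. A sketch, assuming the share is //xubuntu-server/share (a hypothetical name):
Code:
# on the Mint box: authenticate as the existing Samba user, but make
# every file appear locally owned by www-data so Apache is satisfied
sudo mount -t cifs //xubuntu-server/share /var/www/webdav \
    -o credentials=/root/.smbcred,uid=www-data,gid=www-data

# /root/.smbcred (chmod 600) holds the Samba account you already use:
#   username=yourname
#   password=yourpassword
The uid=/gid= options only change how ownership is reported on the client side, which with a default Samba setup is enough to pass Apache's ownership check.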
I am using rsync to back up files to another machine. The users on my fileserver do not exist on the backup server, so rsync throws errors about the permissions. It copies the files fine, but I want to get rid of the errors and have rsync ignore the permissions when backing up.
/backup is a mounted FTP directory.
Below is the current command and output:
Code:
root@Fileserver:~# rsync -av --delete /shared/fileshare/ /backup/backup
building file list ... done
created directory /backup/backup
./
manager/
manager/chironfs.txt
manager/cronman.txt
manager/curlftpfs.txt
manager/curlman.txt
manager/getnetaddress.txt
manager/grepman.txt
manager/rsyncman.txt
manager/tarman.txt
public/
user1/
user10/
user2/
user3/
user4/
user5/
user6/
user7/
user8/
user9/
rsync: chown "/backup/backup/manager/.chironfs.txt.c6MbJ7" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.cronman.txt.hdBG4P" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.curlftpfs.txt.t1sG4L" failed: Operation no t permitted (1)
rsync: chown "/backup/backup/manager/.curlman.txt.6oWPoW" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.getnetaddress.txt.V8z8Kk" failed: Operatio n not permitted (1)
rsync: chown "/backup/backup/manager/.grepman.txt.REh4WW" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.rsyncman.txt.ho8VNM" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.tarman.txt.BkcmeS" failed: Operation not p ermitted (1)
sent 211115 bytes received 274 bytes 6710.76 bytes/sec
total size is 210263 speedup is 0.99
rsync error: some files could not be transferred (code 23) at main.c(977) [sender=2.6.9]
root@Fileserver:~#
I tried adding the no- prefix to -p (--no-p), but it still didn't work; see below:
Code:
root@Fileserver:~# rsync -av --no-p --delete /shared/fileshare/ /backup/backup
building file list ... done
./
manager/
manager/chironfs.txt
manager/cronman.txt
manager/curlftpfs.txt
manager/curlman.txt
manager/getnetaddress.txt
manager/grepman.txt
manager/rsyncman.txt
manager/tarman.txt
public/
user1/
user10/
user2/
user3/
user4/
user5/
user6/
user7/
user8/
user9/
rsync: chown "/backup/backup/manager/.chironfs.txt.6Q3eP2" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.cronman.txt.FC8Orx" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.curlftpfs.txt.mlVSN9" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.curlman.txt.vlJ4b1" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.getnetaddress.txt.LXmft0" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.grepman.txt.SVuaye" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.rsyncman.txt.KTNYqA" failed: Operation not permitted (1)
rsync: chown "/backup/backup/manager/.tarman.txt.zcU90c" failed: Operation not permitted (1)
sent 211115 bytes received 274 bytes 7686.87 bytes/sec
total size is 210263 speedup is 0.99
rsync error: some files could not be transferred (code 23) at main.c(977) [sender=2.6.9]
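For what it's worth, the errors above are chown failures, i.e. owner and group, while --no-p only drops permissions (chmod). Since -a implies both -o and -g, a sketch of dropping exactly the two options the FTP-backed filesystem cannot honor:
Code:
# -a implies -o (owner) and -g (group); it is the chown step that the
# mounted FTP directory rejects, so disable those two rather than -p
rsync -av --no-o --no-g --delete /shared/fileshare/ /backup/backup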
Hello everyone,
I recently had an issue where I lost my whole backup server: an electrical overload caused my server to literally explode and fried all 4 of my terabyte drives... needless to say, I have no more backups because of this. Everywhere I read about backups said that setting up a RAID array would let me keep good backups... boy, did I learn the hard way that I also need some sort of external backup option, which brings me to this post and my questions:
I'm using Ubuntu 14.04 LTS server on an older Dell PowerEdge 600SC, and I was thinking of using WD Passport 1TB external drives as my "offsite" backup option. I don't have a lot of data, and my current backup schedule is only a weekly backup, so I'm thinking that with two of these Passport drives I can have one drive offsite and one attached to the server, and rotate them every 4 weeks so as not to lose all my data.
Here's my question: ideally, I would love to be able to just unplug the current drive, plug in the new drive, and have everything work. I don't see this actually working, but if there's a way to do it, that would be totally awesome... ;-)
So, realistically, I know I will have to unmount the one drive, unplug it, then plug in the new drive and mount it on the system. Is there a way to have it mount at the same mount point automatically, so that I don't have to rewrite my backup script each time I swap drives and the backups keep going to the same place? Or will the UUIDs get messed up each time I do this?
Hopefully this makes sense and an easy solution can be found to accommodate this idea...
Thanks again for all your help. This site is awesome for newbies such as myself........
Mikey
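On the swap-drives question: UUIDs are baked into each filesystem and will differ between the two Passports, but a filesystem label can be set to the same value on both drives, and fstab can mount by label. A sketch, assuming the drives are formatted ext4 and show up as /dev/sdc1 (an assumption; check dmesg after plugging in):
Code:
# give both Passport drives the same label (run once per drive)
e2label /dev/sdc1 backup

# /etc/fstab entry matching whichever drive happens to be plugged in;
# noauto keeps boot from stalling when neither drive is attached
LABEL=backup  /mnt/backup  ext4  defaults,noauto  0  0

# the backup script then only ever needs:
mount /mnt/backup
With that in place the script never has to change; just remember to umount before unplugging a drive.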
This is probably very basic, but I am totally new to Linux.
I am trying to sync a USB data stick to my Documents directory using:
rsync -auvhi /media/2C24-DC8F/ ~/Documents/
I then sync the other way [rsync -auvhin ~/Documents/ /media/2C24-DC8F/] expecting little or nothing, but end up with a massive list. A typical item being: .f.....g... boats/Siskiwit-Bay-SOF.pdf
My very limited understanding makes me think that the group has changed during the first operation despite using -a, which I thought preserved the group.
Basically I am trying to keep ~/Documents and my data stick identical as I move between computers. Any suggestions?
Thanks
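The .f.....g... lines are rsync's itemized output saying the group differs, and on a FAT-formatted stick (which the 2C24-DC8F volume name suggests) that is expected: FAT stores no owner, group, or POSIX permissions, so -a's attempts to preserve them can never stick. A sketch of a FAT-friendly pair of commands, keeping your update/verbose/itemize flags:
Code:
# -rlt replaces -a's -rlptgoD, dropping the permission/owner/group bits
# FAT cannot store; --modify-window=1 absorbs FAT's 2-second timestamps
rsync -rltuvhi --modify-window=1 /media/2C24-DC8F/ ~/Documents/
rsync -rltuvhi --modify-window=1 ~/Documents/ /media/2C24-DC8F/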
Hi All,
I have Red Hat 5.3 running on my machine. I have 800 GB of data on the /dev/sdb1 partition, mounted at /data. When I reboot my machine, the data is no longer available in the /data folder. I have already used the mount command, but it's not working.
Can anyone help me figure out how to retrieve that data?
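Assuming the partition itself is intact, the data is almost certainly still there; a mount done by hand simply does not survive a reboot unless it is listed in /etc/fstab. A sketch, assuming ext3 (the RHEL 5 default) and /data as the mount point:
Code:
# confirm the partition and its filesystem type
fdisk -l
blkid /dev/sdb1

# mount it by hand for now
mount -t ext3 /dev/sdb1 /data

# then add a line like this to /etc/fstab so it mounts on every boot:
#   /dev/sdb1  /data  ext3  defaults  1 2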
I am trying to use tar in combination with find. The goal is to find all files in /export that have been modified in the last 24 hours (for backup purposes), then tar them so I can untar them on the backup server, updating just the modified files.
Perhaps there is a better way; I have tried using cpio, but the problem comes in when I copy to the NAS drive (NTFS): I lose all my owner/group information and permissions. I have found that if I tar the files, then copy the archive to the NAS, when I untar it on the server it retains the owner/group and permissions.
So… here is what I have tried:
First, I use the find command to see what files should be in the tar archive.
Code:
/export $ find . -depth -mtime 0 -print
./file4
./file3
.
OK, that looks right; now I will try to pipe that into tar:
Code:
/export $ find . -depth -mtime 0 -print0 | tar -czvf backup.tar.gz --null -T -
./file4
./file3
./
./share/
./share/pdf/
./share/pdf/penny-2014-09-03-11:41.30.pdf
./share/pdf/penny-2014-09-03-14:25.17.pdf
./share/pdf/penny-2014-09-03-11:24.36.pdf
./share/pdf/penny-2014-09-03-14:37.12.pdf
tar: ./share/pdf/.directory: Cannot open: Permission denied
./share/pdf/penny-2014-09-02-14:52.06.pdf
./share/pdf/penny-2014-09-03-12:18.43.pdf
tar: ./share/PDF: Cannot open: Permission denied
./share/file3
tar: ./share/.directory: Cannot open: Permission denied
./dir1/
./dir1/file1
./file4
./file2
./file3
tar: ./.directory: Cannot open: Permission denied
./list
tar: Exiting with failure status due to previous errors
It seems that it is trying to tar all the files in that directory. When I view the contents of backup.tar.gz, all of the files from /export are in there, not just the modified ones.
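The giveaway is the lone . in the find output above: the directory's own mtime changed when files inside it did, so find printed ., and tar recurses into every directory name it is handed, which pulls in the whole tree. A sketch of the fix, restricting find to plain files and disabling tar's recursion for good measure:
Code:
# only plain files, and no recursion into any directory names
/export $ find . -type f -mtime 0 -print0 | tar -czvf /tmp/backup.tar.gz --null --no-recursion -T -
Writing the archive outside /export (here /tmp, an arbitrary choice) also stops yesterday's backup.tar.gz from matching tomorrow's find.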
Hey guys, sorry for asking a dumb question, but some of the terminology online is not sinking in and is still leaving me confused.
in crontab -e
I have a shell script that contains
rsync /home/willc86 willc86@server02:/home/willc86/backup
The script works fine by itself; however, I have to put in a password, which I understand. I know crontab works, because I have used an rsync script on the local machine. I am just having problems trying to sync to another server.
How do I get this rsync going in crontab so it backs up to the other server?
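The usual way to let a cron job rsync to another box without typing a password is SSH public-key authentication. A sketch, assuming OpenSSH on both machines:
Code:
# on the local machine, as the user the cron job runs under:
ssh-keygen -t rsa             # accept the defaults, empty passphrase
ssh-copy-id willc86@server02  # installs the public key on server02

# after that, the cron job can run unattended, e.g.
# (-a added here, an assumption, so the directory actually recurses):
rsync -a /home/willc86 willc86@server02:/home/willc86/backup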
Hello everybody,
I am seeking help with sharing an internet connection; I would be grateful for any advice.
I have:
A Laptop: Samsung RV509, i3, 300GB HDD, 3GB RAM, dual-boot Windows 7 & LM 17 Cinnamon 32-bit. WiFi & Bluetooth available.
A Desktop PC: Celeron CPU 2.4GHz, 40GB HDD, 1GB RAM, LiveUSB LM 17 Cinnamon 32-bit. No WiFi or Bluetooth hardware available.
A mobile phone: Nokia Asha 500, WiFi & Bluetooth available.
An old USB data cable, with which the mobile phone can be connected to the computers. I have successfully connected and transferred data back and forth, and also shared the phone's mobile-broadband internet with both the laptop and the desktop.
What I have been doing so far: I have an internet connection to my laptop through an external modem and Ethernet. It's working fine. I have an unlimited plan, so I want to share this internet connection with the mobile phone too, instead of incurring extra cost by connecting to the internet directly through the phone. I have been successful at that: I can share this internet connection with my phone through WiFi.
Now, what I want to do is share this same internet connection with my desktop PC too, without buying any extra hardware.
I searched the web but didn't find any solution, so now I am here. Please kindly help me.
Thank you & Regards
Anil
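For the record, the software side of sharing a connection is plain NAT, which Linux Mint can do out of the box; the hard part here is physical, since the laptop's only Ethernet port is already taken by the modem, so some link to the desktop (a second NIC or a USB-Ethernet adapter) would normally still be needed, which may not fit the no-extra-hardware constraint. A sketch of the NAT itself, run on the laptop, with interface names as assumptions (check ip link):
Code:
# eth0 = interface with the internet, eth1 = interface facing the desktop
echo 1 > /proc/sys/net/ipv4/ip_forward
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT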