Hello everyone,
I need to copy many files into many folders at once. For example, I have files result_1, result_2, and result_3 and want to copy them into directories 1, 2, and 3 respectively. How can I do that? I have tried cp, echo, and find syntax, but it did not work out.
Any idea?
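A simple shell loop handles this; a minimal sketch, assuming the directories 1, 2, and 3 already exist under the current directory:
Code:
# copy each result_N file into the matching directory N
for i in 1 2 3; do
    cp "result_$i" "$i/"
done
Adding mkdir -p "$i" before the cp would create each directory first if it doesn't exist yet.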
I'm running Xubuntu and it was a challenge just getting Copy.com on there. (I installed the desktop app on both of my computers.) Now that I have it though, I don't really know how to use it.
I know this is kind of more a Copy.com question, but I don't know anything about Copy.com (besides having it--lol) and besides, I like you LQ guys.
So yeah, I installed the desktop app for Copy.com on both of my computers. I know that if I put something in the Copy folder, it will be available to both computers.
But I don't know how Copy does the backing up.
When I change a file or folder do I have to plop that into the Copy folder every time or does Copy somehow update the file or folder in the Copy folder automatically? (It doesn't seem to.)
Okay, when I, say, take the Documents folder from one computer and plop it into the Copy folder that's that. Then I take the Documents folder from the other computer and plop that into the Copy folder, then all the files from both folders will be in the Copy folder (and the Copy cloud), right?
Now I just removed a couple of files from a folder and copied and pasted the folder into the Copy folder. But then when I looked at the Copy folder the files I'd deleted were still there. What's the process? How does it work?
I mean, how does this work as a way of backing things up AND organizing things? To me it seems like a decent way of throwing stuff into the Copy folder (and cloud), but how is that different than Google Drive? I mean, that's not really a backup, is it? It's like a flash drive in the cloud.
And when I combined the same folders from the two computers (same names, but each had different files in them), I expected each folder on each computer to end up with all the files from both. Instead, they stayed the same, and the cumulative set is only in the Copy folder.
I like the notion of just throwing the folders and files into the Copy folder. It's much quicker than Google Drive. But the backing-up feature eludes me, and the syncing feature makes me fearful that I'll lose data or that the files will become hopelessly disorganized.
Thanks.
Hello all.
I have 100 sub-directories with "rar" files in them. How can I move all the rar files into the parent directory?
I used "find sourcedir -type f -exec mv {} targetdir \;", but I ended up with just one file in the target directory and all the other files were gone.
Thank you.
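With mv, files that share a name silently overwrite one another in the target, which would explain ending up with a single file. A sketch that moves only the rar files and refuses to overwrite, with sourcedir and targetdir as placeholder paths:
Code:
# -name limits the move to rar files; -n tells mv never to
# overwrite a file that already exists in the target
find sourcedir -type f -name '*.rar' -exec mv -n {} targetdir/ \;
GNU mv also has --backup=numbered, which keeps every clashing file under a numbered suffix instead of skipping it.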
Hi all,
I know about timestamps and I'm trying to use them in a particular scenario. I have multiple folders, each containing different files. Now I'm trying to copy one file (say xyz), which is present in all the folders but varies in content and creation time, into a new folder (say foldernew).
I'm trying to do this by copying the file xyz from each folder into foldernew under the new name xyz_(its original timestamp).
Can this be done with a single command, or what should I write in a script to do it?
Note: I want to use the timestamp from when xyz was created, not the time of the copy.
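A loop over the folders can do this; a minimal sketch, assuming the folders sit one level below the current directory and foldernew already exists. Note that most Linux filesystems don't expose a creation (birth) time, so this falls back on the modification time from stat -c %Y (stat -c %W reports the birth time on filesystems that do support it):
Code:
for f in */xyz; do
    # format the file's timestamp as YYYYmmdd_HHMMSS
    ts=$(date -d @"$(stat -c %Y "$f")" +%Y%m%d_%H%M%S)
    cp -p "$f" "foldernew/xyz_${ts}"
done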
Does anyone know a way to copy two files to multiple computers? I'm thinking of scp as the flavor of linux we're using does not include rdist.
I've read that scp can't copy multiple files; however, maybe some scripting genius has figured out a way. Running two scripts (one for each file) is perfectly OK!
If anyone cares to post very clear examples (I'm definitely not a programmer...) of scripts, etc., that would be great.
Thanks in advance to all those who can help!
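For what it's worth, scp does accept several source files in one call, so no real scripting genius is needed. A sketch, with the host names, user, and paths as placeholder values:
Code:
#!/bin/sh
# copy both files to each machine in turn
for host in host1 host2 host3; do
    scp file1 file2 "user@$host:/destination/path/"
done
With SSH keys set up, you won't be prompted for a password on each host.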
Hello all,
I was wondering if there was a way to automatically copy files from an SD card to a DVD burner. I would prefer a differential copy as well.
Any ideas?
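One possible approach, assuming the dvd+rw-tools package is installed and the card mounts at /media/sdcard (both placeholder assumptions): stage the card with rsync, which only copies new or changed files, then burn the staging directory with growisofs.
Code:
# pull only new/changed files from the card into a staging directory
rsync -av /media/sdcard/ ~/dvd-staging/
# -Z burns an initial session; -M appends a session on later runs
growisofs -Z /dev/dvd -R -J ~/dvd-staging/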
Hi all
I got two questions about rsync:
1. I am copying a huge directory with lots of files. How do I see the overall progress? I put in the --progress flag, but it shows the progress of each individual file.
2. Is rsync slower than cp? I used cp to copy over an NFS mount and that took about an hour. I am using rsync to copy over the same NFS mount, and so far it has taken 10 minutes to copy 7G. I have over 150G of stuff.
I just used rsync -avh --progress ...
Thanks
Davy
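On the first question: rsync 3.1 and later accept --info=progress2, which reports one overall percentage for the whole transfer instead of per-file progress. A sketch, assuming a recent rsync:
Code:
# overall transfer progress rather than per-file progress
rsync -avh --info=progress2 /source/dir/ /destination/dir/
On the second: rsync's protocol and bookkeeping overhead generally make a first-time copy slower than plain cp; it pays off on repeat runs, when it skips everything that hasn't changed.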
I want to create a folder with the same permissions as an existing folder. I tried cp -r -p, which does copy the permissions, but it also copies everything, including sub-directories and files, into the new folder. Could you advise how to create a folder that takes the permissions without copying all the files into it? Thanks.
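One way, assuming GNU coreutils: create the new folder empty, then copy just the mode and ownership from the existing one using the --reference options.
Code:
mkdir newfolder
# take permissions and ownership from the existing folder,
# without copying any of its contents
chmod --reference=existingfolder newfolder
chown --reference=existingfolder newfolder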
gold finger was kind enough to share this with me a while ago:
Quote:
Do backups to either another HDD, partition, or a USB stick (if big enough to hold your data). You can use the program to make an initial backup of /home/gregory; then use it to periodically update that backup by having it sync between your installed Xubuntu /home/gregory and the backup copy. The sync function will just copy over things that are new or changed, rather than copying everything all over again.
Assuming your Xubuntu filesystem is Ext4, example of doing initial backup would be something like this:
* Spare USB with large partition formatted as Ext4 and labeled "BACKUPS"
* Open luckybackup and choose "Backup" function
* "Source" = /home/gregory
* "Destination" = /media/gregory/BACKUPS (might be under /media/BACKUPS)
* Check box to not create new directories (it will just do exact copy of source)
After initial backup, either make a new task for syncing, or modify the backup task to turn it into a syncing task instead. Then use that periodically to update the backed-up /home/gregory.
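Since luckybackup is a graphical front-end to rsync, the sync task described above corresponds roughly to a command like this (paths taken from the quote; --delete is what makes the destination mirror the source):
Code:
rsync -av --delete /home/gregory/ /media/gregory/BACKUPS/gregory/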
I've downloaded luckybackup and have been experimenting with it, but I'm still not sure of the best way to go about using it as a backup.
Like in gold finger's advice, why would I check the box to not create new directories? It seems to me that doing it without checking the box re-creates things just the way they are on my computer, while checking the box just takes everything out of the folders. Seems confusing (and unnecessary). And I have a really hard time finding the errors after a run, and when I do find them, I don't know what they mean.
So if I back up the source to the destination, it makes an exact copy on my destination drive (with folders if I don't check the box, without if I do). Then if I do that as an ongoing thing, I will be backing up all my data with each run (which I'm assuming would be much more time consuming), whereas if I choose 'synchronize source and destination' it will only back up the changes between my source and my USB drive (which would be my destination drive)?
Is that the idea?
And I noticed that Lucky did not want to transfer things with colons in their names. Googling around, somebody said that problem would be taken care of by switching to Ext3 or Ext4 formatting for the destination drive (as gold finger suggested). Is this a good idea? (I've always felt comfortable with FAT because, if I needed to, I could plug my flash drive into a Windows machine and it would work, as well as with Linux.)
So the first time I use Lucky, I choose "backup source inside destination" and, of course, the source and destination. Should I check the "Do NOT create extra directory" box? (Again, that seems off, as 95% of what I'll be backing up is in folders.)
Then after I've done that, I choose the sync option?
A lot of stuff. I know. Thanks.
PS. As a slight complication I have the data (basically the "home" folder) of my two computers (work and home) synced via Copy.com.
Hi,
I have a tar.gz file which, when I right-click and choose "Extract Here" on my Ubuntu computer, extracts without issue. I have all the folders and all the files (links and one executable) in them.
But when I transfer this tar.gz to another system and use the command:
Code:
tar zxvf nameofthefile.tar.gz
I see things like this in the terminal:
Code:
usr/bin/unzip/lzop
usr/bin/xargs
bin/mountpoin
usr/bin/telnet
bin/pipe_prog
usr/bin/lzma
bin
usr/bin/sort
which means the files must have been extracted, but when using "ls" in the folders, they are empty. None of the links are there and only the executable was extracted.
When I copy the archive back from the new system to my Ubuntu system and extract it, all the files are there without issue. Why is it only creating the folders and not extracting the links?
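One way to check whether the links were stored in the archive at all is to list its contents verbosely and keep only the entries whose mode string starts with "l", which marks a symbolic link:
Code:
tar ztvf nameofthefile.tar.gz | grep '^l'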
hello all,
Please help me with Python ftplib. I was trying to copy files from my Linux machine to a Windows server using ftplib. Everything was working well, but I'm only able to copy files from the same directory the script is in. How do I copy files from a different directory? I always get a "file not found" error message. Here's my code:
Code:
import os
import socket
import ftplib

tester_name = str(socket.gethostname())

def upload(ftp, path):
    # open the file by its full path, but store it on the server
    # under its bare name
    name = os.path.basename(path)
    ext = os.path.splitext(path)[1]
    if ext in (".txt", ".htm", ".html"):
        ftp.storlines("STOR " + name, open(path))
    else:
        ftp.storbinary("STOR " + name, open(path, "rb"), 1024)

parse_source_path = '/path/to/where/i/go/'
parse_source_file_list = os.listdir(parse_source_path)

ftp = ftplib.FTP("server_IP")
ftp.login("username", "pass")

folder_list = []
ftp.dir(folder_list.append)

if str(tester_name) not in str(folder_list):
    ftp.mkd(tester_name)
    ftp.cwd(tester_name)
    for files in parse_source_file_list:
        print files
        # join the directory and file name; passing the bare name to
        # open() made Python look in the script's working directory,
        # which caused the "file not found" error
        upload(ftp, os.path.join(parse_source_path, files))
else:
    print "later"