Hi,
I'm wondering if anyone can help me write a script that searches a specific folder (in this case /tmp) for files with given permissions (755) and then deletes all the other files with different permissions.
The correct permission is, as mentioned, 755, and those are the files that should be kept (not deleted).
All other files in this folder with different permissions should be deleted.
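Something like this one-liner is what I'm picturing, if that helps show what I mean (untested; assumes GNU find with -delete, and that only the top level of /tmp should be touched):
Code:
find /tmp -maxdepth 1 -type f ! -perm 755 -delete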
Thanks!
I would like to tar up some system logs along with core files as a normal user, through a script. However, these core files are owned by root.
Of course, I get "cannot read directory ...: Permission denied".
Is there a good way to grab these core files through a script? From searching around, I gather that the best way is to modify the sudoers file and then run the script with sudo. Is this considered the safest and best way?
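For reference, the sudoers setup I keep seeing described looks roughly like this (the user name and script path are placeholders; it would be edited with visudo):
Code:
# /etc/sudoers.d/corelogs -- let one user run one script as root
backupuser ALL=(root) NOPASSWD: /usr/local/bin/collect_cores.sh
The script would then be invoked as sudo /usr/local/bin/collect_cores.sh and do the tar as root.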
Thanks!
Directory /media/data/torrents/ has permissions 775, user yzt, group transmission
yzt and debian-transmission are members of the group transmission.
transmission-daemon is run by debian-transmission, and the new files it downloads have permissions 644, owner debian-transmission, group transmission. This is a problem, because I can't later move the files as my user, yzt, and have to switch to root to change the permissions/ownership before I can do so.
Using the sticky bit I could copy them anywhere else, but I'm interested in actually moving the files, not just copying them. I could run transmission-daemon as yzt and the problem would be solved, but I'd rather have that internet-facing service run by a limited user, just in case some vulnerability is found in Transmission.
So my question is: how can I make every new file created under /media/data/torrents/ get permissions 775, like its parent directory?
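For context, the closest I've come so far is combining the setgid bit with Transmission's own umask setting (the settings.json path may differ on Debian, and from what I've read the daemon must be stopped before editing it, or it rewrites the file on exit):
Code:
chmod g+s /media/data/torrents   # new files/dirs inherit group transmission
# and in settings.json:
#   "umask": 2,                  # files 664 and dirs 775 instead of 644/755
From what I understand, a umask can't add the execute bit to regular files, so they'd come out 664 rather than 775, but group write on the files and the directory should be enough for yzt to move them. Is that right?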
I'm trying to figure out if find can do this. I have a folder with 1000 files, and I want to delete 150 of them regardless of timestamp and filename. If there is a tool, command, or option of find that can do this, please let me know.
Combining mtime or ctime with find is not advisable, since it won't count the files; even if there were matches, I would still need to add up files until I reached 150.
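To make it concrete, this is the sort of thing I'm imagining (untested; -print0, head -z, and xargs -0 are GNU extensions, used so odd filenames don't break the pipeline):
Code:
find /path/to/folder -maxdepth 1 -type f -print0 | head -z -n 150 | xargs -0 rm --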
Any suggestions?
I was working on my PC and by mistake I deleted a folder that contained a lot of mp3 files (songs) that I don't want to lose. It was in /home. What do I do now to get those files back, with the subfolders as well? I watched videos on YouTube, but they were very confusing. Please help. _/\_
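The videos seem to boil down to something like the following (photorec comes in the testdisk package; /dev/sdXN is a placeholder for the partition that held /home, and as I understand it, nothing should be written to that disk in the meantime):
Code:
sudo apt-get install testdisk
sudo photorec /dev/sdXN
Is that the right approach?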
Hi everyone, I need a shell script that checks the size of a directory and, if it gets greater than 1 GB, deletes all the files in it.
ex: dir: /var/log/
/var/log contains numerous .log files, so the script should always delete the .log files when the size of /var/log is greater than 1 GB; it should not delete the directory itself.
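A rough sketch of what I have so far, in case it helps (untested; 1 GB taken as 1048576 KB, and only top-level .log files are removed):
Code:
#!/bin/bash
# delete *.log in /var/log when the directory's total size exceeds 1 GB
dir=/var/log
size_kb=$(du -sk "$dir" | awk '{print $1}')
if [ "$size_kb" -gt 1048576 ]; then
    find "$dir" -maxdepth 1 -type f -name '*.log' -delete
fi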
I'm running Xubuntu and it was a challenge just getting Copy.com on there. (I installed the desktop app on both of my computers.) Now that I have it though, I don't really know how to use it.
I know this is kind of more a Copy.com question, but I don't know anything about Copy.com (besides having it--lol) and besides, I like you LQ guys.
So yeah, I installed the desktop app for Copy.com on both of my computers. I know that if I put something in the Copy folder, it will be available to both computers.
But how Copy does the backing up I don't know.
When I change a file or folder do I have to plop that into the Copy folder every time or does Copy somehow update the file or folder in the Copy folder automatically? (It doesn't seem to.)
Okay, when I, say, take the Documents folder from one computer and plop it into the Copy folder that's that. Then I take the Documents folder from the other computer and plop that into the Copy folder, then all the files from both folders will be in the Copy folder (and the Copy cloud), right?
Now I just removed a couple of files from a folder and copied and pasted the folder into the Copy folder. But then when I looked at the Copy folder the files I'd deleted were still there. What's the process? How does it work?
I mean, how does this work as a way of backing things up AND organizing things? To me it seems like a decent way of throwing stuff into the Copy folder (and cloud), but how is that different than Google Drive? I mean, that's not really a backup, is it? It's like a flash drive in the cloud.
And when I combined the same folders (with the same titles anyway, but each with different files inside) from the two computers, I'd expected each folder on each computer to end up with all the files that were cumulatively on both. Instead, they stayed the same, and the cumulative set is only in the Copy folder.
I like the notion of just throwing the folders and files into the Copy folder. It's much quicker than Google Drive. But the backing up feature eludes me and the syncing feature makes me fearful that I'll lose data or that the files will become hopelessly less organized.
Thanks.
Hello everyone! I'm somewhat new to Linux, and getting my feet wet by building my first Linux server.
So what I have is an application that moves/sorts files, and another program that catalogs them.
The problem is that each app uses its own user. So my question is: is there any way that files owned by prog1user can be read by prog2user?
I have tried doing a chmod -R 755 Directory, and that has allowed the second program to see the files, but I'm guessing this has certain security risks (although I'm not so worried about the files in this directory).
Anyway, I was wondering if there is a proper way to do this? The OS is Debian Wheezy.
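The closest thing I've found so far is putting both users in a shared group, roughly like this (the group name and path are placeholders; run as root):
Code:
groupadd mediashare
usermod -aG mediashare prog1user
usermod -aG mediashare prog2user
chgrp -R mediashare /path/to/Directory
chmod -R g+rX /path/to/Directory   # group can read files and enter dirs
chmod g+s /path/to/Directory       # new files inherit the group
Is that the proper way, or is there something better?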
Cheers!
Hi
I have a folder:
Code:
/usr/local/src/myfolder
and in there I have a few folders and files...
Now I want to delete from this folder the files named:
Code:
file1.txt
image.jpg
info.html
another.txt
and leave all the rest of the folders and files...
What is the correct syntax for this?
If possible, without using cd /usr/local/src/myfolder and then rm -r ..., so I can run it from anywhere...
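Is it something like this, with bash brace expansion and full paths so it can run from anywhere?
Code:
rm /usr/local/src/myfolder/{file1.txt,image.jpg,info.html,another.txt}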
Thanks
I want to create a folder with the same permissions as an existing folder. I tried cp -r -p, which does copy the permissions, but it also copies everything, including sub-directories and files, to the new folder. Would you advise how to create a folder that takes the existing folder's permissions without copying all its files? Thanks
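The closest I've gotten is something like this (untested; --reference is a GNU coreutils option, and the paths are placeholders):
Code:
mkdir /path/to/newfolder
chmod --reference=/path/to/existingfolder /path/to/newfolder
chown --reference=/path/to/existingfolder /path/to/newfolder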
I am writing a script to check the log files in /var/log. Some files are readable only by root, but the script accesses the system as a general user (the system does not accept direct root ssh logins), so it cannot read those files.
Would you advise the best method to solve this problem: copy the files to a special folder, allow root to ssh, or create a user with root permissions?
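One option I'm considering instead is a narrow sudoers entry that only allows reading specific logs (the user name and log path are placeholders; it would be edited with visudo):
Code:
# /etc/sudoers.d/logcheck
checkuser ALL=(root) NOPASSWD: /bin/cat /var/log/messages
The script would then run something like ssh checkuser@host "sudo cat /var/log/messages". Would that be safer than the other options?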
Many thanks