Script To Scan /home For Folders With 777 Permission

Hello

How can I prepare a script that will scan /home for folders with 777 permissions and then set 755 permissions on those folders?

The command

find /home/ -type d -perm 777

will scan /home for folders with 777 permissions, but now I need to set 755 permissions on those folders.
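
Something along these lines is what I have in mind, but I am not sure whether chaining chmod onto find like this is the right approach:

Code:
#!/bin/bash
# Find directories under /home with exactly 777 permissions and reset them
# to 755; the "{} +" form passes many directories to a single chmod call.
find /home/ -type d -perm 777 -exec chmod 755 {} +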

Please help in resolving this issue.

Thank you

Regards!
Jeff80


Similar Content



Hidden Folders And Files Become Viewable In Home Directory

Hi guys,

Without any apparent action on my part, hidden folders and files have appeared
in my /user/home directory; they are as follows:

Folders:
.adobe .cache .config .cups .filezilla .gimp-2.8 .gnupg .gphoto .gstreamer-0.10 .icedtea .java .local .macromedia .mozilla .pki .thumbnails

Files:
.bash_history .bashrc .esd_auth .ICEauthority

In my / directory
File: ./readahead

I'm seeking help to verify that the above folders and files are not from a harmful source or application.

If they do not pose any threat to the system, how can I conceal
these folders and files so that they don't show up any more in
my home and / directories?
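
For reference, a quick sketch of how dot-prefixed names are normally treated on the command line (most graphical file managers have a "show hidden files" toggle, often Ctrl+H, that controls the same thing):

Code:
ls ~        # plain ls does not list .bashrc, .mozilla and the other dot entries
ls -a ~     # the -a flag is what makes dot-files and dot-folders visible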

Many thanks.

Rsync Copy Permission Denied

hi experts

I am rsyncing a user's home dir between an NFS share and the local PC, but when it tries to copy over the hidden files it fails with permission denied. Both directories are owned by the proper user and I am root when I execute the script, so I am not sure what went wrong here.
For example: this is the content and permissions of the source:

-rw------- 1 user test 115 Nov 14 11:28 .bash_history

and here is my error:

rsync: send_files failed to open "/home/user/.bash_history": Permission denied (13)
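
A sketch of a quick check (assuming /home/user here is on the NFS-mounted side; adjust the path if not): if root is denied reading the file directly as well, root squashing on that NFS mount is a more likely culprit than rsync itself.

Code:
# Run as root on the machine where rsync reports the error
cat /home/user/.bash_history > /dev/null && echo readable || echo denied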

Thanks

I Cannot Copy With Original Timestamp! PLEASE HELP

Hi all,
I have some knowledge about timestamps and I'm trying to use it in a particular scenario. I have multiple folders, inside which are different files. Now I'm trying to copy one file (say xyz), which is present in all the folders but varies in its content and time of creation, into, let's say, foldernew.
I'm trying to do this by copying the file xyz from each folder into foldernew under the new name xyz_(its original timestamp).
Can this be done with a single command, or what should I write in a script to do it?

Note: I want to use the timestamp from when xyz was created, not the time of the copy.
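
A minimal sketch of one way this could look, assuming the source folders all sit under the current directory and the target is ./foldernew (note that date -r reports the file's last modification time; plain Linux tools don't readily expose a true creation time):

Code:
#!/bin/bash
# Copy each folder's copy of xyz into foldernew, suffixed with its own timestamp.
mkdir -p foldernew
for f in */xyz; do
    ts=$(date -r "$f" +%Y%m%d_%H%M%S)   # timestamp taken from the file itself
    cp -p "$f" "foldernew/xyz_${ts}"    # -p keeps the original timestamp on the copy
done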

How Do I Untar A Tar.gz File To Multiple Directories?

Hey everyone,

I am trying to find a way to untar a file.tar.gz in my /home/noob directory into 15 different directories.

where the tar.gz file is located: /home/noob
where the 15 directories are located: /home/noob/Staging/ (folders 1-15)

Without having to extract the tarball into each directory by hand, is there a simple way, in one or two commands, to take the tarball and extract it to all 15 folders?
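
A minimal sketch, assuming the target folders are /home/noob/Staging/1 through /home/noob/Staging/15 and the same archive really does need a separate copy in each one:

Code:
#!/bin/bash
# Extract the same tarball into each of the 15 staging folders.
for i in $(seq 1 15); do
    tar -xzf /home/noob/file.tar.gz -C "/home/noob/Staging/$i"
done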

Please and thanks!

Sharing Folders And Mounting Shares With SetGID / Samba

OK, this is kinda long, so I will shorten it as much as I can so as not to be long-winded.

My current network at home:
1 - CentOS 7 desktop (server)
1 - Ubuntu 14.04 desktop
1 - Fedora 21 laptop
2 - Windows 7 desktops
plus various other Windows boxes that don't get used regularly but are on the network.

My 2 Linux desktops (which I use as servers, but they really aren't) have shared folders on them, which I share to the network via Samba (CIFS). I use Samba because Linux is smarter than Windows and Windows won't read NFS, so I share them as Samba so all devices can see them.

Generally speaking, if I share the folders on each box as 0777, I have no issues. But lately I have been wanting to implement some better security, so I wanted to SETGID and chown the shared folders from the local machine to a specific group, then change the folders to 2774.

My problem is that I keep getting permission errors when trying to connect from the other Linux machines, and sometimes the Windows machines also. My main question is: do I CHMOD 2774 the local mount-point before mounting it? Or do I CHMOD 2774 the shared folder on the other server, then mount it locally to a folder whose permissions are different? Or do I CHMOD both of them the same?

Basically, the UID and GID ownerships change on a local folder when I mount a shared drive to that folder, so when I try to write to (or sometimes read) that local folder, I get permission errors.
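
For reference, a minimal sketch of the server-side setup being described; the group name and path are hypothetical, and the leading 2 in the mode is the setgid bit that makes new files inherit the directory's group:

Code:
groupadd sharegroup                  # hypothetical group the sharing users belong to
chown root:sharegroup /srv/shared    # hypothetical shared folder on the serving box
chmod 2774 /srv/shared               # 2 = setgid, 774 = rwx for owner/group, r for others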

I can provide any additional info needed.

Setfacl Help

I can't believe I wrote a looong message and it logged me out when I tried to submit it.

So anyway, in short lines:

- I have a network of sites where all sites share same "images" folder
- I have created /home/_images/entities and symlinked it from all websites
- It works great with Apache; when I open /images/ on any of the sites I get a list of images and can view them

The problem is suPHP which changes process ID of the PHP script to the file owner ID, so when I load site1.com, all scripts are executed as user1 (and files/folders created with those scripts belong to user1:user1). When I load site2.com, all scripts are executed as user2 (and files/folders created with those scripts belong to user2:user2). All these users do NOT belong to the same group, and I wouldn't like to change that as it is cPanel/WHM server so I'm afraid I'll screw something up if I change (primary?) group of all users.

Therefore I need to set it up in such way that all newly created folders and files under /home/_images/entities (owned by root) have read/write permissions for everyone.

Here's the command I used:

Code:
setfacl -Rdm o::rwx /home/_images/entities

To check it:
Code:
root@server1 [~]# getfacl /home/_images/entities/
getfacl: Removing leading '/' from absolute path names
# file: home/_images/entities/
# owner: root
# group: root
user::rwx
group::rwx
other::rwx
default:user::rwx
default:group::rwx
default:other::rwx

This looks fine; however, when I try to upload an image via site1.com it looks like this:

Code:
root@server1 [/home/_images/entities]# ls -l
total 24
drwxrwxrwx+ 5 root    root    4096 Jan 14 06:25 ./
drwxrwxrwx  5 root    root    4096 Jan 12 13:08 ../
drwxrwxr-x+ 3 user1   user1   4096 Jan 14 06:25 1/

And in folder "1" is the image (and thumbs folder):

Code:
root@server1 [/home/_images/entities/1]# ls -l
total 236
drwxrwxr-x+ 3 user1   user1     4096 Jan 14 06:25 ./
drwxrwxrwx+ 5 root    root      4096 Jan 14 06:25 ../
-rw-rw-rw-  1 user1   user1   225569 Jan 14 06:25 689048f221ab7c556f4d482a9d92b2d6.jpg
drwxrwxr-x+ 2 user1   user1   4096 Jan 14 06:25 thumbs/

My questions:

1) Why do newly created folders not have "write" permission for everyone else [not user and/or group]? If I upload the first image from site1.com, then I can't upload other images from any other site, while all sites can display them.

2) What is the + at the end of the permissions list? (drwxrwxr-x+)

3) Why do newly created files have only "rw" permissions for user, group AND everyone else, and no execute permission? I don't actually need the execute flag set here, but from my command you can see I've set "o::rwx", so it should be there (or not?)

Actually the real problem is #1: other users can't write to this folder, so images can't be uploaded from other sites, nor can other sites create (missing) thumbnails.
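
One check that might help narrow down #1 and #2 (the trailing + in the ls output simply means an extended ACL is present on that entry): inspect the ACL that actually ended up on the subfolder PHP created, and compare its entries with the parent's default entries.

Code:
getfacl /home/_images/entities/1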

Mint KDE Installation And Lost Files

Some weeks ago I installed Linux Mint 17.1 Cinnamon. Yesterday I installed 17.1 KDE. Now I have a home directory with an empty set of user folders. I did find all of my files under Devices > 129.0 GiB Hard Drive, including the old Home folder. How do I get my old files back to where I can use them?

Inetd.conf Never Starts The Ftp Daemon

Hi,

I am trying to run an FTP daemon on my board (sbc6000x), but it seems that the ftpd service is never started by inetd.

My inetd.conf is :

Code:
#<service_name>	<sock_type>	<proto>	<flags> <user> 	<server_path> 	<args> 

ftp			stream		tcp		nowait	root	/usr/bin/ftpd ftpd -w
telnet			stream		tcp		nowait	root    /usr/sbin/telnetd

Telnet doesn't run either, but that is because I am missing some folders; I can launch it manually without issue after creating the folders. But maybe the fact that it can't run telnet because of the missing folders makes it stop trying the ftp daemon?

I checked with the "ps" command; the process is not running hidden anywhere.
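
A couple of checks that might help, assuming ps and netstat are available on the board:

Code:
ps | grep inetd              # is inetd itself running?
netstat -tln | grep ':21'    # is anything listening on the FTP control port?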

Tar Is Not Extracting All Files

Hi,

I have a tar.gz file which, when I right-click and choose "Extract Here" on my Ubuntu computer, is extracted without issue. I have all the folders and all the files (links and one executable) in them.
But when I transfer this tar.gz to another system and use the command:
Code:
tar zxvf nameofthefile.tar.gz

I see in the terminal things like that :
Code:
usr/bin/unzip/lzop        
usr/bin/xargsbin/mountpoin
usr/bin/telnet
bin/pipe_prog
usr/bin/lzma        
bin
usr/bin/sort

which means that the files must be extracted, but when using "ls" inside the folders they are empty. None of the links are there and only the executable is extracted.

When I copy the archive back from the new system to my Ubuntu system and extract it, all the files are there without issue. Why is it only creating the folders and not extracting the links?
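
A sketch of a quick check on the target system: listing the archive there confirms whether the links are present in the archive itself (symlink entries start with an "l"), and checking the tar version shows which implementation is doing the extracting, in case it is a more limited one.

Code:
tar ztvf nameofthefile.tar.gz | grep '^l' | head   # list just the symlink entries
tar --version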

Duplicate Folder Creation While Using Mkdir In A Script

Hello,

I am setting up a Linux server for gaming and I am using a script to update the files automatically and create a folder with a certain name.

Code:
#!/bin/bash

# A convenience function, to save us some work
update_server() {
	# Read the app id and the directory into a variable

	APP_ID=$1
	DIR=$2

	# Create the directory ( if it does not exist already )
	if [ ! -d "$HOME/$DIR" ]; then
		mkdir -p "$HOME/$DIR"
	fi

	# Uh-oh, it looks like we still have no directory. Report an error.
	if [ ! -d "$HOME/$DIR" ]; then
		# Describe what went wrong
		echo "ERROR! Cannot create directory $HOME/$DIR!"

		# Exit with status code 1 ( which indicates an error )
		exit 1
	fi

	# Call SteamCMD with the app ID we provided and tell it to install
	./bin/steamcmd.sh +login anonymous +force_install_dir "$HOME/$DIR" +app_update $APP_ID validate +quit
}

# Now the script actually runs update_server ( which we just declared above ) with the id of the application ( 4020 is Garry's Mod ) and the name of the directory we want the server to be hosted from:

update_server 4020 "gmodserver"

exit 0

When I run this script, it creates 2 folders on my server: gmodserver and gmodserver?. There are no files downloaded in gmodserver. All the files are downloaded in gmodserver?.

I have looked for a few hours at how to solve this problem, but I have no idea what the added ? might be, so I am lost as to what to look for. Could you help me figure this out?
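
One assumption worth checking, since nothing in the script itself explains the ?: if the script file was ever edited on Windows, a trailing carriage return (CRLF line ending) on the update_server line would make the argument "gmodserver" plus a carriage return, which ls displays as gmodserver?. The file name below is hypothetical:

Code:
cat -A update.sh | grep gmodserver   # a trailing ^M before the $ means CRLF line endings
sed -i 's/\r$//' update.sh           # strip carriage returns if they are present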

Thank you.

edit: I am using Ubuntu 15.04 x64 if it makes a difference.