Cp -r *.* /<dest Folder>: What Does *.* Literally Mean In Linux?

I have been using this command a lot lately:
Code:
sudo cp -r *.*  /<dest folder owned by root>

Goal: to copy all the contents of the current folder into the destination folder.

I know there are other ways, and have tried several, but gave up in confusion (I was also in a hurry to get on with it). Finally I gave the Windows-style *.* a shot, and it did exactly what I wanted.

Now that my task is completed, it would be nice to know what *.* literally means in Linux. I just looked through man cp, and also ran Code:
info coreutils 'cp invocation'

but didn't see any '*' option listed. I remember having come across '.' somewhere in the cp command before, so possibly the '*' characters are ignored, and the only thing that matters is the 'dot'?
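
One quick way to see what actually reaches cp is to let the shell print the expansion first, since the globbing is done by the shell before cp ever runs. For example (the file names here are made up, just to illustrate):

Code:
echo *.*    # expands only to names that contain a dot, e.g. notes.txt photo.jpg
echo *      # expands to every non-hidden name in the folder, dot or not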


Similar Content



Find 30 Days Old And Delete: Prints Error Msg 'File Not Found' After Deleting It

I have a shell script to find folders which are 25 days old and delete them, and put the deleted folder details into a log file, like this

Code:
 find /ahome/xxx/$FOLDER -type d -mtime +25  -exec ls -ld {} \;  -exec rm -rf {} \;  >> mylogfile.log

After running this command it deletes the folders and logs the deleted folders, but it also prints an error msg
Code:
find: /ahome/prksh/dir/test: No such file or directory

How can I suppress the error msg?
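
For reference, a rough sketch of one common workaround, on the assumption that the message appears because find tries to descend into a directory it has already removed: stop find from recursing into the matched folders with -prune (or simply discard stderr):

Code:
find /ahome/xxx/$FOLDER -type d -mtime +25 -prune -exec ls -ld {} \; -exec rm -rf {} \; >> mylogfile.log 2>/dev/null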

How To Run Shell Script In Any Folder Except The Folder Which Contains The Shell Script

I am using a shell script named test.sh, for example containing
Code:
address="$PWD"
echo "$address"

If I put it in folder temp1 and run test.sh, it gives me the address of the current directory. But if I am now in folder temp2 and want to run test.sh, I always need to copy test.sh to folder temp2 first and then run it. Is there a way I can run test.sh without copying it? I am not a root user.
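
A minimal sketch of how this is usually handled (temp1 and temp2 stand for the real paths, which I'm only guessing at): call the script by its path from wherever you happen to be, or add its folder to PATH for the session; $PWD will still reflect the directory you call it from.

Code:
cd /path/to/temp2              # the folder you actually want $PWD to show
/path/to/temp1/test.sh         # run the script by its full path (needs the execute bit: chmod u+x test.sh)
# or, for the rest of the session:
export PATH="$PATH:/path/to/temp1"
test.sh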

Is It Good To Have Find Command Running In Cron Once Every Day To Delete Older Files

I need to clean up some folders. I have a cron job which uses the find command to find and delete files older than 25 days.

Code:
find $FOLDER_PATH/$FOLDER -depth -regex  ^$FOLDER_PATH/$FOLDER/[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$ -type d -mtime +25  -exec ls -ld {} \;   >> /tmp/deleted_folders.log

When this is running it occupies 1-3% of the CPU, and it may take a long time depending on the folder size.

Is it OK to have the find command running as a cron job?
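
Purely as an illustration of one way to keep the impact down (the 02:30 schedule and the /data/archive path are invented for the example, and ionice -c3 assumes the ionice utility from util-linux is available): the job can be wrapped in nice/ionice in the crontab so it yields CPU and disk to everything else:

Code:
30 2 * * * ionice -c3 nice -n 19 /usr/bin/find /data/archive -depth -type d -mtime +25 -exec ls -ld {} \; >> /tmp/deleted_folders.log 2>&1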

Why Should I Always Use Chmod When Not Running As Root

System Info:

I have a normal user in CentOS 7 whose name is "mostafa" (the name of the account).

I naturally also have the root user with all privileges. User "mostafa" is in the sudoers file, too.

The OS is installed in VMware, so the system is all mine.

Problem:

Now I create a file with touch file.sh and put a command in it, but when I want to run it with Code:
sudo ./file.sh

, an error is shown that the command Code:
./file.sh

does not exist. But if I Code:
 sudo chmod 777 ./file.sh

then it runs. My question is: why should I use Code:
chmod 777

when I myself have created the file, and I am in sudoers.

Can anyone explain to me why I should still use Code:
sudo chmod 777

when the creator of the file is me?
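
As an illustration only (file.sh is the example name from above): touch creates the file without any execute bit, which is why ./file.sh is refused until the permission is added, and something much narrower than 777 is enough:

Code:
touch file.sh
ls -l file.sh        # typically -rw-rw-r-- or -rw-r--r-- : no "x" bit anywhere
chmod u+x file.sh    # give only the owner execute permission
./file.sh            # now runs
sh file.sh           # alternative: no permission change needed at all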

Syntax Help To Delete Some Files From A Directory On Centos

Hi

I have a folder :

Code:
/usr/local/src/myfolder

and I have a few folders and files in there ...

Now I want to delete from this folder the files named:

Code:
file1.txt 
image.jpg
info.html
another.txt

and leave all the rest of the folders and files...

What is the correct syntax for this?

If it is possible, without using cd /usr/local/src/myfolder and then rm -r ...., so I can run it from anywhere ....
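
A minimal sketch of one way to do that from any directory (the four file names are the ones listed above): give rm the absolute paths, either spelled out or via brace expansion:

Code:
rm /usr/local/src/myfolder/file1.txt /usr/local/src/myfolder/image.jpg /usr/local/src/myfolder/info.html /usr/local/src/myfolder/another.txt
# or, more compactly:
rm /usr/local/src/myfolder/{file1.txt,image.jpg,info.html,another.txt}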

Thanks

Please Help! I'm Trying To Install A Program Through The Terminal

I'm sorry to ask such a simple question, but every time I think I figure something out or think I'm going somewhere, I get stopped dead in my tracks. My friend installed Linux on my computer and then moved away immediately, so I have no one to help me!

I downloaded the webcam program Cheese. I found through some tutorials that I have to copy the folder to the /opt/ folder and the only way to do that is through the terminal. I found this tutorial and still can't manage to copy the folder. These are the problems I run into:

1. I don't know where to open the terminal. I've tried opening it from the extracted Cheese folder in my downloads folder and from the /opt/ folder itself. It opens up but:
2. When I type the command sudo cp -r cheese /opt/ it asks for modernnewspeak's password. When I try to type this in, nothing happens, even though it JUST let me type in the command. I pressed "enter" thinking maybe it was hiding my password, and I get the message "cannot stat 'cheese': No such file or directory"

Please tell me what I'm doing wrong! I tried to follow that tutorial and look through the FAQ here, but I cannot figure this out. Thank you in advance!
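
For what it's worth, here is a rough sketch of what that cp step usually looks like, assuming the extracted folder really is called cheese and sits in ~/Downloads (both of those are guesses): sudo hides the password as you type it, which is normal, and cp needs to be run from the folder that contains cheese, or be given its full path:

Code:
cd ~/Downloads            # go to wherever the extracted folder actually lives
ls                        # check that a folder named "cheese" really is here
sudo cp -r cheese /opt/   # the password prompt shows nothing while you type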

Command Manual Working But Not On Cron

Hi

When I run this command manually on CentOS 6.6 it works:

Code:
/usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf

but as a cron job it doesn't, as I can see a folder with files there from Mar 28:

Code:
55 5 * * * /usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf

And here are the logs from cron showing that it is executing this at the correct time:

Code:
Mar 30 05:55:01 server CROND[9526]: (root) CMD (/usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf)

Any ideas why?
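
Just as a debugging sketch (the log path is arbitrary): cron only records that it launched the command, not what the command printed, so redirecting the whole pipeline's output to a file usually shows why the cron run differs from the manual one:

Code:
55 5 * * * ( /usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf ) >> /tmp/backup_cleanup.log 2>&1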

Thanks

Can Anybody Explain How Copy.com Works To Me?

I'm running Xubuntu and it was a challenge just getting Copy.com on there. (I installed the desktop app on both of my computers.) Now that I have it though, I don't really know how to use it.

I know this is kind of more a Copy.com question, but I don't know anything about Copy.com (besides having it--lol) and besides, I like you LQ guys.

So yeah, I installed the desktop app for Copy.com on both of my computers. I know that if I put something in the Copy folder that will be available to both computers.

But how Copy does the backing up I don't know.

When I change a file or folder do I have to plop that into the Copy folder every time or does Copy somehow update the file or folder in the Copy folder automatically? (It doesn't seem to.)

Okay, when I, say, take the Documents folder from one computer and plop it into the Copy folder that's that. Then I take the Documents folder from the other computer and plop that into the Copy folder, then all the files from both folders will be in the Copy folder (and the Copy cloud), right?

Now I just removed a couple of files from a folder and copied and pasted the folder into the Copy folder. But then when I looked at the Copy folder the files I'd deleted were still there. What's the process? How does it work?

I mean, how does this work as a way of backing things up AND organizing things? To me it seems like a decent way of throwing stuff into the Copy folder (and cloud), but how is that different than Google Drive? I mean, that's not really a backup, is it? It's like a flash drive in the cloud.

And when I combined the same folders (with the same titles anyway, but they each had different files within them) from the two computers I'd expected each folder on each computer to have all the same files that were cumulatively on both. Instead, they're the same. And the cumulative is only on the Copy folder.

I like the notion of just throwing the folders and files into the Copy folder. It's much quicker than Google Drive. But the backing up feature eludes me and the syncing feature makes me fearful that I'll lose data or that the files will become hopelessly less organized.

Thanks.

Copy Encrypted Compact Flash Issues With Dd In Ubuntu

Hello, I have a system that uses a compact flash card with a Windows OS and some other files on it; somewhere on it there is also some sort of encrypted licensing information. I have several of these machines and can use the CF cards from the others just fine in this machine. But when I take one of those cards and try to copy it with dd, somehow the machine can tell the difference. It's nothing illegal, the system is just too old to buy a replacement for. Someone told me they copied one successfully in Linux with the dd command, but mine aren't working. I also can't tell the brand or type of the CF card since all the labels have been removed. All I know is that it's a 256 MB card. So are there any other options besides dd, or is there a deeper level of dd that I can use to copy this info? I'm using something like:

Code:
sudo dd if=/dev/sde of=/home/folder/cfcard

Then, to copy from my hard drive to the blank CF card:

Code:
sudo dd if=/home/folder/cfcard of=/dev/sde

I'm using a USB CF reader, and when I have my finished CF card everything looks good. Even the machine can read it; it just gives me an error that the CF card isn't licensed or is corrupted.
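
Purely as a sketch of a more literal clone (the device /dev/sde and the image path are just the ones from above, and nothing here is guaranteed to get past the licensing check): reading the whole device with a fixed block size and padding any read errors keeps the copy byte-for-byte as far as dd can see:

Code:
sudo dd if=/dev/sde of=/home/folder/cfcard bs=512 conv=noerror,sync
sudo dd if=/home/folder/cfcard of=/dev/sde bs=512
sync    # flush buffers before pulling the card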

Borked Ubuntu With Putty SSH Cmd Line

I am not sure what happened here, and I find it oddly disturbing that a Putty session from Windows could do this, but here is what I did -

I was attempting to open an SSH redirect session from the Putty command line on my W7 work PC to my home PC running Ubuntu 14.04. I've done this before with the Putty GUI and had no issues at all, but this time I was using the cmd line and it never did connect. I ran this command:
Code:
putty -ssh -D 1080 -P 22 domain.com

The Putty window opened but never connected. I tried twice and when it didn't connect, I gave up and went back to the GUI. I connected with the GUI just fine, but after connecting an SSH session to my Ubuntu server at home, I noticed it was acting odd.

Long story short: It was in a "read-only" mode, saying the file system was read-only. I couldn't run apt-get update or even create a new folder in my Home folder or desktop. I remotely rebooted the machine and it never came back online.

When I got home I checked it, and it told me that Ubuntu had found some errors, and was asking if I wanted to correct them. I said yes and it took a short while, but then finally let me log back in.

After this, the box is about 90-95% back to normal except for a couple of odd things that I don't understand.

The main issue is that my Samba shared drives are no longer working. The fstab file looks the same and can connect to other machines, but nothing can connect back to the Ubuntu box. The folder permissions are correct also. I have a CentOS7 box, a Fedora laptop and a W7 box - none of them can connect to my Ubuntu box. I only run Samba and connect with it, even from Linux boxes, just to make it simple (since Windows is stupid and can't use NFS) - this has always worked in the past. Now for the life of me, I can't figure out why nothing will connect.

Is there a logfile or troubleshooter I could look at to see what happened? It should be in /var/log somewhere, but I don't know where to start. SSH? Samba? The entire filesystem was read-only for a short while, so is there an FS or system log in /var/log?

I am kinda confused on this one, any help is appreciated.
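
As a starting point only (these are the standard log locations on Ubuntu 14.04, nothing specific to this box): the general system log, the kernel log, the SSH/sudo log and the Samba logs are the usual first stops, and testparm sanity-checks the Samba config:

Code:
less /var/log/syslog       # general system messages, including any remount-read-only events
less /var/log/kern.log     # kernel messages (disk and filesystem errors land here)
less /var/log/auth.log     # SSH logins and sudo activity
ls /var/log/samba/         # per-client Samba logs
testparm -s                # parse and print the current smb.conf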