What Does A Dot Character Mean In Linux?

Hi folks,
I was reading a tutorial about how to find file and folder sizes and came across this example:

Quote:
root@host [5045 02:30:13 /home/johnc/public_html/wp-content]# du -h --max-depth=1
613M ./cache
4.0K ./reports
200K ./sedlex
24K ./wp-content
4.0K ./wppa-depot
40M ./backup-db
16K ./bps-backup
112K ./includes
57M ./plugins
8.0K ./ngg_styles
21M ./themes
362M ./gallery
4.0K ./banners
269M ./lg-gallery
4.1M ./ewww
4.0K ./upgrade
226M ./uploads
28K ./w3tc-config
1.7G .
You can see at the bottom that it shows 1.7G followed by a . (dot). What does that dot mean?
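For reference: in a path, . names the current working directory, so the last line of du's output is the grand total for the directory the command was run in. A minimal sketch showing just that total, assuming you run it from the same directory:

Code:
cd /home/johnc/public_html/wp-content
du -sh .    # -s summarizes: prints only the total for "." (the current directory)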


Similar Content



Rsync, Reliable "copy And Paste" Type Of Backup In Case Things Break?

What I did in Windows was create images of my drive and restore them.

In Linux I am running:

Code:
rsync -aAXv --exclude={"/home/*","/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} /* /path/to/backup/folder

and this creates a folder for me with all my files, and apparently saves metadata like permissions and paths...

Since I'm using Arch and things break sometimes, I'm booted into a CLI with errors and cannot figure my way out since I'm a noob... would I be able to just delete my entire root and replace it with the rsync backup without a problem?
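For the restore direction, rather than deleting root outright, a common approach is to boot a live USB, mount the broken root, and let rsync --delete make it match the backup. A minimal sketch, assuming the backup was made with the command above; /dev/sdXY and the paths are placeholders to adjust:

Code:
# from a live environment: mount the broken root partition, then mirror the backup back
mount /dev/sdXY /mnt                               # placeholder: your root partition
rsync -aAXv --delete /path/to/backup/folder/ /mnt/
# --delete removes files not present in the backup, so /mnt ends up matching it;
# if the bootloader was damaged too, reinstall it (e.g. via arch-chroot) before rebooting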

Utilizing Rsync To Back Up Data, But Symbolic Links Not Included

Please forgive me, but I'm a little new to Red Hat (RHEL 5). I'm using rsync to back up critical data to a second disk; here is what I'm typing at the command line: rsync -rvgal /data/disk1/share /data/backup/share. It appears that the softlinks are not transferred to the backup drive, and some of the links point to data not located in the source folder (/data/share). After reading the rsync man page I was a little confused about the L option (vs the l option). In order to ensure that the linked files are copied, should I type the below:

rsync -rvgaL /data/disk1/share /data/backup/share

A million thanks,
Johnny Mac
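For what it's worth, the difference between -l and -L is easy to see in a sandbox; everything below is a hypothetical illustration with made-up names, not your data:

Code:
mkdir sandbox && cd sandbox
ln -s /etc/hostname link-demo     # a symlink pointing outside the source tree
rsync -rvgal . ../dest-l/         # -l: link-demo arrives as a symlink
rsync -rvgaL . ../dest-L/         # -L: link-demo arrives as a regular file copy
ls -l ../dest-l/link-demo ../dest-L/link-demo

So -L is what copies the linked files themselves; just note that it warns on dangling links and will duplicate large referents in the backup.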

How To Select Newest "dated" Directory?

Hi,
I have a /backup directory which contains 5 days worth of backups:
2015-03-12_03-01-07
2015-03-11_03-01-07
2015-03-10_03-01-07
2015-03-09_03-01-07
2015-03-13_03-01-07

I need to copy the content of the NEWEST backup. Ideally, I'd like to find newest backup and assign it to a variable for later use.
I know I can use

Code:
ls -lt

to sort directories by modification time, but how do I pick the first one and store it in a variable?

Thanks,
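Since the names are stamped YYYY-MM-DD_HH-MM-SS, reverse lexical order is also reverse chronological order, so no date parsing is needed. A minimal sketch; the destination path is a placeholder:

Code:
# the lexically greatest name is the newest backup
newest=$(ls /backup | sort -r | head -n 1)
echo "Newest backup: /backup/$newest"
cp -a "/backup/$newest/." /restore/target/    # placeholder destination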

Command Working Manually But Not From Cron

Hi

When I run this command manually on CentOS 6.6 it works:

Code:
/usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf

but as a cron job it doesn't, as I can still see a folder with files there from Mar 28:

Code:
55 5 * * * /usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf

And here are the logs from cron showing that it executed at the correct time:

Code:
Mar 30 05:55:01 server CROND[9526]: (root) CMD (/usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf)

Any ideas why?

Thanks
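Two hedged things to check. First, cron runs with a minimal environment and silently discards output unless mail is set up, so redirecting the job's output to a file usually reveals the real error. Second, -mtime +1 matches directories more than one full 24-hour period old, i.e. at least 48 hours, so a Mar 28 folder can legitimately survive a Mar 30 05:55 run even when find behaves identically in both cases.

Code:
# same job, but capture stdout and stderr so failures are visible (log path is arbitrary)
55 5 * * * /usr/bin/find /backup/ -type d -mtime +1 -print0 | xargs -0 rm -rf >> /tmp/backup-clean.log 2>&1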

Move A Folder With All Contents

Hi,

I need to move a directory from a subdomain to the root.

The folder I need moved is subdomains/magdev/public_html/images,

and I need to move it up to the root public_html of my hosting space. My structure looks like this:

public_html
-images
subdomains
-magdev
--public_html
---images

So I need to move the bottom images directory into the top one.

Can anyone help?


Cheers
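A minimal sketch, assuming the commands run from the top of the hosting space and that public_html/images already exists, so the contents get merged into it:

Code:
# move everything from the subdomain's images into the top-level images
# (note: * does not match hidden dot-files)
mv subdomains/magdev/public_html/images/* public_html/images/
rmdir subdomains/magdev/public_html/images    # optionally remove the now-empty directory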

Using Find And Pipe To Tar

I am trying to use tar in combination with find. The goal is to take all files in /export that have been modified in the last 24 hours (for backup purposes) and tar them, so I can untar them on the backup server, updating just the modified files.

Perhaps there is a better way; I have tried using cpio, but the problem comes in when I copy to the NAS drive (NTFS): I lose all my owner/group and permissions. I have found that if I tar the files, then copy them to the NAS and untar them on the server, it retains the owner/group and permissions.

So… here is what I have tried:

First, I use the find command to see what files should be in the tar archive.
Code:
/export $ find . -depth -mtime 0 -print
./file4
./file3
.

OK, that looks right. Now I will try to pipe that into tar:
Code:
/export $ find . -depth -mtime 0 -print0 | tar -czvf backup.tar.gz --null -T - 
./file4
./file3
./
./share/
./share/pdf/
./share/pdf/penny-2014-09-03-11:41.30.pdf
./share/pdf/penny-2014-09-03-14:25.17.pdf
./share/pdf/penny-2014-09-03-11:24.36.pdf
./share/pdf/penny-2014-09-03-14:37.12.pdf
tar: ./share/pdf/.directory: Cannot open: Permission denied
./share/pdf/penny-2014-09-02-14:52.06.pdf
./share/pdf/penny-2014-09-03-12:18.43.pdf
tar: ./share/PDF: Cannot open: Permission denied
./share/file3
tar: ./share/.directory: Cannot open: Permission denied
./dir1/
./dir1/file1
./file4
./file2
./file3
tar: ./.directory: Cannot open: Permission denied
./list
tar: Exiting with failure status due to previous errors

It seems that it is trying to tar all the files in that directory. When I view the files in backup.tar.gz, all of the files from /export are in there, not just the modified ones.
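The cause is visible in the first listing: find prints . itself (the directory's mtime changed when files inside it changed), and when tar is handed a directory name it archives it recursively, so ./ drags in the whole tree. One hedged fix is to restrict find to regular files; GNU tar's --no-recursion flag is an alternative if you want the directory entries kept:

Code:
# -type f keeps directories (including ".") out of the list, so tar gets only the files
find . -type f -mtime 0 -print0 | tar -czvf backup.tar.gz --null -T -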

Problem With Running Bash Script

Hi

I have just started to use Linux on my Raspberry Pi to host a home automation server & I'm having a problem when trying to run a bash script.

The script in question is to turn off my Viera TV & is as follows:
Code:
#!/bin/sh
curl -i \
-H "Accept: text/xml" \
-H "Cache-Control: no-cache" \
-H "Pragma: no-cache" \
-H 'SOAPACTION: "urn:panasonic-com:service:p00NetworkControl:1#X_SendKey"' \
-H "Content-Length: 200" \
-H 'Content-Type: text/xml;charset="utf-8"' \
-X POST --data '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/"> \
<s:Body> \
<u:X_SendKey xmlns:u="urn:panasonic-com:service:p00NetworkControl:1"> \
<X_KeyEvent>NRC_POWER-ONOFF</X_KeyEvent> \
</u:X_SendKey> \
</s:Body> \
</s:Envelope>' http://192.168.1.87:55000/nrc/control_0/

If I run this from the command line it works fine, but when I try to run it from within the home automation application it returns error code 32512, which I've seen elsewhere is actually exit status 127 and basically comes down to not being able to find the program to execute.

Permissions are fine, and I've tried using the full path name for both curl and the script itself, but I still get the error. Has anyone any idea what I need to change, as this looks to be a pure Linux issue (or rather my misunderstanding of Linux) rather than the home automation program?

Thanks

Steve
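For context, 32512 is just 127 in the high byte of a raw wait() status (127 x 256), and 127 means the spawned shell could not find a command to execute, typically because the application runs the script with a much smaller PATH than a login shell has. A hedged check is to make the script self-sufficient and log what it can see; the log path is arbitrary:

Code:
#!/bin/sh
# set PATH explicitly so the script no longer depends on the caller's environment
PATH=/usr/bin:/bin:/usr/local/bin
export PATH
command -v curl >> /tmp/tv-script.log 2>&1   # logs where curl was found; empty if it wasn't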

Understanding Configuration Files Better

Hey, I'm aware that /etc/ stores config files, and in my home directory I also have dotfiles as well as a .config folder.

And I'm told not to edit /etc/ but to create a copy in my home directory to preserve the original files. Is it as simple as recreating the same full path as under /etc/ and editing the copy in my home folder?

Ideally this is how I hope it works, because I don't want to edit /etc/ and end up with a bunch of custom, non-default files.
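Generally it isn't a mirrored path: each program documents its own per-user file, usually a dotfile or something under ~/.config, which is read after (or instead of) the system-wide file in /etc, so the per-user copy wins without /etc ever changing. As an illustration only; vim's system path varies by distro:

Code:
# vim reads the system-wide config first, then the per-user one if it exists
cp /etc/vimrc ~/.vimrc    # may be /etc/vim/vimrc on Debian-based systems
# edits to ~/.vimrc now take effect for your user while /etc stays pristine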

Need Help Understanding Luckybackup

gold finger was kind enough to share this with me a while ago:

Quote:
Do backups to either another HDD, partition, or a USB stick (if big enough to hold your data). Can use program to make an initial backup of /home/gregory; then use it to periodically update that backup by having it sync between your installed Xubuntu /home/gregory and the backup copy. The sync function will just copy over things that are new or changed, rather than copying everything all over again.

Assuming your Xubuntu filesystem is Ext4, example of doing initial backup would be something like this:

* Spare USB with large partition formatted as Ext4 and labeled "BACKUPS"
* Open luckybackup and choose "Backup" function
* "Source" = /home/gregory
* "Destination" = /media/gregory/BACKUPS (might be under /media/BACKUPS)
* Check box to not create new directories (it will just do exact copy of source)


After initial backup, either make a new task for syncing, or modify the backup task to turn it into a syncing task instead. Then use that periodically to update the backed-up /home/gregory.
I've downloaded Luckybackup and have been experimenting with it, but I'm still not sure of the best way to go about using it as a backup. Like in gold finger's advice, why would I check the box to not create new directories? It seems to me doing it without checking the box re-creates things just the way they are on my computer, while checking the box just takes everything out of the folders. Seems confusing (and unnecessary).

I also have a really hard time finding the errors after a run, and when I do find them I don't know what they mean.

So if I back up the source to the destination, it makes an exact copy on my destination drive (with folders if I don't check the box, without if I do). Then if I do that as an ongoing thing, I will be backing up all my data with each run (which I'm assuming would be much more time consuming), whereas if I choose 'synchronize source and destination' it will only back up the changes between my source and USB drive (which would be my destination drive)?

Is that the idea?

And I noticed that Lucky did not want to transfer things with colons in their names. Googling around, somebody said that problem would be taken care of by formatting the destination drive as ext3 or ext4 (as gold finger suggested). Is this a good idea? (I've always felt comfortable with FAT, because if I needed to plug my flash drive into a Windows machine it would work, as well as with Linux.)

So the first time I use Lucky I choose "backup source inside destination" and of course the source and destination. Should I check the "Do NOT create extra directory" box? (Again, that seems off as 95% of what I'll be backing up is in folders.)

Then after I've done that, I choose the sync option?

A lot of stuff. I know. Thanks.

PS. As a slight complication I have the data (basically the "home" folder) of my two computers (work and home) synced via Copy.com.
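Two hedged notes. First, the colon problem is a FAT filename restriction (FAT forbids : in names; ext3/ext4 allow it), so reformatting the destination as Ext4 really does fix that, at the cost of the stick being less convenient on Windows. Second, luckybackup is a graphical front end to rsync, and the 'Do NOT create extra directory' box appears to correspond to rsync's trailing-slash rule; that mapping is my assumption, but it makes the behaviour easier to reason about:

Code:
# without a trailing slash, rsync creates gregory/ inside the destination:
rsync -a /home/gregory  /media/gregory/BACKUPS    # -> BACKUPS/gregory/...
# with a trailing slash, only the contents go straight into the destination:
rsync -a /home/gregory/ /media/gregory/BACKUPS    # -> BACKUPS/Desktop, BACKUPS/Documents, ...

Either way, a second run of the same command only transfers files that are new or changed; that incremental behaviour is rsync's default rather than a separate mode.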

After Rebuild Of Tslib, Qt Failed With Error: ts_config() 'No Such File Or Dir'

I downloaded tslib-1.1.tar.gz and configured it to cross-compile for my ARM target machine.

I was able to build and make install on my Ubuntu host machine successfully. On my target machine I created a backup of my original libts-1.0.so.0.0.0, SFTP'd the version I built from my host, and overwrote the current libts-1.0.so.0.0.0 to keep the links.
I rebooted my target and now get:

QWSTslibMouseHandlerPrivate: ts_config() failed with error: 'No such file or directory'
Please check your tslib installation!

Running ldd on the target against the backup (.bu) and new versions shows the same dependencies:

root@at91sam9m10g45ek:~# ./ldd /usr/lib/libts-1.0.so.0.0.0
libdl.so.2 => /lib/libdl.so.2 (0x4000b000)
libc.so.6 => /lib/libc.so.6 (0x40016000)
/lib/ld-linux.so.3 (0x2a000000)
root@at91sam9m10g45ek:~# ./ldd /usr/lib/libts-1.0.so.0.0.0.bu
libdl.so.2 => /lib/libdl.so.2 (0x4000b000)
libc.so.6 => /lib/libc.so.6 (0x40016000)
/lib/ld-linux.so.3 (0x2a000000)

Using nm, I verified that ts_config() exists in the new version.

Any suggestions why I am getting the error?
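One hedged thing to check: ts_config() is the call that opens ts.conf, and a self-built tslib defaults to paths under its configure --prefix (often /usr/local), so the rebuilt library may be looking for ts.conf and its plugin directory somewhere other than where the stock build kept them. tslib honours these environment variables, so pointing them at the existing files is a quick test; the paths below are guesses to adjust for your target:

Code:
export TSLIB_CONFFILE=/etc/ts.conf        # guess: wherever the stock ts.conf lives
export TSLIB_PLUGINDIR=/usr/lib/ts        # guess: directory holding the ts_* plugin .so files
export TSLIB_TSDEVICE=/dev/input/event0   # guess: your touchscreen device node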