Linux Bulk Renaming Files

Hello Folks.

I'm searching for an easy way to rename multiple files from the CLI, but I haven't found one that works for me, so I'm reaching out to you for help.

This is what I want to do (from the CLI or a script). I want to move files whose names carry a sequence number (msg0000, msg0001, msg0002 and so on) so that they continue at, let's say, msg0066, msg0067 and so on. Each base name corresponds to several files (msg0000.wav, msg0000.WAV and msg0000.txt).

The idea is to move them from one directory to another while renumbering them to follow the sequence in the file names. Is there a way I can do this pain-free?

Any help on this matter will be greatly appreciated; I'm talking about over 100 files that I need to move following the sequence of the receiving directory.

Thanks!
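
One possible approach is a small shell loop (an untested sketch; the source/destination paths and the starting number 66 are assumptions based on the example above):

Code:
#!/bin/bash
# Renumber msgNNNN.* files from SRC into DST, continuing the sequence at 66.
SRC=/path/to/source        # assumed
DST=/path/to/destination   # assumed
next=66
for base in "$SRC"/msg[0-9][0-9][0-9][0-9].txt; do
    [ -e "$base" ] || continue
    old=$(basename "$base" .txt)          # e.g. msg0000
    new=$(printf 'msg%04d' "$next")       # e.g. msg0066
    for ext in wav WAV txt; do
        [ -e "$SRC/$old.$ext" ] && mv -n "$SRC/$old.$ext" "$DST/$new.$ext"
    done
    next=$((next + 1))
done

The starting number could also be computed from whatever msgNNNN names already exist in the destination instead of hard-coding 66.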


Similar Content



Creating A Large Tar File From A Sequence Of Small Tar Files When One File Is Missing

I am trying to put back together a big tar file from some smaller tar files that I created several years ago. To reassemble the large file, I must restore each volume using the command

tar -xMf cd-1.tar
Prepare volume #2 for 'cd-1.tar' and hit return:n cd-2.tar
Prepare volume #3 for 'cd-2.tar' and hit return:n cd-3.tar
and so forth.

I have fourteen files, cd-1.tar through cd-15.tar. The cd-9.tar file is missing, and I assume it is gone for good. Now when I type the commands in, I get the following:

Code:
-linux tarfile]$ tar -xMf cd-1.tar
Prepare volume #2 for `cd-1.tar' and hit return: n cd-2.tar
Prepare volume #3 for `cd-2.tar' and hit return: n cd-3.tar
Prepare volume #4 for `cd-3.tar' and hit return: n cd-4.tar
Prepare volume #5 for `cd-4.tar' and hit return: n cd-5.tar
Prepare volume #6 for `cd-5.tar' and hit return: n cd-6.tar
Prepare volume #7 for `cd-6.tar' and hit return: n cd-7.tar
Prepare volume #8 for `cd-7.tar' and hit return: n cd-8.tar
Prepare volume #9 for `cd-8.tar' and hit return: n cd-10.tar
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)
Prepare volume #9 for `cd-10.tar' and hit return: n cd-10.tar
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)
Prepare volume #9 for `cd-10.tar' and hit return: 
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)

As you can see, I do not have cd-9.tar, and that stops the untarring cold. However, I do have cd-10.tar, cd-11.tar, cd-12.tar, cd-13.tar, cd-14.tar and cd-15.tar. I may have these files, but they cannot be put back into the main file because cd-9.tar is missing and everything must be restored sequentially.

Is there a way to complete this sequence of steps and add all fourteen files to the file bigbackup, leaving out cd-9.tar? That means the bigbackup file will be incomplete, but that is better than no file at all, or than a bigbackup missing the six files on the back end.

Any help appreciated.

Thanks in advance.

Respectfully,


Newport_j
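
One thing that may be worth trying (an untested sketch, GNU tar assumed; anything split across the missing cd-9.tar is unrecoverable, but later members may come out intact):

Code:
# Extract everything contained in volumes 1-8; when tar prompts for
# volume #9, answer "q" to stop cleanly:
tar -xMf cd-1.tar

# Then scan the remaining volumes for salvageable members.
# --ignore-zeros tells tar to keep looking for valid headers rather
# than stopping, so it may resynchronize after the gap:
cat cd-10.tar cd-11.tar cd-12.tar cd-13.tar cd-14.tar cd-15.tar | tar -x --ignore-zeros -f -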

Move And Extract Specific Files From Sub-directories Into Parent Directory.

Hello all.
I have 100 sub-directories that contain "rar" files. How can I move all the rar files into the parent directory?

I used "find sourcedir -type f -exec mv {} targetdir \;", but I ended up with just one file in the target directory and the rest of the files gone.

Thank you.
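
A sketch that limits the move to .rar files and refuses to overwrite on name collisions (mv -n is GNU coreutils; sourcedir and targetdir as in the post):

Code:
# -n (no-clobber) keeps a file already in targetdir from being
# silently replaced when two sub-directories contain the same name:
find sourcedir -type f -name '*.rar' -exec mv -n {} targetdir/ \;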

How To Rename A File In Linux

hello,

I am trying to rename a file by adding a .txt extension; before renaming, I also want to replace the "." in the file name with "_".

right now file looks like this mdm.201504021628

after execution of my script file name should be mdm_201504021628.txt


#!/bin/bash
# Read all regular files (skip directories) from the HDFS directory
files=$(hadoop fs -ls /dl/data/landing/hivedb/lnd_attunity_kpi_db_backup/auth_master |
awk '!/^d/ {print $8}')

for f in $files; do
    dir=$(dirname "$f")
    base=$(basename "$f")
    # Replace every "." in the name with "_" and append .txt.
    # (Running sed on $f would edit the file's contents, not its name.)
    hadoop fs -mv "$f" "$dir/${base//./_}.txt"
done

Thanks for your help in advance.

Script To Recursively Enter Subdirectories And Rename Files Sequentially From Scratch

I am new to Bash scripting.

I have a main directory called Photos which has many subdirectories like People, Places and Things. Each of these subdirectories is populated by other subdirectories and lots of JPG photo images.

The digital cameras name the files in a way that is difficult to manage with web hosting.

I would like to go to each directory and subdirectory and rename the photos 1.jpg, 2.jpg, 3.jpg, etc. so that I can use a simple XML template to access them by specifying only a hosting directory.

I tried to use the following script:

#! /bin/bash

cd /home/paul/test

find . -name "*.jpg" -print0 | rename -v 's/.+/our $i; sprintf("%d.jpg", 1+$i++)/e' * -vn

exit 0

It successfully renames all of the files in all of the directories, but it does not restart the numbering for each new subdirectory. First it goes through Photos and renames the three JPG files there 1.jpg, 2.jpg and 3.jpg; then it opens the first subdirectory, People, and names the three JPG files there 4.jpg, 5.jpg and 6.jpg. Next it moves to the next subdirectory and continues the sequential renaming until it is done.

I want it to restart sequential renaming with each new subdirectory, so that after renaming the three JPG files in Photos to 1.jpg, 2.jpg and 3.jpg, it moves to the first subdirectory and renames the JPG files there starting with 1.jpg again.

That way I use the links 1.jpg, 2.jpg, 3.jpg, etc in the XML template and just change the directory name to download the photos from the web.

Thanks for any help you can give me.
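
One way to make the numbering restart is to loop over the directories yourself instead of handing rename the whole tree at once (an untested sketch; mv -n skips collisions with files already named 1.jpg, 2.jpg, ...):

Code:
#!/bin/bash
# Renumber the .jpg files inside each directory from 1
find /home/paul/test -type d | while read -r dir; do
    i=1
    for f in "$dir"/*.jpg; do
        [ -e "$f" ] || continue     # skip directories with no .jpg files
        mv -n "$f" "$dir/$i.jpg"
        i=$((i + 1))
    done
done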

How To Get Around Tar'ing Up Files You Don't Have Permissions For?

I would like to tar up some system logs with core files as a normal user through a script. However these core files are owned by root.

Of course I get "cannot read directory ...: Permission denied"

Is there a good way to grab these core files through a script? From searching around, I get that the best way is to modify the sudoers file, then run the script with sudo. Is this considered the safest and best way?

Thanks!
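
If you do go the sudoers route, one common pattern is to allow exactly one tar command line (a sketch; the user name and paths are assumptions, and the file should be created with visudo -f so a syntax error can't lock you out):

Code:
# /etc/sudoers.d/corelogs  (hypothetical)
loguser ALL=(root) NOPASSWD: /bin/tar -czf /tmp/corelogs.tar.gz /var/log /var/crash

The script then runs sudo /bin/tar -czf /tmp/corelogs.tar.gz /var/log /var/crash verbatim; sudo matches the argument list exactly, so the grant stays narrow.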

Help - Need To Access Files In A Crashed Ubuntu Running Under Windows 2008 Server Hyper-V

Help!! In Ubuntu 12.04.5, the /boot directory was full and contained a lot of duplicate files from different dates, some of which I thought were leftover headers from updates. One series of files was called vmlinuz-3.2.0-29-generic. I moved all the files to /obsolete_files. I rebooted after having trouble upgrading packages (I think that is a second problem with this machine), and I now get the following prompt.
grub rescue>

Is there a way to access the directory system so that I can move those files back? I really only need to retrieve one or two files off the system.

Code:
grub rescue>ls
(hd0) (hd0,msdos5) (hd0,msdos1) (fd0)

Neither boot nor insmod works. I can set prefixes. Am I doomed?
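
Since the files were moved rather than deleted, booting the VM from an Ubuntu live ISO and moving them back may be enough (a sketch; /dev/sda1 as the root partition is an assumption, check with sudo fdisk -l first):

Code:
# From a live CD/ISO attached to the Hyper-V VM:
sudo mount /dev/sda1 /mnt
sudo mv /mnt/obsolete_files/* /mnt/boot/
# Repair grub from inside the installed system:
sudo mount --bind /dev /mnt/dev
sudo mount --bind /proc /mnt/proc
sudo mount --bind /sys /mnt/sys
sudo chroot /mnt update-grub
sudo chroot /mnt grub-install /dev/sda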

File Permissions Between Users

Hello Everyone! I'm somewhat new to linux, and getting my feet wet by building my first linux server.

So what I have is an application that moves and sorts files, and another program that catalogs them.

The problem is that each app runs as its own user. So my question is: is there any way that files owned by prog1user can be read by prog2user?

I have tried doing a chmod -R 755 Directory and that has allowed the second program to see the files, but I'm guessing this has certain security risks (although I'm not so worried about the files in this directory).

Anyway, I was wondering if there is a proper way to do this? The OS is Debian wheezy.

Cheers!
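
A more targeted alternative to a world-readable 755 is a shared group (a sketch; the group name and path are assumptions):

Code:
sudo groupadd appshare
sudo usermod -aG appshare prog1user
sudo usermod -aG appshare prog2user
sudo chgrp -R appshare /srv/files    # the directory both apps use (assumed)
sudo chmod -R g+rX /srv/files        # group read; X keeps directories traversable
sudo chmod g+s /srv/files            # new files inherit the group

With this, only members of appshare can read the files, rather than every user on the system.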

Hidden Folders And Files Become Viewable In Home Directory

Hi guys,

Through no apparent action of mine, hidden folders and files have started showing in my /user/home directory. They are as follows:

folders:
.adobe .cache .config .cups .filezilla .gimp-2.8 .gnupg .gphoto .gstreamer-0.10 .icedtea .java .local .macromedia .mozilla .pki .thumbnails

Files:
.bash_history .bashrc .esd_auth .ICEauthority

In my / directory
File: ./readahead

Could someone help me verify that the above folders and files are not from a harmful source or application?

If they do not pose any threat to the system, how can I conceal these folders and files so that they no longer show up in my home and / directories?

Many thanks.
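
For what it's worth, dot-prefixed names are hidden by default; they only show up when a file manager is set to display hidden files, or when ls is given -a:

Code:
ls ~      # dotfiles omitted
ls -a ~   # dotfiles shown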

How To Delete Number Of Files

I'm trying to figure out if find could do this. I have a folder with 1000 files, and I want to delete 150 files in that folder regardless of timestamp and filename. Is there a tool, command, or option to find that can do this? Please let me know.

Filtering by mtime or ctime in find is not suitable, since it does not count files; even if there are matches, I would still need to keep a running total of the files until I reach 150.

Any suggestions?
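
One sketch using head to cap the count (GNU tools assumed; file names containing newlines would break this):

Code:
# Delete the first 150 regular files find happens to list:
find . -maxdepth 1 -type f | head -n 150 | xargs -d '\n' rm --
# Or pick 150 at random instead:
find . -maxdepth 1 -type f | shuf -n 150 | xargs -d '\n' rm --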

How To Filter Multiple CSV Files According To Complete Rows

Hi Everyone, I have multiple CSV files (>100). They are rain-gauge station files for precipitation measurement. In these files, the numbers of stations are not equal (i.e., there are missing stations). I want only the stations that are present in all the files. The files have a unique station id in column 3. I want to ask if this is possible in Linux?

It may be something along the lines of: for h in *.csv; do sed '?????' $h > rippe_$h && mv rippe_$h $h.xls ; done
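
awk may be a better fit than sed here (an untested sketch; assumes comma-separated files with the station id in column 3):

Code:
#!/bin/bash
n=$(ls *.csv | wc -l)
# Pass 1: collect the ids that occur in every file
# (!seen[...]++ counts each id at most once per file)
awk -F, -v n="$n" '!seen[FILENAME,$3]++ { count[$3]++ }
    END { for (id in count) if (count[id] == n) print id }' *.csv > common_ids.txt
# Pass 2: keep only the rows whose id is on the common list
for f in *.csv; do
    awk -F, 'NR==FNR { keep[$1]; next } $3 in keep' common_ids.txt "$f" > "$f.filtered"
done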