Hi guys,
I have multiple .csv files with multiple columns/headings, set up essentially like this (obviously more info in the real thing)
Gene Location Ref Var Coverage Function
DMD chrX.... A G 198 exonic
SCN4A chr17.... T C 111 splicing
and so on...
How could I concatenate selected columns into an output file with the strings separated by commas, e.g. DMD,A,G,exonic (similar to what you can do in Excel)? I would like to be able to do this for multiple files in a directory. It would be preferable if all the outputs could be compiled into one file, as I'll use this for something else later.
The current protocol in our team is to concatenate each file individually with an Excel macro and copy the result into a file, and it takes a very long time.
Thanks very much!!
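A minimal awk sketch along these lines might do it, assuming the columns are whitespace-separated, the wanted fields are Gene, Ref, Var and Function in positions 1, 3, 4 and 6, and each file has a single header row (combined_output.csv is just a placeholder name):
Code:
# pull columns 1, 3, 4 and 6 from every .csv in the directory, comma-separated,
# skipping the header line of each file
awk 'FNR > 1 {print $1","$3","$4","$6}' *.csv > combined_output.csv
Because awk accepts several input files in one call, this also compiles all the outputs into a single file.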
From the desktop I open my Documents folder, but I see no information; all I can see is the background. How do I fix this? Perhaps I have deleted something which created the problem.
When opening a file I receive multiple notifications from Okular saying it cannot open the file, and it quotes files which had been deleted.
Any suggestions on how I can fix these defects?
Hi
I have about 100 RA files (RealAudio) that I want to concatenate into one big MP3 file in the correct order. I've heard ffmpeg can do this, but I'm not sure how. Can anyone help me with this?
Thanks
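A sketch of the usual two-step approach with ffmpeg's concat demuxer, assuming your build of ffmpeg can decode RealAudio and includes libmp3lame, and that alphabetical order of the file names is the correct playback order:
Code:
# build a list of the .ra files in name order, then concatenate and encode to MP3
for f in *.ra; do echo "file '$f'"; done > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c:a libmp3lame -q:a 2 output.mp3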
I am currently running a system simulation on multiple files.
I have a computer algorithm written in Perl to run "system" simulations for all the files I need.
What I am trying to do is put multiple files into one file; the only problem is that it's not doing exactly what I need it to do.
Example:
I am "cat txt0.txt txt1.txt txt2.txt txt3.txt > allfiles.txt"
I need it to read as
txt0.txt
txt1.txt
txt2.txt
txt3.txt
Instead, it's taking the information within each txt file and putting it all together, so I get content that looks like this:
fdfasdfqwdefdfefdkfkkkkkkkkkkkkkkkfsdfasdxfewqfe..........
all clustered together
you get the picture ?
I am really confused about how to get this to work; there are over 100 files that need to go into a single file.
That way when I run it through the perl algorithm I created, I can do it in one shot.
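It sounds like allfiles.txt should contain the file names themselves, one per line, rather than their contents. A minimal sketch, assuming the files match txt*.txt and GNU sort is available for natural numeric ordering:
Code:
# list the names (not the contents), sorted so txt2 comes before txt10
printf '%s\n' txt*.txt | sort -V > allfiles.txt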
Does anyone know a way to copy two files to multiple computers? I'm thinking of scp, as the flavor of Linux we're using does not include rdist.
I've read that scp can't copy multiple files; however, maybe some scripting genius has figured out a way. Running two scripts (one for each file) is perfectly OK!
If anyone cares to post very clear examples (I'm definitely not a programmer...) of scripts, etc., that would be great.
Thanks in advance to all those who can help!
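For what it's worth, scp can take more than one source file in a single call, so a small loop over the hosts may be all that's needed. A minimal sketch, assuming SSH keys are set up; the host names and paths are placeholders:
Code:
#!/bin/bash
# copy the same two files to each host in the list
hosts="server1 server2 server3"
for h in $hosts
do
    scp /path/to/file1 /path/to/file2 "user@$h:/destination/dir/"
done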
Hi Experts,
I am trying to make new entries in a new column of a CSV file but am not able to do so. Please help with the same.
Requirement:
There are multiple directories & within those directories i have sub-directories and i want to build a csv file with 2 columns of Directories mapped to their sub-directories. Can you please help me with this. I tried the following code:
Code:
#!/bin/bash
homeDir="$HOME"
ls ~/Parent/ | cut -c1-9 > ~/test_111.csv
while read Child
do
    Entry="$(ls $homeDir/Parent/$Child/ABC/XYZ/DEF/PQR)"
    echo $Entry
    for (( c=1; c<=5; c++ ))
    do
        sed -i ci"$Entry" test_222.csv
    done
done < test_111.csv
Basically, I want a CSV file with two columns: the first column should have the Child name and the second column should have the sub-directory name inside the PQR directory.
Any help will be useful on this.
Thanks in Advance!
Best Regards,
Vijay Bhatia
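A minimal sketch of the same idea, assuming the entries under PQR are directories and using the placeholder output name parent_child_map.csv; adjust the paths to taste:
Code:
#!/bin/bash
# write one "Child,SubDirectory" row per entry under ~/Parent/<Child>/ABC/XYZ/DEF/PQR
out="$HOME/parent_child_map.csv"
: > "$out"                                   # start with an empty file
for childPath in "$HOME"/Parent/*/
do
    child=$(basename "$childPath")
    for sub in "$childPath"ABC/XYZ/DEF/PQR/*/
    do
        [ -d "$sub" ] || continue
        echo "$child,$(basename "$sub")" >> "$out"
    done
done
Looping over the directories directly avoids parsing ls output and the sed insertion step entirely.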
Hello Folks.
I'm searching for an easy way to rename multiple files from the CLI but haven't found one, so I'm reaching out to you guys for help.
This is what I want to do (from the CLI or a script). I want to move files with a sequence number in their names (msg0000, msg0001, msg0002 and so on) to, let's say, msg0066, msg0067 and so on. Each of these base names has companion files (msg0000.wav, msg0000.WAV and msg0000.txt).
The idea is to move them from one directory to another, following a sequence in the file names. Is there a way I can do this pain-free?
Any help on this matter will be greatly appreciated; I'm talking about over 100 files I need to move, following the sequence of the receiving directory.
Thanks!
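A minimal sketch, assuming each group shares a base name like msg0003 with .wav/.WAV/.txt companions and that the receiving directory's sequence continues from 66; src, dst and the starting number are placeholders:
Code:
#!/bin/bash
src=/path/to/source
dst=/path/to/destination
n=66
for txtfile in "$src"/msg????.txt
do
    old=$(basename "$txtfile" .txt)          # e.g. msg0003
    new=$(printf 'msg%04d' "$n")             # e.g. msg0066
    for f in "$src/$old".*                   # .wav, .WAV, .txt
    do
        mv "$f" "$dst/$new.${f##*.}"
    done
    n=$((n + 1))
done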
Hi all,
I have some knowledge of timestamps and I'm trying to use them in a particular scenario. I have multiple folders, inside which are different files. Now I'm trying to copy one file (say xyz), which is present in all the folders but varies in its content and time of creation, into, let's say, foldernew.
I'm trying to do this by copying the file xyz from each folder into foldernew with the new name xyz_(its original timestamp).
Can this be done with a single command, or what should I write in a script to do this?
Note: I want to add the timestamp of when xyz was created, not the time of the copy.
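A minimal sketch, with the caveat that most Linux filesystems don't expose a true creation time, so the file's own modification time is used as the timestamp; the folder layout and names are assumptions:
Code:
#!/bin/bash
mkdir -p foldernew
for f in */xyz
do
    ts=$(date -r "$f" +%Y%m%d_%H%M%S)        # timestamp of the file itself, not of the copy
    cp -p "$f" "foldernew/xyz_$ts"
done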
I want to search for 2 strings, A and B, in a file, both present in different rows. If both are found I must get both strings as output, otherwise blank output. I want to use awk here.
File contains:
A
C
D
B
X
Y
Desired output:
A
B
I am not getting the desired output using:
awk '/"A" && "B"/{print}' file
I am trying to open a 100 MB FASTA file in LibreOffice Calc; it stays like that for several minutes and finally displays that the file exceeded the row limit, as in Excel. Is there any way for me to view this file in Excel?
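A spreadsheet tops out at 1,048,576 rows, so a 100 MB FASTA file usually won't fit as-is. Two possibilities, with bigfile.fasta as a placeholder name: browse it in a pager, or split it into spreadsheet-sized pieces first.
Code:
less bigfile.fasta
split -l 1000000 bigfile.fasta chunk_      # makes chunk_aa, chunk_ab, ... of 1,000,000 lines each
Note that a plain line split can cut a FASTA record in two, so this is only for viewing, not for downstream analysis.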
Hi all,
I am trying to use sed to rename multiple files under multiple directories.
So let's say, for example:
Under /root I have 2 directories as follows:
# ll {test1,test2}
test1:
total 0
-rw-r--r--. 1 root root 0 Apr 10 19:16 authkey.apollo
test2:
total 0
-rw-r--r--. 1 root root 0 Apr 10 19:16 authkeys.apollo
To change apollo to jupiter, I used this:
for i in `ls {test1,test2} | grep -i 'apollo'`; do echo $i; sed -i 's/apollo/jupiter/g'; done
But it seems like sed is missing the file path. Is there an easier way or a better approach to make this work?
Thanks in advance.
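Since the goal is to change the file names rather than their contents, sed only needs to compute the new name; the actual rename is done by mv. A minimal sketch using the two example directories:
Code:
#!/bin/bash
for f in test1/*apollo* test2/*apollo*
do
    [ -e "$f" ] || continue
    mv "$f" "$(echo "$f" | sed 's/apollo/jupiter/')"
done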