How To Create A Large File With Some Random Characters And Numbers?

Can anybody tell me how to create a large file in Linux? I am using truncate, but it only sets the file size; it does not fill the file with random characters and numbers.
Please suggest commands, with syntax and examples, to create a large file filled with random characters and numbers in Linux, with a proper explanation.
Thank You.
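One common approach, sketched below with placeholder file names and a 1G size: pull bytes from /dev/urandom, and filter them down to printable letters and digits if the file must contain only characters and numbers.

Code:
# 1G of raw random bytes (any byte values, not just printable ones)
dd if=/dev/urandom of=bigfile.bin bs=1M count=1024

# 1G of random letters and digits only (GNU head accepts the 1G suffix)
base64 /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 1G > bigfile.txt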


Similar Content



Create A Large File In Linux

I want to create a large file, let's say 600G. I used the command
# fallocate -l 600G file01
Although I got a file of the right size, the thing is there is no data in it. I want to create a 600G file with random data in it.

Any suggestion?
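A sketch of one way to do it, using the file name file01 from above: /dev/urandom is slow for 600G, but it produces real random data (status=progress needs a reasonably recent GNU dd).

Code:
# 614400 MiB = 600 GiB of pseudo-random bytes
dd if=/dev/urandom of=file01 bs=1M count=614400 status=progress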

Diffing The Line Numbers

hi guys

i am trying to find the "size" of a "block" of data in LARGE data files; the example below, test_data.txt, is very simplified. by "size" i mean the difference in line numbers between one block and the next, and the "size" will be constant throughout the file. so

1234 6.600000 4321
1234 8.500000 4321
1234 1.800000 4321
1234 2.300000 4321
1234 8.500000 4321
1234 2.800000 4321

if i define a block as whenever i find 8.500000 in the second column, then in the example the block size would be 3, because 8.500000 occurs on the 5th line and on the 2nd. right now i am using

Code:
 grep -n "8.500000" test_data.txt | cut -f1 -d:

and/or

Code:
 awk '/8.500000/ {print FNR}' test_data.txt

obviously i don't remember how to tag text as code?

btw, the grep command is much much faster

both of these commands give an entire list of line numbers (a long list for files greater than a gig), which i then have to subtract one from another to come up with 3 in the example. not that i'm opposed to doing math, but i would think awk or grep should be able to do this for me

ideas?

tabby
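awk can indeed do the subtraction itself. A sketch along those lines, using the file name from the post: remember the previous matching line number and print the difference; since the block size is constant, exiting after the first difference keeps it fast on gig-sized files.

Code:
 awk '/8.500000/ { if (prev) { print FNR - prev; exit } prev = FNR }' test_data.txt

On the sample data above this prints 3 (matches on lines 2 and 5). Drop the exit to print every gap instead of just the first.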

Help With Passing Parameters

i need to complete this exercise but my code has some issues.
Here is the problem:
Create a script that can accept ANY amount of numbers from the command line. Process the numbers one at a time, where numbers greater than 10 print “large” and numbers less than or equal to 10 print “small”.
E.g. process 5 10 15 would print
small
small
large

and here is my code so far

Code:
if [ $@ -le "10" ]
then
echo "smaller"
else
echo "bigger"
shift
fi
if [ $@ -le "10" ]
then
echo "smaller"
else
echo "bigger"
shift
fi
if [ $@ -le "10" ]
then
echo "smaller"
else
echo "bigger"
shift
fi
if [ $@ -le "10" ]
then
echo "smaller"
else
echo "bigger"
shift
fi

any help would be greatly appreciated
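Two things stand out in the attempt above: [ $@ -le "10" ] expands all of the arguments into a single test (so the comparison breaks as soon as there is more than one number), and shift only runs in the else branch, so the script never reliably advances to the next argument. A possible rewrite, as a sketch: loop over the positional parameters so any amount of numbers works, and print the exact words the exercise asks for.

Code:
#!/bin/bash
# process each command-line argument in turn
for n in "$@"
do
    if [ "$n" -le 10 ]
    then
        echo "small"
    else
        echo "large"
    fi
done

Run as, e.g., ./process 5 10 15 (the script name is just an example) to get small, small, large.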

How To Shrink A Large Audio Ogg File

I would like to learn how to create a zip file. "Zip file" is a Windows/Microsoft term; I do not know what the equivalent is called in Linux.
What I am trying to do is send a copy of an internet radio show. The show is two hours long. When I try to attach the audio file to an email, I am told the file is too large.
I have permission from the host of the radio show to make a copy and send it to a friend.
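For what it's worth, the zip command exists on Linux too, but Ogg Vorbis audio is already compressed, so zipping it barely shrinks it; re-encoding at a lower quality setting saves far more. A sketch, assuming ffmpeg is installed and using placeholder file names:

Code:
# zipping works, but gains little on already-compressed audio
zip show.zip show.ogg

# re-encode at a lower Vorbis quality (smaller file, some loss of fidelity)
ffmpeg -i show.ogg -codec:a libvorbis -qscale:a 2 show_small.ogg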

Slackware 14.1/adduser Command/session Files

Hello All,

Under root we create a user named "template" using the adduser command, which is used to set up a KDE GUI logon screen for all users. When a user's name is first used to sign on, we get a window on the GUI that states "Run as Template. The action you requested needs additional privileges. Please enter password for template." (Note: no password for template is used.) There are four files that have been observed:
/usr/bin/hp-systray-session (followed by numbers)
/usr/bin/akonadi_agent_launcher-session (followed by numbers)
/usr/bin/khelpcenter-session (followed by numbers)
/usr/bin/nepomukcontroller-session (followed by numbers).

When we log out and then log back in, this window does not appear again on the GUI.

Any help for this one?

Thanks.

Regards,

Jeff

Remove Everything Before A Certain Number In A Txt File?

So,

hopefully this is the right section to ask this in!

I'm looking to remove everything before a certain point in a txt file. Long story short, I've made a number wordlist with Crunch, and I basically want to remove all the useless numbers before the numbers I want to start with.

So I want to remove everything before 1232120000 from the txt file. I've heard you can do it with sed or awk, but I'm not familiar with using either.
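A sketch of both approaches, with wordlist.txt standing in for the real file name: print from the first line that contains 1232120000 through to the end, which drops everything before it.

Code:
# sed: print from the first match to the end of the file
sed -n '/1232120000/,$p' wordlist.txt > trimmed.txt

# the same idea in awk
awk '/1232120000/ {found=1} found' wordlist.txt > trimmed.txt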

Crontab Random Delay Not Working

I am trying to delay the daily reboot of multiple Linux machines by a random time, within one hour.

This is to avoid all the Linux machines rebooting at the same time.

I also want to avoid specifying a time in cron. I want it to be completely random between 00:00 and 01:00.

So far I tried the commands below, but no luck. The machines still reboot at midnight.

Code:
@daily /bin/sleep $((RANDOM\%3600)) && /sbin/reboot
@daily /bin/sleep $(/usr/bin/expr $RANDOM \% 3600); /bin/reboot
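A likely cause, for what it's worth: cron runs commands with plain /bin/sh, and $RANDOM is a bash/ksh feature, so under sh it expands to nothing and the command degenerates to sleep 0. A sketch of two fixes: run the delay under bash explicitly (the % still has to be escaped inside a crontab), or use shuf, which avoids % entirely.

Code:
@daily /bin/bash -c 'sleep $((RANDOM \% 3600)) && /sbin/reboot'
@daily /bin/sleep $(/usr/bin/shuf -i 0-3599 -n 1) && /sbin/reboot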

Removing Multiple Lines From Cell Data In A .csv File

I am trying to process some .csv files with Linux as follows:

Some fields have data with newline characters embedded, like so:

"Bob Smith
531 Pennsylvania Avenue
Washington, DC"

(I verified the existence of the " characters via Wordpad. The file is too large to easily edit in Wordpad to get all the data for each row onto a single line.)

What Linux command would I use on the files to get the data in each cell on one line?

I have tried:

1. awk -v RS="" '{gsub (/\n/,"")}1' file > newfile

but the cell data was still being read in as if "531 Pennsylvania Avenue" were a brand new row in the CSV file.

2. Command 1 followed by awk -v RS="" '{gsub (/\r/,"")}1' newfile > finalFile

but that resulted in all of the data in the file being put onto a single line.

3. awk -v RS="" '{gsub (/\r\n/,"")}1' file > newFile

But that result was the same as attempt number 2.

How can I preprocess the file so that:

"Bob Smith
531 Pennsylvania Avenue
Washington, DC"

is read as a single field on a single line as part of the row it should be associated with, like

"Bob Smith 531 Pennsylvania Avenue Washington, DC"

Splitting Up A Csv File Into Lines

Hi all,
I have a large csv file, much of which Linux sees as one line. I've been trying to break it up with sed, from..
(name,address,etc),(name,address,etc),(name,address,etc) into

(name,address,etc)
(name,address,etc)
(name,address,etc)

each on its own line... can this be done with sed? And if not, what other commands can I use to do it?

Thanks
pulsar..
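With GNU sed this can be a one-liner; the \n in the replacement (a GNU extension) inserts the line breaks. A sketch with a placeholder file name:

Code:
sed 's/),(/)\n(/g' file.csv > split.csv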

Creating A Large Tar File From A Sequence Of Small Tar Files When One File Is Missing

I am trying to put back together a big tar file from some smaller tar files that I created several years ago. The issue is that in order to reassemble this large file, I must restore each volume in order using the command

Code:
tar -xMf cd-1.tar
Prepare volume #2 for 'cd-1.tar' and hit return:n cd-2.tar
Prepare volume #3 for 'cd-2.tar' and hit return:n cd-3.tar
and so forth.

I have fourteen files, cd-1.tar through cd-15.tar; the cd-9.tar file is missing and I assume it is gone for good. Now when I type the commands in I get the following:

Code:
-linux tarfile]$ tar -xMf cd-1.tar
Prepare volume #2 for `cd-1.tar' and hit return: n cd-2.tar
Prepare volume #3 for `cd-2.tar' and hit return: n cd-3.tar
Prepare volume #4 for `cd-3.tar' and hit return: n cd-4.tar
Prepare volume #5 for `cd-4.tar' and hit return: n cd-5.tar
Prepare volume #6 for `cd-5.tar' and hit return: n cd-6.tar
Prepare volume #7 for `cd-6.tar' and hit return: n cd-7.tar
Prepare volume #8 for `cd-7.tar' and hit return: n cd-8.tar
Prepare volume #9 for `cd-8.tar' and hit return: n cd-10.tar
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)
Prepare volume #9 for `cd-10.tar' and hit return: n cd-10.tar
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)
Prepare volume #9 for `cd-10.tar' and hit return: 
tar: This volume is out of sequence (10755138772 - 4889670868 != 6598651392)

As you can see, I do not have cd-9.tar, and that stops the untarring cold. However, I do have cd-10.tar, cd-11.tar, cd-12.tar, cd-13.tar, cd-14.tar, and cd-15.tar. I may have these files, but they cannot be put back into the main file because cd-9.tar is missing and everything must be restored sequentially.

Is there a way to complete this sequence of steps and add all fourteen files to the file bigbackup, leaving out cd-9.tar? That means that bigbackup will be incomplete, but that is better than no file at all, or than a bigbackup missing the six files on the back end.

Any help appreciated.

Thanks in advance.

Respectfully,


Newport_j