This is the mail archive of the mailing list for the Cygwin project.
RE: Cat and Head Problems with Binary Files
- To: <firstname.lastname@example.org>
- Subject: RE: Cat and Head Problems with Binary Files
- From: "Jeffry T Ross" <email@example.com>
- Date: Fri, 23 Jul 1999 12:35:29 -0400
- Reply-To: <firstname.lastname@example.org>
I thought the suggestion to use dd was a good one, although a bit excessive
when one only wants to move bytes from a file to a pipe without any
conversion. So I tried it:
dd if=test1 > test2
test1 was 1296 bytes and test2 became 1301 bytes.
So then I tried dd directly to a file.
dd if=test1 of=test2
this worked fine.
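On a system where pipes and redirections are byte-transparent (any POSIX system today), the two invocations can be compared directly. A minimal sketch, reusing the test1/test2 names from the message (test3 is added here so both outputs can coexist):

```shell
# Create a 1296-byte binary test file (size matches the message).
dd if=/dev/urandom of=test1 bs=1 count=1296 2>/dev/null

# Route dd's output through a shell redirection...
dd if=test1 > test2 2>/dev/null
# ...and write directly with of=.
dd if=test1 of=test3 2>/dev/null

# On a byte-transparent system all three are 1296 bytes. The 1301-byte
# result reported in the message is consistent with a few LF bytes being
# expanded to CR+LF somewhere in the redirection path.
wc -c test1 test2 test3
cmp test1 test2 && cmp test1 test3 && echo identical
```

The 5-byte growth reported (1296 to 1301) is about what you would expect from CR/LF expansion of the roughly five 0x0A bytes a 1296-byte random file contains, which is why text-mode translation is the natural suspect.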
So now I don't suspect cat but the piping and redirection.
The question is: how does Cygwin implement its piping and redirection?
Is this problem imposed by the underlying WIN/DOS or is it a bug in BASH?
The pipe shouldn't be trying to interpret the data going through it.
That's the job of the process at the end of the pipe, be it the terminal
or another program.
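That transparency claim is easy to check on a POSIX system, where a pipe is just a byte stream between file descriptors. A minimal sketch (file names are illustrative):

```shell
# Generate 4 KiB of arbitrary bytes, which will include 0x0A and 0x1A
# values that text-mode I/O layers are known to mangle.
dd if=/dev/urandom of=orig.bin bs=4096 count=1 2>/dev/null

# Push the bytes through a pipe and a redirection; cat must not alter them.
cat orig.bin | cat > copy.bin

# Byte-for-byte comparison: silence from cmp means the pipe was transparent.
cmp orig.bin copy.bin && echo "pipe is byte-transparent"
```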
From: Randall Schulz [mailto:email@example.com]
Sent: Thursday, July 22, 1999 9:24 PM
To: firstname.lastname@example.org; email@example.com
Subject: RE: Cat and Head Problems with Binary Files
Who ever said head and cat were the way to deal with a problem like this?
Head may have a -c option, but its intended use is to capture a certain
number of *lines* from the beginning of a file.
For what you want to do, you should try dd. It's tailor-made for this sort
of thing. It knows not of lines, but every other manner of file copying
based on regular file subdivisions is its forte. It can also perform some
transformations such as case conversion, character code translation, and byte swapping.
Check it out!
Palo Alto, CA USA
At 05:49 PM 7/22/99 , Jeffry T Ross wrote:
>Who ever said that cat and head are textutils?
>On the Unix side of the world they're commonly used
>on binary files. If you have a 10gig file of binary
>data, what's the easiest way to get a 10k chunk?
>How about: head -c 10000 bigfile > littlefile
>This works in Unix because Unix thinks all files are
>binary, and that's because all files are binary. The
>notion of text files is a bogus limitation imposed by
>Is there a reason why having cat treat all files as binary
>would cause erroneous performance when cat was used on a
>file you'd consider to be text?
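The head -c recipe quoted above behaves as described on any byte-transparent system; here it is scaled down from 10 GB to a size that is quick to generate:

```shell
# Build a 64 KiB stand-in for the 10 GB binary file.
dd if=/dev/urandom of=bigfile bs=1024 count=64 2>/dev/null

# Take the first 10000 bytes, exactly as in the quoted command.
head -c 10000 bigfile > littlefile

# The chunk is exactly 10000 bytes and matches the start of the big file.
wc -c < littlefile
cmp -n 10000 littlefile bigfile && echo "first 10000 bytes match"
```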
Want to unsubscribe from this list?
Send a message to firstname.lastname@example.org