shell cmds crapping out with large numbers of files
Bruce Dobrin
dobrin@imageworks.com
Tue May 25 23:48:00 GMT 2004
{uname -a
CYGWIN_NT-5.1 THEODOLITE 1.5.9(0.112/4/2) 2004-03-18 23:05 i686 unknown unknown Cygwin
}
I need to process very large numbers (up to 100,000) of image files. I
noticed my foreach loops start crapping out when the number of files grows
to around 1500. It feels like a 32-bit memory addressing problem to me, but I
don't know how to check for that. I wrote a foreach loop to generate files
(0 to xxxx) and then list them, and it died at 1471.
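The test loop was roughly like the sketch below (tcsh; the upper bound here is
an arbitrary example, the flern/plern names are throwaway test names, and I'm
assuming GNU seq is available to generate the zero-padded numbers):

# generate a pile of empty test files, then try to list them
set last = 2000                          # arbitrary example count
foreach n (`seq -f '%07g' 0 $last`)
    touch flern${n}.plern.poo
end
ls flern* | wc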
Here is an example of the problem:
dobrin@THEODOLITE:/home/dobrin/longtest> ls flern* | wc
1471 1471 32726
dobrin@THEODOLITE:/home/dobrin/longtest> touch flern0001471.plern.poo
dobrin@THEODOLITE:/home/dobrin/longtest> ls flern* | wc
2 [main] -tcsh 2396 cmalloc: cmalloc returned NULL
0 0 0
Segmentation fault (core dumped)
dobrin@THEODOLITE:/home/dobrin/longtest> rm flern0001471.plern.poo
dobrin@THEODOLITE:/home/dobrin/longtest> ls flern* | wc
1471 1471 32726
I am currently processing the files in batches of 1000 to avoid the problem.
I tried the same thing on my Linux box and it works fine.
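The batching workaround looks roughly like this sketch (tcsh); process_image is
just a stand-in for the real per-file command, and the 4-digit prefix assumes
the 7-digit numbering shown above, so each glob matches at most 1000 names:

# walk the files in batches of 1000 so no single glob gets too big
set b = 0
while ($b < 100)
    set prefix = `printf 'flern%04d' $b`
    foreach f (${prefix}*)
        process_image $f
    end
    @ b++
end

(An empty batch would need a guard or `set nonomatch`, but for this sketch I
assume every batch has files.)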
Thank you