Unacceptable behavior -- slowing down script execution

SJ Wright sjwright68@charter.net
Mon Sep 20 20:58:00 GMT 2010


mike marchywka wrote:
> On 9/17/10, SJ Wright <sjwright68 charter.net> wrote:
>   
>> 4. Is it normal for any script to run CPU usage up to 100%?
>>     
>
> Unless it is blocking for something like IO including VM swaps, why not?
>
>
>   
>> Regarding #4:
>> I have a script that I ran in GNOME Terminal less than an hour ago. I
>> "time"d it -- the return was 20.6 seconds on the first line (real?). I
>> ran the same script fifteen minutes later, evaluating identical files of
>> the same type, length (5.37kb and 345b ASCII text) and time stamp, and
>> after 7 minutes it was barely one-eighth complete. That's when I checked
>> Task Manager and found my CPU usage was at 100% and three bash.exe's
>> were running simultaneously. Admittedly the script calls on several
>>     
>
> This sounds like a threading problem if I had to guess, but it could
> be anything that changed between runs - certainly timing will make
> these things come and go. But having no idea what you mean by "identical"
> instead of "the same", there are a lot of things that could have changed.
> What did it seem to be trying to do? Often in cases like this
> the alarming situation is where your cpu usage drops to low values
> and your disk light gets stuck on as you have depleted memory.
>
> I guess I'm also not sure what you think a usual or good number
> of processes should be. A number of anecdotes have been
> reported about slower performance on 64-bit or multi-core machines
> than on more primitive older computers, and it is easy to
> speculate on reasons why that could happen,
> but hard to make a clear diagnosis without an explicit test case.
>
>   
They were the same files, in terms of number of lines, bytes per line, 
and so forth. Likely the only differences were to be found in their 
timestamps.
Now that I've got rid of bash-completion (thanks again, Cyrille), this one 
script seems to be the only one that will push CPU usage to 100% and keep 
it there for the duration of its run.
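
For what it's worth, here is the sort of sanity check I mean to run the 
next time it happens. The file and script names below are placeholders, 
not the real ones:

  # Confirm the two inputs really are byte-for-byte identical between
  # runs, not just the same size and line count.
  md5sum tags_large.txt tags_small.txt

  # Time the run itself: if "real" is much larger than "user" plus
  # "sys", the script is mostly waiting on I/O or swap; if they are
  # roughly equal, it really is burning CPU the whole way through.
  time ./exif_batch.sh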

Call me strange, but I like my Task Manager to show no more than 51 
processes at a time. I also like my scripts to spawn only one additional 
instance of the shell they're written for, beyond whatever I already have 
running when I start them.
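
When the process count does climb, something like this (Cygwin's ps, with 
the same made-up script name) should tell me how many bash.exe instances 
the script itself is responsible for:

  # Count bash processes before and during the run; the [b]ash pattern
  # keeps grep from counting its own process.
  ps -ef | grep -c '[b]ash'
  ./exif_batch.sh &
  sleep 5
  ps -ef | grep -c '[b]ash'
  wait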

I agree a threading problem is possible, but I think it's more likely a 
matter of how I have the script written.

A breakdown of the calls to non-internal commands (not Bash built-ins, so 
far as I can judge) would look like this:
 
  -- 8 to Exiv2 v.0.20 by Andreas Huggel.
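
If those eight exiv2 calls turn out to be per file rather than per run, 
that alone is a fair amount of fork/exec work under Cygwin. One call that 
dumps all the Exif tags, filtered afterwards, would keep the process count 
down. The image file and tag names here are only examples:

  # One exiv2 process per image instead of eight; pick out the wanted
  # tags afterwards with grep.
  exiv2 -pt photo.jpg | grep -E 'DateTimeOriginal|Model'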

Likely there are more external calls than those eight; I may be wrong. I 
don't normally keep track of which command came from where, or of which 
calls end up as a process that "ps" can see. Perhaps, since this has 
involved more than two shells running simultaneously (going by nothing 
more than what Task Manager shows), it is instead a matter of too high a 
demand *on* the built-ins. Or I've got another software package, either 
Cygwin-adapted or just a lucky compile on my part, that's kicking in and 
making itself known when I launch this one particular script.
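
One thing I can try before posting the whole thing: run the script under 
xtrace and keep the log, so the external commands, and roughly when each 
one fires, show up in plain text. The script name is, again, a placeholder:

  # ${SECONDS} in PS4 is re-expanded for every traced command, so the
  # log shows roughly when each external call happens.
  PS4='+${SECONDS}s: ' bash -x ./exif_batch.sh 2> trace.log
  less trace.log

That should at least show whether the slowdown clusters around the exiv2 
calls or somewhere else entirely.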

I can post the script to the list as text if anyone thinks it's worth 
pursuing this line of logic.

Thanks for the replies and the help thus far. Looking forward to the 
same continuing.

Steve Wright

--
Problem reports:       http://cygwin.com/problems.html
FAQ:                   http://cygwin.com/faq/
Documentation:         http://cygwin.com/docs.html
Unsubscribe info:      http://cygwin.com/ml/#unsubscribe-simple


