[comp.lang.perl] catching stuff from shell commands

df@sei.cmu.edu (Dan Farmer) (12/18/90)

  I'm trying to run a program (via system or otherwise) and catch the stdout
and stderr.  Throwing it to a file and reading it there is easy enough, but
I'm looping through this command lots of times, and I'd like to be able to
redirect it to a variable.  Any ideas?

 Tnx --

 -- dan
    df@death.cert.sei.cmu.edu

tchrist@convex.COM (Tom Christiansen) (12/18/90)

From the keyboard of df@sei.cmu.edu (Dan Farmer):
:  I'm trying to run a program (via system or otherwise) and catch the stdout
:and stderr.  Throwing it to a file and reading it there is easy enough, but
:I'm looping through this command lots of times, and I'd like to be able to
:redirect it to a variable.  Any ideas?

I'm not sure how much you are really looking for, but here are 
some ideas, ranging from the simple to the sublime.

    for $arg ( @list ) {
	$output .= `somecmd $arg 2>&1`;
    } 

That will run the same command with a different argument each time.
Stderr will be dup'd onto stdout (redirected into it, mixed in with it),
so both streams land in $output.
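
A variation, in case you'd rather keep each argument's output separate
instead of lumping it all into one string (%output is just my name for it):

    for $arg ( @list ) {
	$output{$arg} = `somecmd $arg 2>&1`;
    }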

Now, if you want to process a line at a time, you can do this:

    for $arg (@list) {
	open (CMD, "cmd $arg 2>&1 |") || die "can't run cmd: $!";
	while (<CMD>) {
	    # process one line of mixed stdout and stderr here
	}
	close CMD;
    }

This is also going to save memory if there's a lot of output
from the command.  

Now, I think maybe one of these should answer your question, but
just in case, let's go deeper.

Perhaps you say, "I would like to read stdout and stderr separately.
I don't want them both going into the same place."  (I got mail on this
question today.)

Well, I'm a little bit nervous about deadlock.  Consider this:

    while ( <HIS_STDERR> ) {
	# drain all of his stderr first ...
    }
    while ( <HIS_STDOUT> ) {
	# ... and only then start on his stdout
    }

This can easily lead to deadlock if he's got a bunch of stdout and very
little stderr: he blocks writing into a full stdout pipe while you block
reading his empty stderr.  If you reverse the order, it might get better
for one program, worse for the next.  The really right thing to do is to
use select() to know WHEN to read from which pipe.  You'll have to set up
two pipes (three if you must feed him input, too, but that again risks
deadlock), build the select masks using vec(), and then do your own fork
and exec.  This I leave as an exercise for the reader. :-)
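
For the curious, here's an untested sketch of that exercise; a_cmd,
its_args, and the 1024-byte reads are placeholders, and the filehandle
names are mine:

    # one pipe per stream; the child dups its stdout and stderr onto
    # the write ends and execs the command.
    pipe(OUT_READ, OUT_WRITE) || die "pipe: $!";
    pipe(ERR_READ, ERR_WRITE) || die "pipe: $!";

    $pid = fork;
    die "can't fork: $!" unless defined $pid;
    if ($pid == 0) {                            # in the child
	close(OUT_READ);  close(ERR_READ);
	open(STDOUT, ">&OUT_WRITE") || die "can't dup stdout: $!";
	open(STDERR, ">&ERR_WRITE") || die "can't dup stderr: $!";
	close(OUT_WRITE); close(ERR_WRITE);
	exec 'a_cmd', 'its_args';
	die "can't exec a_cmd: $!";
    }
    close(OUT_WRITE); close(ERR_WRITE);         # parent keeps the read ends

    # build the read mask with vec(); select() says which pipe has
    # data, and sysread() dodges stdio buffering entirely.
    $rin = '';
    vec($rin, fileno(OUT_READ), 1) = 1;
    vec($rin, fileno(ERR_READ), 1) = 1;
    $open = 2;

    while ($open) {
	next unless select($rout = $rin, undef, undef, undef) > 0;
	if (vec($rout, fileno(OUT_READ), 1)) {
	    if (sysread(OUT_READ, $buf, 1024)) { $his_stdout .= $buf; }
	    else { vec($rin, fileno(OUT_READ), 1) = 0; $open--; }
	}
	if (vec($rout, fileno(ERR_READ), 1)) {
	    if (sysread(ERR_READ, $buf, 1024)) { $his_stderr .= $buf; }
	    else { vec($rin, fileno(ERR_READ), 1) = 0; $open--; }
	}
    }
    close(OUT_READ); close(ERR_READ);
    wait;                                       # reap him; exit status in $?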

But if you're willing to send one stream to a file, then you can read the
other and not worry, nor do you have to go through any really funky
contortions involving pipes and execs (although you certainly may if you
really want to).  Sending just stderr to a file is easy:

    open (CMD, "a_cmd its_args 2>kid.out |");

Now you can read stdout.  
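
Continuing from that open, just to spell it out:

    while (<CMD>) {
	# a line of his stdout; his stderr is piling up in kid.out
    }
    close CMD;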

But what if you want to put stdout to a file and read from stderr?  The
trick is to put stdout to a file and then swap around stderr into stdout's
slot so you can get at it.  We'll use a temporary descriptor while we pull
the switch, then close it when we're done just to be tidy (and paranoid):

    open (CMD, "a_cmd its_args 3>kid.out 2>&1 1>&3 3>&- |");
    while (<CMD>) {
	# just got a line of his stderr
    } 
    close CMD;
    if ($?) { # command failed

That's put the command's stdout into kid.out, and you get to read its
stderr.  This is an excellent reason why it would be Evil if perl were to
use the csh for its system, popen|, or backticks (despite the pleas of
certain users of mine): the csh can only redirect stdout and stderr
together with >&, so descriptor shuffles like the one above wouldn't even
parse.  For the <*> globbing stuff the csh does make sense -- at least
until that's built into perl itself.

--tom
--
Tom Christiansen		tchrist@convex.com	convex!tchrist
"With a kernel dive, all things are possible, but it sure makes it hard
 to look at yourself in the mirror the next morning."  -me