[comp.unix.questions] redirected output is buffered, how do you flush it?

rhoward@msd.gatech.edu (Robert L. Howard) (02/05/91)

I have a script that, after several pipe stages, outputs a line
of information.  When run from a tty it outputs a line every
few seconds to several minutes.  Wrapped around the commands
is an infinite loop.  I can specify a trap command that will
close up things gracefully (print summary info etc.).

The problem comes in when I run:

% script > some_file

and then kill it some number of minutes later.  The total output
of the script is still in some buffer somewhere and doesn't make
it to the file.  Is there some command I can put in the 'trap' to
force it to flush the buffers?  Or is there a recommended way to
kill the job (other than ^C) that will force the buffers to flush?

Here is the script if you are wondering what I am talking about...

------------------------------------------------------------------
#! /bin/sh
#
pstat=/usr/etc/pstat


trap <Some-command_here> 1 2 3 14 15

echo "Starting at `date`"
echo "Interval is $1 seconds."
echo ""

while true
do
    $pstat -T | sed -e 's/\//\ /g'
    sleep $1
done | awk '
/files/	{ if ($1 > files) {
	files = $1
	printf ("max files\t%5d out of %5d, or %6.2f%\n", \
		files, $2, 100*files/$2) }
    }' -
----------------------------------------------------------------

Thanks

Robert
--
| Robert L. Howard             |    Georgia Tech Research Institute     |
| rhoward@msd.gatech.edu       |    MATD Laboratory                     |
| (404) 528-7165               |    Atlanta, Georgia  30332             |
-------------------------------------------------------------------------
|     "Reality is showing us that perhaps we should do with nuclear     |
|      power the same thing Kellogg's is advocating for Corn Flakes -   |
|      Discover it again for the first time." -- John De Armond         |

tchrist@convex.COM (Tom Christiansen) (02/05/91)

From the keyboard of rhoward@msd.gatech.edu (Robert L. Howard):
:I have a script that, after several pipe stages, outputs a line
:of information.  When run from a tty it outputs a line every
:few seconds to several minutes.  Wrapped around the commands
:is an infinite loop.  I can specify a trap command that will
:close up things gracefully (print summary info etc.).
:
:The problem comes in when I run:
:
:% script > some_file
:
:and then kill it some number of minutes later.  The total output
:of the script is still in some buffer somewhere and doesn't make
:it to the file.  Is there some command I can put in the 'trap' to
:force it to flush the buffers?  Or is there a recommended way to
:kill the job (other than ^C) that will force the buffers to flush?

This is a general problem that comes up often, and I don't know any
way of doing it unless you can get the program doing the writes to 
flush its buffers now and then.  If awk only had a way to force flushes,
then it wouldn't be so rough; sadly, it doesn't.
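
You can watch this happen with a throwaway test (only a sketch; the
file name and the five-second wait are arbitrary).  When stdout is a
tty, stdio line-buffers awk's output; when it's a file or a pipe, the
output is block-buffered, so nothing shows up until the buffer fills
or awk exits normally:

    # feed awk one line per second, with its stdout redirected to a file
    ( while :; do echo tick; sleep 1; done ) | awk '{ print }' > /tmp/demo &
    sleep 5
    ls -l /tmp/demo    # still 0 bytes: the ticks are sitting in awk's buffer
    kill $!            # awk dies without flushing, and those lines are lost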

:Here is the script if you are wondering what I am talking about...
:
:------------------------------------------------------------------
:#! /bin/sh
:#
:pstat=/usr/etc/pstat
:
:
:trap <Some-command_here> 1 2 3 14 15
:
:echo "Starting at `date`"
:echo "Interval is $1 seconds."
:echo ""
:
:while true
:do
:    $pstat -T | sed -e 's/\//\ /g'
:    sleep $1
:done | awk '
:/files/	{ if ($1 > files) {
:	files = $1
:	printf ("max files\t%5d out of %5d, or %6.2f%%\n", \
:		files, $2, 100*files/$2) }
:    }' -

Here's a fairly direct translation of your program into perl, 
which does have a way to set buffering: if $| is non-zero, then
output will be flushed after each print.  It even looks a lot
like your script.  

    #!/usr/bin/perl
    $pstat='/usr/etc/pstat';
    $nap = shift; 		# i like to name my args
    $| = 1; 			# auto-flush at end of print statements
    print "Starting at ", `date`;
    print "Interval is $nap seconds.\n\n";
    while (1) {
	`$pstat -T` =~ /(\d+)\/(\d+) files/; # $1 and $2 get set here
	if ($1 > $files) {
	    $files = $1;
	    printf ("max files\t%5d out of %5d, or %6.2f%%\n", 
		    $files, $2, 100*$files/$2);
	}
	sleep $nap;
    }
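
You'd run it just the way you run the shell version (flushstat is only
a name I made up for the file you save it as, and 10 is just an example
interval):

    % flushstat 10 > some_file

With $| set, each max-files line reaches some_file the moment it is
printed, so there is nothing left sitting in a buffer when you kill it.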

--tom
--
"Still waiting to read alt.fan.dan-bernstein using DBWM, Dan's own AI
window manager, which argues with you 10 weeks before resizing your window." 
### And now for the question of the month:  How do you spell relief?   Answer:
U=brnstnd@kramden.acf.nyu.edu; echo "/From: $U/h:j" >>~/News/KILL; expire -f $U

brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (02/06/91)

In article <1991Feb05.000629.7401@convex.com> tchrist@convex.COM (Tom Christiansen) writes:
> This is a general problem that comes up often, and I don't know any
> way of doing it unless you can get the program doing the writes to 
> flush its buffers now and then.

Here's a very easy general solution: Run % pty script > some_file
rather than % script > some_file. This will work for any program that
uses stdio normally. pty appeared in comp.sources.unix volume 23 and is
available via anonymous ftp to 128.122.128.22 in pub/hier/pty/*. It
works on most BSD systems. The original poster should check around at
gatech for a copy.
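
With the interval argument the original script expects, the invocation
would be something like (30 seconds is just an example):

    % pty script 30 > some_file

The script itself doesn't change; pty merely hands it a pseudo-terminal,
so stdio stays line-buffered the way it would be on a real tty.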

---Dan
Stupidity, n.: An overwhelming desire to rewrite one-line shell scripts
as 36-line Perl scripts so they run 6% faster. See Christiansen, Tommy.

jde@uwbln.uniware.de (Jutta Degener) (02/10/91)

Robert L. Howard asks:
:I have a script that, after several pipe stages, outputs a line
:of information. [...]
:
:The problem comes in when I run:
:
:% script > some_file
:
:and then kill it some number of minutes later.  The total output
:of the script is still in some buffer somewhere and doesn't make
:it to the file.  Is there some command I can put in the 'trap' to
:force it to flush the buffers?  Or is there a recommended way to
:kill the job (other than ^C) that will force the buffers to flush?

Tom Christiansen answers:
> Here's a fairly direct translation of your program into perl, 

To which Dan Bernstein replies:
> Here's a very easy general solution: Run % pty script > some_file

When you hit '^C', both the shell and its subprocess, awk, are killed.
Unfortunately, as Tom already mentioned, awk doesn't flush its buffers.

trap "" 1 2 3 .. etc. makes both a shell and its subprocesses ignore
those signals (on the systems I checked).  The awk on the right-hand
side of the pipe then survives the interrupt, reads end-of-file once
the writers on the left are gone, and flushes its buffers when it
exits normally after the END block.  Try:

trap "exit 0" 1 2 3 15
{ echo piling lies
  while :
  do
	sleep 1
	echo upon lies
  done } | (
	trap "" 1 2 3 15;
	awk '/lies/{ l++ } END { printf("%d lies successfully piled.\n", l ) }'
)
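
Grafted onto the original pstat script, the same trick would look
roughly like this (a sketch combining the original script with the
trap arrangement above, not tested here):

    #! /bin/sh
    pstat=/usr/etc/pstat

    trap "exit 0" 1 2 3 14 15   # the summary commands go here, before the exit

    echo "Starting at `date`"
    echo "Interval is $1 seconds."
    echo ""

    while true
    do
        $pstat -T | sed -e 's/\//\ /g'
        sleep $1
    done | (
        # the subshell, and the awk it starts, ignore the signals,
        # so awk lives to see end-of-file and flushes as it exits
        trap "" 1 2 3 14 15
        awk '/files/ { if ($1 > files) {
                files = $1
                printf("max files\t%5d out of %5d, or %6.2f%%\n",
                        files, $2, 100*files/$2)
            }
        }'
    )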

--
#include <std/disclaimer.h>      		    jutta@tub.cs.tu-berlin.de

brnstnd@kramden.acf.nyu.edu (Dan Bernstein) (02/11/91)

In article <1991Feb10.081651.24841@uwbln.uniware.de> jde@uwbln.uniware.de (Jutta Degener) writes:
> > Here's a very easy general solution: Run % pty script > some_file
> When you hit '^C', both the shell and its subprocess, awk, are killed.
> Unfortunately, as Tom already mentioned, awk doesn't flush its buffers.

Fortunately, a program running under pty sees a terminal for its input
and output, so any program using stdio (including awk) will flush its
buffers just as if you ran it without redirection. That's why it's a
solution.

> trap "" 1 2 3 .. etc will ignore signals for both a shell and its 
> subprocesses. (On the systems I checked.)

That won't solve the poster's problem. He's saying ``output is buffered,
so when I kill awk, I lose output.'' Tom and I are saying ``so don't
buffer your output.'' You're saying ``so don't kill awk.''

---Dan