[svlug] Running a command n times simultaneously

Joe Buck Joe.Buck at synopsys.COM
Tue Apr 25 11:12:10 PDT 2006


On Tue, Apr 25, 2006 at 09:23:22AM -0700, Rufoo wrote:
> > For a performance measurement task I need to run a
> > command n>1 times simultaneously. Is there any tool
> > that can help me do this? 

On Tue, Apr 25, 2006 at 10:58:13AM -0700, Andrew Stitt wrote:
> You can use the shell's '&' operator to run a command in the background,
> and put it in a loop to launch n background copies of the command.

But that doesn't provide a clean way of timing when the processes have
all completed.

You can write a Perl script to manage the multiple processes.  The
three magic functions you need are "fork", "exec", and "wait".

You launch your command with a sequence like

if (!defined($pid = fork)) {
   # fork returns undef on failure
   die "fork failed: $!";
}
elsif ($pid == 0) {
   # Child: run the command (exec never returns on success)
   exec "command arg1 arg2 ...";
   die "exec failed: $!";
}
else {
   # We are the parent, $pid holds the process ID of the child
   # do any bookkeeping here (if you want to build a table of the procs)
}

If you want to launch n copies in parallel, just execute the above
code n times; it will fork off n children.
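For instance, such a loop might look like this (a sketch: the count
and the command are placeholders you would substitute):

```perl
my $n = 4;                         # number of copies (placeholder)
my @pids;
for (1 .. $n) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec "true";               # placeholder command
        die "exec failed: $!";     # reached only if exec fails
    }
    push @pids, $pid;              # parent: remember the child's pid
}
```

The parent ends up holding all n child pids in @pids and can then
go on to wait for them.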

To wait for the jobs to complete, you use the wait function:

$waitpid = wait;

waits (blocks) until a child process completes, and returns its
process ID.  It returns -1 if there are no more children.  Don't
worry, you won't miss any: finished children hang around as zombies
until the parent collects their exit status.  After you execute the
above statement n times, you can report the wall-clock time for
running n copies of the command.
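Putting the pieces together, here is a minimal sketch of the whole
measurement (n and the command are placeholders; Time::HiRes is only
there for sub-second wall-clock resolution):

```perl
use strict;
use warnings;
use Time::HiRes qw(time);

my $n   = 4;               # number of parallel copies (placeholder)
my @cmd = ("true");        # placeholder command and its args

my $start = time;
for (1 .. $n) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec @cmd;
        die "exec failed: $!";    # reached only if exec fails
    }
}

# Reap all the children; wait returns -1 once none are left.
my $reaped = 0;
$reaped++ while wait != -1;

printf "%d copies finished in %.3f seconds\n", $reaped, time - $start;
```

The timer starts before the first fork and stops after the last
child is reaped, so the printed figure is the wall-clock time for
all n copies together, not the sum of their individual run times.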

Python, Ruby, etc. have similarly easy-to-use process management
facilities.
