[svlug] Re: [svlug]how to copy a bunch of "." files?
tin at le.org
Mon May 8 00:27:08 PDT 2000
> I'd tend to agree with you here. The info pages on find(1) seem to
> support this -- and using the exec flag to grep through something
> like the linux source tree takes a lot longer than it does using
> xargs on my system.
> But based on what you said, I'd expect to see a forest of grep
> processes running, one per source file in the linux source tree, all
> looking for something (for the test, I looked in all source files for
> the word 'panic'). I did not notice anything forking new processes,
> despite repeated utterances of 'ps alx'. But it took substantially
> longer, with more think time at the start of the process than with
> an xargs solution.
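The timing difference described above can be sketched concretely; the source-tree path here is just an illustration:

```shell
# -exec with \; forks and execs one grep per matching file:
find /usr/src/linux -name '*.[ch]' -exec grep panic {} \;

# xargs batches many filenames onto each grep command line,
# so only a handful of grep processes run in total:
find /usr/src/linux -name '*.[ch]' -print | xargs grep panic
```

A side effect of the batching: grep sees multiple filenames per invocation, so it prefixes each match with its filename, which is the presentation difference noted below.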
Modern find serializes -exec in a directory. I've never worked on the find
source, so I don't know the exact algorithm used, but I distinctly remember
shooting myself in the foot ;-) with "find -exec" on an early *NIX
system about 15 years ago. I think it might have been System III.
That particular find command tried to "optimize" its work by creating
multiple child processes to handle subdirs (one per subdir), and I had a
fairly deep and wide source tree, which quickly overwhelmed the poor
3B2 I was on. Pissed off a lot of people in my dept...
That 3B2 had all of 1MB of RAM and a 10MB drive, shared by 30 people.
> Furthermore, the xargs gives the data in a better presentation: the
> find -exec doesn't give the filenames, but the xargs does (using
> find . -name "*.[ch]" | xargs grep 'panic').
You could use -print with find to get filenames. Find is kind of a
kitchen-sink command: many options and lots of things you can use it for.
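Two common ways to get filenames out of a "find -exec" pipeline (a sketch; the pattern and search string are just examples):

```shell
# -print before -exec emits each pathname just before that file's matches:
find . -name '*.[ch]' -print -exec grep panic {} \;

# Or hand grep /dev/null as an extra argument; since grep then has more
# than one file on its command line, it prefixes every match with the
# filename, mimicking the xargs output:
find . -name '*.[ch]' -exec grep panic /dev/null {} \;
```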
Internet Security and Firewall Consulting
Tin Le - tin at le.org