haynes@ucscc.UCSC.EDU (Jim Haynes) (05/12/89)
We just discovered on our system, which started life as 3BSD and has been
upgraded over the years to 4.3 (but not Tahoe), that there are both a
/usr/ucb/grep and a /bin/grep.  The /bin/grep seems to have powers that the
other doesn't, such as working with the expression '\(.*\)\1'.  A 4.3 Tahoe
system seems to have only the /usr/ucb/grep, which does not work with that
expression.  Does anybody remember how we got into this mess, and why?

haynes@ucscc.ucsc.edu  haynes@ucscc.bitnet  ..ucbvax!ucscc!haynes
"Any clod can have the facts, but having opinions is an Art."
        Charles McCabe, San Francisco Chronicle
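[The difference Jim describes can be checked directly.  A minimal sketch, assuming a grep whose basic regular expressions support \(...\) groups and \1 back-references, as the old /bin/grep did and as GNU grep still does:]

```shell
# \(ab\)\1 matches only lines in which the captured group "ab"
# is immediately repeated, i.e. lines containing "abab":
printf 'abab\nabc\n' | grep '\(ab\)\1'
```

A grep without back-reference support (as on the 4.3 Tahoe /usr/ucb/grep described above) rejects or mishandles the \1.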
zuro%nrl.decnet@ccf3.nrl.navy.mil (NRL::ZURO) (06/13/89)
grep - global regular expression print
) (06/15/89)
I believe that grep translates to "global regular expression print".
MSTACC%RITVAX.BITNET@cornellc.cit.cornell.ed (Mark S. Tremblay) (06/16/89)
>Subject: grep
>
> I believe that grep translates to "global regular expression print".

Enough with the grep translations!  I think we get the idea.
runyan@hpirs.HP.COM (Mark Runyan) (06/16/89)
>/ zuro@ccf3.nrl.navy.mil / 4:44 am Jun 13, 1989 /
>
> I believe that grep translates to "global regular expression print".

From _The_Unix_Programming_Environment_ by Brian Kernighan and Rob Pike,
p. 18: "...The name comes from the ed command g/regular-expression/p,..."

This may not be the right answer, but it *is* documented.  :-)
mikey@ontek.UUCP (Mike Lee) (06/17/89)
I have just read the 9,000th explanation of what grep stands for.  If I
see one more I will flame the poster to a crisp over an open pit barbecue
and eat them for dinner tonight.  I hope that qualifies as mindless
cannibalism.  Please note how nicely this paragraph has been adjusted.

Mike ("Please, call me Bruce, I insist") Lee
Ontek Corporation ("Better living through programming")
mikey@ontek.uucp   ...uunet!ontek!mikey   1-714-768-0301
kim@kim.misemi (Kim Letkeman) (06/20/89)
In article <4750023@hpirs.HP.COM>, runyan@hpirs.HP.COM (Mark Runyan) writes:
> >/ zuro@ccf3.nrl.navy.mil / 4:44 am Jun 13, 1989 /
> > > I believe that grep translates to "global regular expression print".
>
> From _The_Unix_Programming_Environment_ by Brian Kernighan and Rob Pike,
> p. 18: "...The name comes from the ed command g/regular-expression/p,..."
>
> This may not be the right answer, but it *is* documented. :-)

You're both right, since "g/regular-expression/p" is the ed command to
do a global search for a regular expression and print each line that
matches.  The short way of saying this is "global regular expression
print".

Surely there can't be many more catty ways of showing everyone how
bright we are?

Kim    ...!uunet!mitel!spock!kim
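[The g/re/p operation Kim describes can be seen without firing up ed itself; sed, ed's stream-editing descendant, offers the same global match-and-print.  A minimal sketch, assuming any POSIX sed:]

```shell
# ed's "g/foo/p" prints every line matching /foo/;
# sed -n '/foo/p' does exactly the same thing on a stream:
printf 'foo\nbar\nfoobar\n' | sed -n '/foo/p'
```

Both print the matching lines "foo" and "foobar", which is precisely what grep does with the same pattern.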
hovanes@db.UUCP (Kenneth Hovanes) (06/23/89)
In article <299@ontek.UUCP>, mikey@ontek.UUCP (Mike Lee) writes:
> I have just read the 9,000th explanation of what grep stands for.
> If I see one more I will flame the poster to a crisp over an open
> pit BBQ.

This is all fine and dandy but what does 'grep' stand for?

Yuk, Yuk, Yuk.........
mikey@ontek.UUCP (Mike Lee) (06/30/89)
In article <224@db.UUCP>, hovanes@db.UUCP (Kenneth Hovanes) writes:
> In article <299@ontek.UUCP>, mikey@ontek.UUCP (Mike Lee) writes:
> > I have just read the 9,000th explanation of what grep stands for.
> > If I see one more I will flame the poster to a crisp over an open
> > pit BBQ.
>
> This is all fine and dandy but what does 'grep' stand for?
>
> Yuk, Yuk, Yuk.........

You have your choice:

	Grill Redundant Explanation Posters
	Gouge Repeaters Eyes Please
	GREP Really Entertains People
	Get Ready for Exceptional Pyrotechnics

Please note how poorly this paragraph has been adj-
usted.

Yum, Yum, Yum........

Mike ("I would like... a shrubbery") Lee
mikey@ontek.uucp   ...uunet!ontek!mikey
athalyes@lafcol.UUCP (Athalye Salil ) (07/11/89)
I always thought that GREP stood for:

	Global Regular Expression Print

Although I have no idea what that is supposed to mean.  Hope it helps!

-Salil Athalye

Salil Athalye (UNIX Neophyte!)
Lafayette College, Easton PA 18042
athalyes@lafcol
When it comes to UNIX, 'ignorance is bliss'!
bobmon@iuvax.cs.indiana.edu (RAMontante) (07/13/89)
- I always thought that GREP stood for:
-	Global Regular Expression Print
- Although I have no idea what that is supposed to mean.  Hope it helps!
-
- -Salil Athalye (UNIX Neophyte!)
- athalyes@lafcol

It stands for "_GUnix _RQuestions _EFrom _PHELL!"  (the `_G', `_R',
`_E', and `_P' are all silent).

- When it comes to UNIX, 'ignorance is bliss'!

...I think I've found your problem --- too much bliss...
bph@buengc.BU.EDU (Blair P. Houghton) (07/22/89)
In article <299@ontek.UUCP> mikey@ontek.UUCP (Mike Lee) writes:
>I have just read the 9,000th explanation of what grep stands for.
>If I see one more I will flame the poster to a crisp over an open
>pit barbecue and eat them for dinner tonight.  I hope that
>qualifies as mindless cannibalism.  Please note how nicely this
>paragraph has been adjusted.

God Reads Ezra Pound.

				--Blair
				  "Bait?  Me, I'm _just_your_fish_.
				   I like me medium rare.
				   (Please note how nicely this
				   sarcastic witticism has been
				   adjusted. :)"
jms@hcx.uucp (Michael Stanley) (07/23/89)
In article <4750023@hpirs.HP.COM>, runyan@hpirs.HP.COM (Mark Runyan) writes:
> From _The_Unix_Programming_Environment_ by Brian Kernighan and Rob Pike,
> p. 18: "...The name comes from the ed command g/regular-expression/p,..."
>
> This may not be the right answer, but it *is* documented. :-)

I submitted this response a week and a half ago with the comment that I
had read it in an old UNIX manual.  I asked for confirmation, but
apparently 'those who know' and keep telling us not to be foolish with
guesses either didn't deign to let us in on the answer, or they
themselves don't really know.  Personally, I agree with this answer.
It now has the credit of being documented in *two* places.

Michael Stanley
jms@hcx.uucp
rjshaw@ramius.llnl.gov (Robert Shaw) (10/23/90)
Question from Len Teifel:
|
| I have a main directory with hundreds of subdirectories,
| and I want to find a file with a particular string, say "xyz"
| The grep command only works in one directory at a time.  Is there
| a way of searching my whole directory structure to find a file
| with a particular string?
|

try:	find ~ -name \*xyz\* -print

Or, not quite as nice:	ls -R ~ | grep xyz

===============================================================================
rjshaw@ramius.llnl.gov                                        R o b  S h a w
The Cosby's are precisely what's wrong with television today...
===============================================================================
rjshaw@ramius.llnl.gov (Robert Shaw) (10/23/90)
OOPS!  I just realized that I misunderstood the question:
|
| I have a main directory with hundreds of subdirectories,
| and I want to find a file with a particular string, say "xyz"
| The grep command only works in one directory at a time.  Is there
| a way of searching my whole directory structure to find a file
| with a particular string?
|

try:	find ~ -exec grep xyz {} \; -print

My apologies...

rjshaw@ramius.llnl.gov                                        R o b  S h a w
tiefel@sunshine.Kodak.COM (Lenny Tiefel) (10/23/90)
I have a main directory with hundreds of subdirectories, and I want to
find a file with a particular string, say "xyz".  The grep command only
works in one directory at a time.  Is there a way of searching my whole
directory structure to find a file with a particular string?

Thanks.

--
Len Tiefel    tiefel@kodak.com
max@lgc.com (Max Heffler) (10/23/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>I have a main directory with hundreds of subdirectories,
>and I want to find a file with a particular string, say "xyz"
>The grep command only works in one directory at a time.  Is there
>a way of searching my whole directory structure to find a file
>with a particular string?
>

Try this:

	cd x	(where x is the head of the tree to search)
	find . -type f -exec grep xyz {} /dev/null \; | tee $HOME/xyz.out

The cd is done separately because I have had problems with symbolic
links on some machines.  This will search files only (-type f), and if
your grep supports a caseless option (grep -i) you might want to use it
in some cases.  The /dev/null is to cause grep to report back the
filenames where the string is found, since grep will not report the
filename for a single argument.  Finally, the tee allows you to monitor
progress, as well as have a file to reference later.
--
Max Heffler			internet: max@lgc.com
Landmark Graphics Corp.		uucp: ..!uunet!lgc!max
333 Cypress Run, Suite 100	phone: (713) 579-4751
Houston, Texas  77094
thomas@uppsala.telesoft.se (Thomas Tornblom) (10/23/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
I have a main directory with hundreds of subdirectories,
and I want to find a file with a particular string, say "xyz"
The grep command only works in one directory at a time. Is there
a way of searching my whole directory structure to find a file
with a particular string?
Thanks.
find <dir> -type f -name '*xyz*' -print
Thomas
--
Real life: Thomas Tornblom Email: thomas@uppsala.telesoft.se
Snail mail: Telesoft Uppsala AB Phone: +46 18 189406
Box 1218 Fax: +46 18 132039
S - 751 42 Uppsala, Sweden
lemieux@ireq.hydro.qc.ca (Lemieux) (10/23/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>I have a main directory with hundreds of subdirectories,
>and I want to find a file with a particular string, say "xyz"
>The grep command only works in one directory at a time.  Is there
>a way of searching my whole directory structure to find a file
>with a particular string?
>
>Thanks.
>
>--
>Len Tiefel    tiefel@kodak.com

Try this:

	find . -name '*xyz*' -print

- Eric
-----------------------------------------------------------------------------
Eric LEMIEUX                          | Internet: lemieux@ireq.hydro.qc.ca
Institut de Recherche d'Hydro-Quebec  |
1800 Montee Sainte-Julie              | TEL: (514) 652-8139
Varennes, Quebec, Canada              | FAX: (514) 652-8309
-----------------------------------------------------------------------------
bad@atrain.sw.stratus.com (Bruce Dumes) (10/24/90)
In article <THOMAS.90Oct23163234@uplog.uppsala.telesoft.se> thomas@uppsala.telesoft.se (Thomas Tornblom) writes:
>In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>
> I have a main directory with hundreds of subdirectories,
> and I want to find a file with a particular string, say "xyz"
> The grep command only works in one directory at a time.  Is there
> a way of searching my whole directory structure to find a file
> with a particular string?
>
>find <dir> -type f -name '*xyz*' -print
>
>Thomas

I don't think this is what Lenny was asking for.  I think "xyz" is a
string *IN* the file, not in the file name.  I have a little script
called "locate_string".

-------------------- cut ---------------------------------------------
#
# Written by Bruce Dumes Sept 1990
#
if test ! "$2"
then
	echo "Usage: locate_string search_dir string"
	exit
fi
temp=`find $1 -name "*" -type f -print`
grep $2 $temp
-------------------- cut ---------------------------------------------

Bruce
--
Bruce Dumes              | "You don't see many of *these* nowadays, |
bad@zen.cac.stratus.com  |  do you?"                                |
gwyn@smoke.BRL.MIL (Doug Gwyn) (10/24/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>I have a main directory with hundreds of subdirectories,
>and I want to find a file with a particular string, say "xyz"
>The grep command only works in one directory at a time.  Is there
>a way of searching my whole directory structure to find a file
>with a particular string?

"grep" doesn't grok directories at all.  However, UNIX is a toolkit
environment.  Whenever you want to execute a command on all files, or
some readily selected subset of files, within a directory hierarchy,
you should think of "find" and "xargs".  (If you don't have "xargs",
complain to your vendor.)

	find root_name -type f -print | xargs grep pattern /dev/null

The extra argument to "grep" is to ensure that the filename will be
printed for each matching line.  (Workaround for a wart in "grep"'s
design.)
mperlman@Encore.COM (Mark Perlman) (10/24/90)
>>In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>>
>> I have a main directory with hundreds of subdirectories,
>> and I want to find a file with a particular string, say "xyz"
>> The grep command only works in one directory at a time.  Is there
>> a way of searching my whole directory structure to find a file
>> with a particular string?
>>
>> Thanks.

Here's a little script I use to search for strings.

======================================================
#!/bin/csh -f
foreach i ( `find . -type d -print` )
	(cd $i; echo "<`pwd`>"; grep $1 *[hc])
end
======================================================

I created it specifically because I knew I had subdirs and I wasn't
sure where the string was that I was looking for.  This script assumed
I was looking at source code, hence "grep $1 *[hc]".  You might also
want to change the arglist ( $1 ) to ( $* ) to capture any flags you
may wish to invoke with grep.
--
Mark R. Perlman         Independent Consultant     301-206-2016
14014 Oakpointe Dr.     mperlman@encore.com
Laurel, MD 20707        uunet!gould!mperlman
davidsen@sixhub.UUCP (Wm E. Davidsen Jr) (10/24/90)
Lots of people have given you ideas on this, I'll just add that if you
have a large number of non-text files in this structure, you can save
time by using the "file" command to identify the text files.
Something like:
$ find . -type f -print | xargs file | grep " text" |
> sed 's/:.*$//' | xargs grep PATTERN /dev/null
While this looks like a lot of stuff, it will be vastly faster than
searching all the non-text files and getting possibly bizarre matches on
binary code or other strings. I invented this when desperate enough to
look for a string in /usr.
Note: I cross posted this item *only* to the shell group, followups by
default to the original group.
--
bill davidsen - davidsen@sixhub.uucp (uunet!crdgw1!sixhub!davidsen)
sysop *IX BBS and Public Access UNIX
moderator of comp.binaries.ibm.pc and 80386 mailing list
"Stupidity, like virtue, is its own reward" -me
dean@truevision.com (Dean Riddlebarger) (10/24/90)
In article <1990Oct23.143247.5639@lgc.com> max@lgc.com (Max Heffler) writes:
>Try this:
>
>	cd x	(where x is the head of the tree to search)
>	find . -type f -exec grep xyz {} /dev/null \; | tee $HOME/xyz.out

I fire off a global find every night that dumps my login tree to a file.
[Nothing sacred about location....I throw mine into $HOME/lib].  I also
have a trivial script called 'findit' in my personal bin [$HOME/bin].
This script is nothing more than a grep into the file that the nightly
find has created; its only advantage over grep is that I find it
mnemonically "pretty".  So, assuming I'm not looking for files that have
undergone major name changes since the previous evening, I just enter
'findit filename' and get my pattern match very quickly.  I like this
because a find on a large tree in interactive mode can take quite a
loooonnnngg time.......:-)
--
<:> Dean Riddlebarger                      "The bus came by    <:>
<:> MIS Manager - Truevision, Inc.          and I got on,      <:>
<:> [317] 841-0332                          That's when it     <:>
<:> uucp: uunet!epicb!dean  dean@truevision.com  all began."   <:>
herrage@ntpal.UUCP (Robert Herrage) (10/25/90)
In article <1990Oct23.123025.18012@kodak.kodak.com>, tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
> I have a main directory with hundreds of subdirectories,
> and I want to find a file with a particular string, say "xyz"
> The grep command only works in one directory at a time.  Is there
> a way of searching my whole directory structure to find a file
> with a particular string?

Here's a nice implementation, appropriately named "rgrep" (Recursive GREP):

#!/bin/sh
{ find . \( -name '*.*' \) -exec grep -l $* {} \; -exec grep -n $* {} \; \
	-exec echo \; | more; }

By getting into the top-most directory and typing

	rgrep xyz

you would get something like this:

	./subdir1/file1
	136: this line has xyz in it
	210: this line also has xyz in it

	./subdir2/subsubdir4/file2
	12: this line has xyz in it

I believe the "grep -l" causes the "./subdir1/file1" to be printed and
the "grep -n" causes the line numbers to be printed.  The "echo", of
course, gives you a blank line separation in case the string exists in
more than one file.  If you want to limit your searches to specific
file extensions, you could replace the "\( -name '*.*' \)" with
something like

	\( -name '*.[chCH]' -o -name '*.ec' -o -name '*.txt' \)

which means only files with a ".c", ".h", ".C", ".H", ".ec", or ".txt"
extension will be searched.  Enjoy!

Robert	(Thanks Dana Cavasso, author!)
perl@step.UUCP (Robert Perlberg) (10/25/90)
In article <1990Oct23.123025.18012@kodak.kodak.com>, tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
> I have a main directory with hundreds of subdirectories,
> and I want to find a file with a particular string, say "xyz"
> The grep command only works in one directory at a time.  Is there
> a way of searching my whole directory structure to find a file
> with a particular string?

	find . ! -type d -print | xargs grep xyz /dev/null

Specifying /dev/null as the first file argument to grep deals with the
case where xargs passes only one argument to grep.  That would cause
grep not to display the name of the file if it found the search string
in the file.  Passing /dev/null to each invocation of grep guarantees
that grep will have at least two arguments, which causes it to always
display file names.

Robert Perlberg
Dean Witter Reynolds Inc., New York
{murphy | philabs | chuo}!step!perl
--
"I am not a language ... I am a free man!"
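[The /dev/null behaviour described above is easy to demonstrate.  A minimal sketch, assuming a POSIX grep, which prefixes output with "filename:" only when given more than one file operand:]

```shell
# With one file operand grep omits the filename; adding /dev/null as a
# second operand forces the "a:" prefix onto every matching line.
dir=$(mktemp -d)
printf 'xyz here\n' > "$dir/a"
cd "$dir"
grep xyz a            # one operand: no filename prefix
grep xyz a /dev/null  # two operands: output prefixed with "a:"
```

This is exactly why every find/xargs recipe in this thread tacks /dev/null onto the grep argument list.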
pa1@tdatirv.UUCP (Pat Alvarado) (10/25/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>I have a main directory with hundreds of subdirectories,
>and I want to find a file with a particular string, say "xyz"
>The grep command only works in one directory at a time.  Is there
>a way of searching my whole directory structure to find a file
>with a particular string?

Try:

	find . -type f -exec grep -l string {} \;

This tells find to start from the current directory (.) and, for each
plain file only (-type f), execute a grep with the -l option for your
string.  The grep -l option will display only the filename if the
string was found.
--
Pat Alvarado                  | Teradata Corporation
tdat!pa1@suntzu.sun.com       | 100 N. Sepulveda Blvd.
uunet!edsews!hacgate!tdat!pa1 | El Segundo, Calif. 90245
pa1@tdat.teradata.com
robert@satori.sybase.com (Robert Garvey) (10/26/90)
In article <1990Oct23.143247.5639@lgc.com> max@lgc.com (Max Heffler) writes:
>Try this:
>
>	cd x	(where x is the head of the tree to search)
>	find . -type f -exec grep xyz {} /dev/null \; | tee $HOME/xyz.out
>
>The /dev/null is to cause grep to report back the filenames where the string
>is found, since grep will not report the filename for a single argument.

I'd recommend adding the -l option to the grep command.  Lines that
contain the pattern are not output, only the name of the file(s) that
have at least one instance.  This option works as you'd like if grep is
searching a single file; its name is output.

Robert Garvey                                Sybase, Inc
robert@sybase.com                            6475 Christie Ave
{sun,lll-tis,pyramid,pacbell}!sybase!robert  Emeryville, CA 94608-1010
a20@nikhefh.nikhef.nl (Marten Terpstra) (10/26/90)
In article <1990Oct23.123025.18012@kodak.kodak.com> tiefel@sunshine.Kodak.COM (Lenny Tiefel) writes:
>I have a main directory with hundreds of subdirectories,
>and I want to find a file with a particular string, say "xyz"
>The grep command only works in one directory at a time.  Is there
>a way of searching my whole directory structure to find a file
>with a particular string?

A few months or so ago some people wanted to do the same thing.  After
a long debate about the best solution, this is the one that should work
best:

--start of script--
#! /bin/sh
# Recgrep : a version of grep which recursively searches every directory
# within the current directory.
# Usage : recgrep <string>
find . -type f -exec grep $1 {} /dev/null \;
--end of script--

Marten
--
" Quidquid Latine Dictum Sit, Altum Videtur. "
( Whatever is said in Latin, sounds profound )
-------------------------------------------------------------------------------
Marten Terpstra                         National Institute for Nuclear
Internet : terpstra@nikhef.nl           and High Energy Physics (NIKHEF-H)
Oldie-net: {...}mcsun!nikhefh!terpstra  PO Box 1882, 1009 DB
Phone-net: +31 20 592 5102              Amsterdam, The Netherlands
-------------------------------------------------------------------------------
jon@jonlab.UUCP (Jon H. LaBadie) (10/26/90)
In article <2160@sixhub.UUCP>, davidsen@sixhub.UUCP (Wm E. Davidsen Jr) writes:
> Lots of people have given you ideas on this,

Indeed, but I've seen no one mention the use of the '-l' option to
grep.  The original poster wanted to locate the "files" that contained
the pattern, not the "lines" containing the pattern.  By using the -l
option, grep will simply output a list of the file names.  An added
benefit is that grep will skip the rest of the current file once it
finds a match.

> ... I'll just add that if you
> have a large number of non-text files in this structure, you can save
> time by using the "file" command to identify the text files.
>
> Something like:
>
> $ find . -type f -print | xargs file | grep " text" |
> > sed 's/:.*$//' | xargs grep PATTERN /dev/null

Just a comment specific to AIX 3.1.  The common practice of looking
for text files using

	file * | grep " text"

does not work properly, as some output messages contain the word "text"
even though the file is a binary or data file.  For example, one
message I remember is "data or NLS text".
--
Jon LaBadie
{att, princeton, bcr, attmail!auxnj}!jonlab!jon
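[A quick sketch of the -l behaviour Jon recommends, assuming a POSIX grep:]

```shell
# -l prints only the names of files containing at least one match
# (each name once), not the matching lines, and stops reading a file
# at its first match.
dir=$(mktemp -d)
printf 'needle\nneedle again\n' > "$dir/hit"
printf 'nothing\n' > "$dir/miss"
grep -l needle "$dir/hit" "$dir/miss"
```

Only the name of the matching file appears, even though it contains two matching lines.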
jay@gdx.UUCP (Jay A. Snyder) (10/30/90)
try:

	find ./ -name "*" -exec grep "xyz" '{}' \;

the find command will execute any command in this manner.

J
--
==============================================================================
Jay A. Snyder			"Let Me Up!  I've had enough"
wa3wbu!gdx!jay@uunet.uu.net	uunet!wa3wbu!gdx!jay
pinard@IRO.UMontreal.CA (Francois Pinard) (10/31/90)
In article <64@gdx.UUCP> jay@gdx.UUCP (Jay A. Snyder) writes:
try:
find ./ -name "*" -exec grep "xyz" '{}' \;
the find command will execute any command in this manner.
I do not want to restart an overlong debate about xargs security :-).
If you have many files, use instead:
find ./ -print | xargs grep xyz
which is slightly faster.  xargs may quite often be used in
combination with `find -print' in this manner. The -print is not
required on the newer versions of find.
--
Franc,ois Pinard ``Vivement GNU!'' pinard@iro.umontreal.ca
(514) 588-4656 cp 886 L'Epiphanie (Qc) J0K 1J0 ...!uunet!iros1!pinard
evans@decvax.dec.com (Marc Evans) (10/31/90)
In article <64@gdx.UUCP>, jay@gdx.UUCP (Jay A. Snyder) writes:
|> try:
|>	find ./ -name "*" -exec grep "xyz" '{}' \;
|>
|> the find command will execute any command in this manner.

It would be more efficient to use:

	find . -exec grep "xyz" '{}' \;

or if you have the xargs command:

	find . -print | xargs grep xyz

- Marc
--
===========================================================================
Marc Evans - WB1GRH - evans@decvax.DEC.COM | Synergytics    (603)635-8876
Unix and X Software Contractor             | 21 Hinds Ln, Pelham, NH 03076
===========================================================================
ron@attcan.UUCP (Ron Joma) (11/01/90)
In article <658@llnl.LLNL.GOV>, rjshaw@ramius.llnl.gov (Robert Shaw) writes:
> Question from Len Teifel:
> |
> | I have a main directory with hundreds of subdirectories,
> | and I want to find a file with a particular string, say "xyz"
> | The grep command only works in one directory at a time.  Is there
> | a way of searching my whole directory structure to find a file
> | with a particular string?
> |
> try:	find ~ -name \*xyz\* -print
>
> Or, not quite as nice:	ls -R ~ | grep xyz

I think that Len is looking for a pattern inside of the files, not in
the file names.  To find a keyword in a file, use the following pipe:

	find path -print | xargs grep "pattern"

The find creates the path names, and xargs converts these to arguments,
thus forcing grep to look inside each file being passed along.

******************************************************************************
* Ronald Joma       * I speak to the masses through the media,               *
* AT&T - Montreal   * And if you have anything to say to me                  *
* attcan!cmtl01!ron * You can say it with CASH!                              *
******************************************************************************
rbottin@atl.calstate.edu (11/03/90)
tiefel@sunshine.Kodak.com asked for a way to search a structure for
files that have a string.  Here are some probable solutions:

	find some_directory -type f -exec grep string '{}' \; -print

This is hard work and puts file names after the lines that match.

	for d in *; do grep string $d/*; done

is Bourne/Korn shell dependent and may be faster if ONLY one level
needs to be searched (and has no directories).

	ls -Rf | while read f; do grep string $f && echo $f; done

is faster than the 'find'....and there is a similar 'awk'ish solution

	ls ... | awk '{system("grep string " $0)}'

which fails to indicate which file the line occurs in, but is kind of
neat otherwise.

The ultimate helper would be a little script called "hunter" perhaps:

	: Usage: hunter string directory
	if [ $# -ne 2 ]; then echo Usage: hunter string directory; exit 1; fi
	for d in $2/*
	do
		if [ -d $d ]
		then hunter $1 $d
		elif grep $1 $d >/dev/null
		then echo $d; grep $1 $d	# inefficient but avoids tmpfiles
		fi
	done

: disclaimer - this is a quick hack and needs testing.

Dick Botting
CalState San Bernardino
rbottin@atl.calstate.edu
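[Dick's recursive "hunter" idea does work.  A sketch that exercises a lightly quoted version of it on a throwaway tree, assuming a POSIX sh and that the script is on PATH so it can call itself:]

```shell
# Save the recursive script, make it executable, and run it.
bin=$(mktemp -d)
cat > "$bin/hunter" <<'EOF'
#!/bin/sh
: Usage: hunter string directory
if [ $# -ne 2 ]; then echo "Usage: hunter string directory"; exit 1; fi
for d in "$2"/*
do
	if [ -d "$d" ]
	then hunter "$1" "$d"           # recurse into subdirectories
	elif grep "$1" "$d" >/dev/null
	then echo "$d"; grep "$1" "$d"  # inefficient but avoids tmpfiles
	fi
done
EOF
chmod +x "$bin/hunter"
PATH="$bin:$PATH"
mkdir -p "$bin/tree/sub"
echo "magic word" > "$bin/tree/sub/f"
hunter magic "$bin/tree"
```

The quoting of "$d" makes it survive filenames with spaces, which the original would not.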
rbj@uunet.UU.NET (Root Boy Jim Cottrell) (11/06/90)
In article <14225@smoke.BRL.MIL> gwyn@smoke.BRL.MIL (Doug Gwyn) writes:
>	find root_name -type f -print | xargs grep pattern /dev/null
>
>The extra argument to "grep" is to ensure that the filename will be
>printed for each matching line.  (Workaround for a wart in "grep"'s
>design.)

As usual, Doug has provided the correct answer, even tho he was by far
not the first poster.  I feel compelled only to add one small addendum:
use the -l option of grep to print only the matching filenames.  At the
first match, we can quit this file.  Note that this doesn't save much,
and it is not clear from the question whether the filename only is
wanted, or the line itself.  We only save looking thru the files we
found it in.

>(If you don't have "xargs", complain to your vendor.)

There is a public domain version in volume 3 of comp.sources.unix.
Others probably exist.  But Doug is right, your vendor has no excuse
not to have it.
--
Root Boy Jim Cottrell <rbj@uunet.uu.net>
lwall@jpl-devvax.JPL.NASA.GOV (Larry Wall) (11/07/90)
In article <110982@uunet.UU.NET> rbj@uunet.UU.NET (Root Boy Jim Cottrell) writes:
: In article <14225@smoke.BRL.MIL> gwyn@smoke.BRL.MIL (Doug Gwyn) writes:
: >(If you don't have "xargs", complain to your vendor.)
:
: There is a public domain version in volume 3 of comp.sources.unix.
: Others probably exist.
:
: But Doug is right, your vendor has no excuse not to have it.

...apart from the fact that it's insecure, and is a total kludge to
work around the inconsistencies of the Unix tool I/O interface.  Other
than that, it's great!  :-)  I might even use it someday.

Larry Wall
lwall@jpl-devvax.jpl.nasa.gov
greywolf@unisoft.UUCP (The Grey Wolf) (11/07/90)
In article <24925@adm.BRL.MIL> rbottin@atl.calstate.edu writes:
>tiefel@sunshine.Kodak.com asked for a way to search a structure
>for files that have a string.  Here are some probable solutions:
>
>The ultimate helper would be a little script called "hunter" perhaps
>	: Usage: hunter string directory
>	if [ $# -ne 2 ]; then echo Usage: hunter string directory; exit 1; fi
>	for d in $2/*
>	do
>		if [ -d $d ]
>		then hunter $1 $d
>		elif grep $1 $d >/dev/null
>		then echo $d; grep $1 $d	# inefficient but avoids tmpfiles
>		fi
>	done
>	: disclaimer - this is a quick hack and needs testing.

Disclaimer noted.  Everyone seems to think that some sort of echo
statement is needed for the filenames.  Use

	grep $string $file /dev/null

This way, the file containing the matching string is printed before
the match.  A portable way of doing this, though VERY inefficient, is:

	find . -type f -exec grep $string '{}' /dev/null \;

Shell solutions are trivial and left as an exercise to the
imagination.  :-)
--
"This is *not* going to work!"
"Well, why didn't you say so before?"
"I *did* say so before!"
...!{ucbvax,acad,uunet,amdahl,pyramid}!unisoft!greywolf
mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) (04/15/91)
Does anyone know if there is an easy way to recursively search for a
pattern down the entire file tree of a directory?  I have tried:

	grep -R pattern *
	grep -r pattern *
	grep -R pattern dirname
	grep -r pattern dirname

I don't know what to do.  If there were a way to recursively cat files
or something similar, I could pipe that to grep, but I can't find that
either.  ANY help would be greatly appreciated.

(P.S. Is this the proper newsgroup to post this question to?)
(     If not, please let me know which one is.               )

Michael Moore
University of Utah
<> He spoke seldom, ate little, slept less <>
<>	-Ursula K. LeGuin                  <>
<>	 *Wizard of Earthsea*              <>
akbloom@aplcen.apl.jhu.edu (Keith Bloom) (04/15/91)
mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
> Does anyone know if there is an easy way to recursively search for a
>pattern down the entire file tree of a directory?

If your system has xargs, you could try:

	find . -name '*' -print | xargs grep pattern

If you have a huge directory tree with thousands of files in it, this
may not work.  If you don't have xargs, there's:

	find . -name '*' -print -exec grep pattern {} \;

but this is more cumbersome, because it will print the names of all
your files, whether they contain the pattern or not.  (I assume you
want to know the name of the file that 'pattern' is in.)

In general, 'find' is usually your best bet for recursive operations
like the one you have in mind.

(PS: both methods work as stated under Ultrix.  They ought to work the
same on any reasonable Unix system, but there's no guarantee.)
a20@nikhefh.nikhef.nl (Marten Terpstra) (04/15/91)
mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
> Does anyone know if there is an easy way to recursively search for a
>pattern down the entire file tree of a directory?

A recursive version of grep has been discussed several times on the
net.  One of the solutions, and quite workable for me, is the following
shell script:

-- Start of script --
#! /bin/sh
# Recgrep : a version of grep which recursively searches every directory and
# file starting in the current directory.
#
# Usage: recgrep pattern
if (test $# -lt 1)
then
	echo "Usage: recgrep pattern"
else
	find . -type f -exec grep $1 {} /dev/null \;
fi
-- End of script --

This will start in your current directory and recursively walk down all
other dirs looking for your pattern.  I know there are many other
variations that may work even better, but this one works just fine for
me.

-Marten
--
Marten Terpstra                          National Institute for Nuclear
Internet : terpstra@nikhef.nl            and High Energy Physics (NIKHEF-H)
Oldie-net: {....}mcsun!nikhefh!terpstra  PO Box 41882, 1009 DB
Phone    : +31 20 592 5102               Amsterdam, The Netherlands
dbm@deltahp.nasa.gov (Brad Mears) (04/15/91)
In article <1991Apr14.214414.9815@hellgate.utah.edu>, mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
|> Does anyone know if there is an easy way to recursively search for a
|> pattern down the entire file tree of a directory?

Look at the -exec option of find.  This executes the specified command
for each file found.

--
Brad Mears				dbm@deltahp.jsc.nasa.gov
----------------------------------------------------------------------------
Opinions are expressly forbidden.   | Definition of impiety:
I speak for myself and no other.    | "Noun, your irreverence toward my
                                    |  deity."  - Ambrose Bierce
----------------------------------------------------------------------------
jik@athena.mit.edu (Jonathan I. Kamens) (04/16/91)
In article <1991Apr14.214414.9815@hellgate.utah.edu>, mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
|> I don't know what to do.  If there were a way to recursively cat
|> files or something similar, I could pipe that to grep, but I can't
|> find that either.

	find dirname -type f -exec grep pattern {} \;

	find dirname -type f -print | xargs grep pattern

	find dirname -type f -exec cat {} \; | grep pattern

If there aren't a lot of files in the directory tree:

	grep pattern `find dirname -type f -print`

The first solution will start up a grep process once for each file you
want to search, so if there are a lot of files it will be slow.
Therefore, if your system has xargs, and if there are no funky
characters (such as newlines) in the filenames, the second solution is
probably better.  If you don't have xargs, there are several versions
of it available on the net.

The third solution probably has no advantages over the previous ones,
so you probably shouldn't use it; I mention it just to answer your
question about recursive cat'ing.

The fourth solution is the best if there are few enough files in the
tree that they will all fit in the maximum length of a command line.

Quoting from question 3 of the "Frequently Asked Questions about Unix -
with Answers [Monthly posting]" posting in this newsgroup, "`find' is a
powerful program.  Learn about it."  It's a good idea to read that
posting if you haven't already.  If it has expired at your site, you
can get a copy of it using the instructions at the end of this message.

Oh, of course, you could do this in perl too, and perl would do the
recursive file search and the grep all in one process.
--
Jonathan Kamens			              USnail:
MIT Project Athena				11 Ashford Terrace
jik@Athena.MIT.EDU				Allston, MA  02134
Office: 617-253-8085			      Home: 617-782-0710
--
Subject: Frequently Asked Questions about Unix - with Answers [Monthly posting]
Newsgroups: comp.unix.questions

Available via anonymous ftp from pit-manager.mit.edu (18.72.1.58) in
the file

	/pub/usenet/comp.unix.questions/Frequently_Asked_Questions_about_Unix_-_with_Answers_[Monthly_posting]

Available from mail-server@pit-manager.mit.edu by sending a message
containing

	send usenet/comp.unix.questions/Frequently_Asked_Questions_about_Unix_-_with_Answers_[Monthly_posting]

Send a message containing "help" to get general information about the
mail server.
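The "maximum length of a command line" that limits the fourth solution is a system constant; a quick way to inspect it (assuming a system that provides the POSIX getconf utility) is:

```shell
# ARG_MAX is the upper bound, in bytes, on the combined size of the
# argument list and environment passed to exec(); the backquote
# solution fails once the expanded file list exceeds it.
getconf ARG_MAX
```

xargs sidesteps this limit by splitting its input into several invocations of the target command.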
jik@athena.mit.edu (Jonathan I. Kamens) (04/16/91)
In article <1991Apr15.042100.11727@aplcen.apl.jhu.edu>, akbloom@aplcen.apl.jhu.edu (Keith Bloom) writes:
|> find . -name '*' -print | xargs grep pattern

1) The "-name '*'" is unnecessary.  If you're checking for any name,
   you don't need to check the name at all.

2) This will try to search directories and special files as well,
   something that he probably doesn't want to do.  Replace "-name '*'"
   with "-type f".

|> If you have a huge directory tree with thousands of files in it, this
|> may not work.

Why not?  The whole purpose of xargs is to take lots of arguments on
standard input and run the desired program on as many as possible in
each pass, running the program multiple times if necessary to get all
of the arguments processed.

|> find . -name '*' -print -exec grep pattern {} \;

Once again, replace "-name '*'" with "-type f".

--
Jonathan Kamens			              USnail:
MIT Project Athena				11 Ashford Terrace
jik@Athena.MIT.EDU				Allston, MA  02134
Office: 617-253-8085			      Home: 617-782-0710
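The batching behavior described above is easy to see directly; in this sketch, the standard -n option artificially caps each invocation at two arguments so the splitting is visible even with tiny input:

```shell
# Five inputs, at most two per invocation: xargs runs echo three times,
# producing three output lines.
printf '%s\n' a b c d e | xargs -n 2 echo
# prints:
#   a b
#   c d
#   e
```

Without -n, xargs packs as many arguments as fit under the system's command-line limit before starting another pass.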
sundrag@risky.Convergent.COM (Sundaraswaran Gopalakrishnan) (04/16/91)
In article <1991Apr14.214414.9815@hellgate.utah.edu>, mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
> Does anyone know if there is an easy way to recursively search for a
> pattern down the entire file tree of a directory?
> I have tried :	grep -R pattern *
>		grep -r pattern *
>		grep -R pattern dirname
>		grep -r pattern dirname
>
> I don't know what to do.  If there were a way to recursively cat
> files or something similar, I could pipe that to grep, but I can't
> find that either.
>
> ANY help would be greatly appreciated.
>
> (P.S. Is this the proper newsgroup to post this question to?)
> (     If not, please let me know which one is.               )
>
> Michael Moore
> University of Utah
>
> <><><><><><><>><><><><><><><><><><><><><><>
> <>He spoke seldom, ate little, slept less<>
> <>       -Ursula K. LeGuin               <>
> <>       *Wizard of Earthsea*            <>
> <><><><<><><><><><><><><><><><><><><><><><>

You can do the following:

	find / -print | xargs grep <pattern>

or

	find / -name "*.c" -print | xargs grep <pattern>

xargs constructs the argument list from its input and will apply grep
to each argument.  (This is different from just "find / -print | grep
<pattern>", which will look for a *file name* that matches <pattern>.)

Sundar, Unisys
subbarao@phoenix.Princeton.EDU (Kartik Subbarao) (04/16/91)
In article <1991Apr14.214414.9815@hellgate.utah.edu> rnelson%hell.utah.edu@cs.utah.edu writes:
> Does anyone know if there is an easy way to recursively search for a
> pattern down the entire file tree of a directory?
> I have tried :	grep -R pattern *

If you know the depth you want to search, you can say:

	grep pattern */*

(i.e. this matches all files in all immediate subdirectories).  You
can go as deep as you want, e.g.:

	grep pattern */*/*

Another way is to use find:

	find . -exec grep pattern "{}" \;

or you could use xargs:

	find . -print | xargs grep pattern

-Kartik
--
internet# rm `df | tail +2 | awk '{ printf "%s/quotas\n",$6}'`

subbarao@phoenix.Princeton.EDU -| Internet
kartik@silvertone.Princeton.EDU (NeXT mail)
SUBBARAO@PUCC.BITNET - Bitnet
jcd@spock.att.com (Jack Dixon) (04/16/91)
From article <1207@nikhefh.nikhef.nl>, by a20@nikhefh.nikhef.nl (Marten Terpstra):
> mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
>> Does anyone know if there is an easy way to recursively search for a
>> pattern down the entire file tree of a directory?

This is what I use:

	find dir -type f -print | xargs grep pattern

--
-- Jack Dixon, AT&T Network Systems
   { ...!att!vogon!jcd, jcd@vogon.att.com }
marcus@illusion.uucp (Marcus Hall) (04/17/91)
In article <4037@risky.Convergent.COM> sundrag@risky.Convergent.COM (Sundaraswaran Gopalakrishnan) writes:
>In article <1991Apr14.214414.9815@hellgate.utah.edu>, mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
>> Does anyone know if there is an easy way to recursively search for a
>> pattern down the entire file tree of a directory?
>
>You can do the following :
>find / -print | xargs grep <pattern>
>or,
>find / -name "*.c" -print | xargs grep <pattern>
>
>Sundar, Unisys

Unfortunately, grep causes a slight problem with this.  If grep has
two or more file arguments, it prints the file name and a colon before
each matched line.  If it has a single file argument, it does not.

Now, xargs collects some number of lines from stdin and gives them to
grep as file arguments (or so grep interprets them).  After
potentially kicking off several greps, if there is only one line left
on xargs's stdin, it will kick off a new grep with a single argument.
If this grep matches any lines, it will not output the file name, just
the matched lines.  Thus, it will be unclear just where these lines
came from!

There is no option to force grep to output the file name, but one
trick that can be used is to always give grep an extra argument.  If
this is "/dev/null", then grep cannot possibly match anything in it,
so it is effectively a no-op, but it does force grep to output file
names.  Thus, the commands from above should be written as:

	find <dir>... -type f -print | xargs grep <pattern> /dev/null
- or -
	find <dir>... -type f -name "*.c" -print | xargs grep <pattern> /dev/null

(The "-type f" is there because you probably don't want to grep
through directories, and you especially do not want to grep through
/dev files!!)

marcus hall
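A minimal sketch of the /dev/null trick (the file name /tmp/grepdemo.txt is made up for the demonstration):

```shell
# With a single file argument grep omits the file name; adding a
# second argument -- the never-matching /dev/null -- forces grep into
# its multi-file mode, so the file name is printed.
printf 'needle\n' > /tmp/grepdemo.txt
grep needle /tmp/grepdemo.txt            # prints: needle
grep needle /tmp/grepdemo.txt /dev/null  # prints: /tmp/grepdemo.txt:needle
rm /tmp/grepdemo.txt
```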
gwyn@smoke.brl.mil (Doug Gwyn) (04/17/91)
In article <8304@idunno.Princeton.EDU> subbarao@phoenix.Princeton.EDU (Kartik Subbarao) writes:
>grep pattern */* (i.e matches all files of all subdirectories)

Often this method will exceed the number of characters allowed for
arguments.

>find . -exec grep pattern "{}" \;
>find . -print | xargs grep pattern

These will fail to print the file name (in the xargs case, it fails
only occasionally).  Basically the problem is the nonuniform design of
grep's semantics.  To get reasonable behavior, try

	find . -print | xargs grep pattern /dev/null
alex@am.sublink.org (Alex Martelli) (04/17/91)
akbloom@aplcen.apl.jhu.edu (Keith Bloom) writes:
:mmoore%hellgate.utah.edu@cs.utah.edu (Michael Moore) writes:
:> Does anyone know if there is an easy way to recursively search for a
:> pattern down the entire file tree of a directory?
:If your system has xargs, you could try:
:find . -name '*' -print | xargs grep pattern
:If you have a huge directory tree with thousands of files in it, this
:may not work.

Why not, pray?  xargs is supposed to chop its stdin into pieces that
are short enough to be passed as arguments to the target command, here
'grep pattern'.

A slight improvement: use "grep pattern /dev/null" as the target
command.  By making grep look into more than one file, this makes it
print the name of the file where the pattern is found (in the
original, if grep happened to be called with just one file, for
example at the very end of the search process, it might find lines and
print them out without identifying where they came from).

A second improvement: omit the -name '*'; all it's doing is keeping
grep from looking into files whose names start with a dot - and why
wouldn't you want to grep inside .netrc, for example?

A third improvement: add a -type f flag to avoid grepping into
directories by mistake, and particularly to avoid grepping into
device files - grepping into /dev/tty, for example, can hang the
procedure until EOF is forced on the terminal...

There are many other things one might wish to do (for example, only
grep into files which are readable by you), but find does not support
them easily.  Unfortunately, some greps will just fail if ONE of their
target files is unreadable - and not even bother looking into the
other ones!  The best fix for this specific problem is probably to
also attack another desideratum: NOT grepping into non-text files.
The "file" command, on many systems, will emit a description
containing the keyword "text" for a text file (in variations such as
"English text", "ascii text", etc.), but not for non-text files (it
will say "data", or describe the type of executable, etc.), and for
non-readable files it will say something like "cannot open for
reading".  (If you're unlucky enough that your "file" command says,
for example, "sh commands" instead of "sh command text" for a shell
script, you will have to get a little more fancy in the following, but
the basic idea still applies.)

So, we want to xargs the files emitted by find, first into file, then
remove all non-text ones, and finally grep on the remainder only.  We
can both select for "text" and remove the descriptions in one gulp
with, for example, sed:

	find . -type f -print | xargs file |
		sed -n '/:.*text/s/:.*//p' | xargs grep pattern /dev/null

This is still NOT perfect.  Filenames containing newlines will
typically give problems with any "find ... -print | xargs" (one should
use "find ... -print0" and a matching "xargs -0", if lucky enough to
have them, for example the GNU versions of find and xargs), and here
the further trip through file and sed will further mess things up if
the filename contains a colon (and is a text file, or has the string
"text" in the filename after the colon).  One COULD get fancier, with
a sed expression to exclude lines with two or more colon characters,
but it's getting a bit late at night for me to figure out how to
handle a filename such as "joke: ascii text\nfooled you!" even with
the -print0 and -0...  There is a point of diminishing returns where
perl gets simpler than this sort of thing... :-)

:If you don't have xargs, there's:
:
:find . -name '*' -print -exec grep pattern {} \;
:
:but this is more cumbersome, because it will print the names of all
:your files, whether they contain the pattern or not.  (I assume you
:want to know the name of the file that 'pattern' is in.)
You can omit the -print and just add /dev/null as an argument to grep
right after the pattern, as I suggested above.  It's still "more
cumbersome" in the sense of loading your CPU, since a fork and exec is
done for each file rather than processing them en masse via xargs...
Still, my suggestions about removing the "-name '*'" and inserting a
"-type f" also apply here.
--
Alex Martelli - (home snailmail:) v. Barontini 27, 40138 Bologna, ITALIA
Email: (work:) martelli@cadlab.sublink.org, (home:) alex@am.sublink.org
Phone: (work:) ++39 (51) 371099, (home:) ++39 (51) 250434;
Fax: ++39 (51) 366964 (work only), Fidonet: 332/401.3 (home only).
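The sed stage of Martelli's pipeline can be tried on its own; the sample lines below are a fabricated imitation of typical "file" output, not from any real system:

```shell
# Lines whose description contains "text" survive with the description
# stripped off (s/:.*//); everything else is filtered out by -n plus
# the address /:.*text/.
printf 'a.sh: ascii text\nb.o: data\nc.txt: English text\n' |
	sed -n '/:.*text/s/:.*//p'
# prints:
#   a.sh
#   c.txt
```

This also makes the colon hazard he mentions concrete: a file name containing ": text" would slip through the filter with part of its name chopped off.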