processing a large number of files

Steve Holmes steve at holmesgrown.com
Tue Dec 17 07:45:45 EST 2002


I've also used the following construct in some recent scripts I've
written:
# Start of sample script
#!/bin/sh
# Convert every PDF named on the command line to text.
while [ "$1" != "" ]; do
      pdftotext "$1"
      shift
done
# End of sample script

The shift command "shifts" the command-line arguments over so that
$1 points to the next argument (the next file, in this case) on each
pass through the loop.  The loop stops when the arguments run out.
Note that the shell expands wildcards for you before the script ever
runs.
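
For instance, if the loop above is saved as pdf2text.sh (a name I'm
making up here for illustration), you could run it over every PDF in
the current directory like so:

# Start of sample usage
chmod +x pdf2text.sh
./pdf2text.sh *.pdf
# End of sample usage

The shell expands *.pdf into the full list of matching filenames
before the script starts, so the loop sees one file per argument.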

This gets trickier if you have to pass the command multiple arguments,
or if you have to split off file extensions, as with the lame encoder.
That calls for somewhat more elaborate scripting, but the loop above
is a good place to start; see the sketch below.
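
As a rough sketch of the extension-splitting case (assuming .wav
input files, and that lame is invoked as "lame infile outfile", which
matches the versions I've used):

# Start of sample script
#!/bin/sh
# Encode each .wav file named on the command line to .mp3.
while [ "$1" != "" ]; do
      base=`basename "$1" .wav`   # strip any directory and the .wav extension
      lame "$1" "$base.mp3"       # output lands in the current directory
      shift
done
# End of sample script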

While I'm here, is there an actual maximum length for command lines in
a shell like bash?  I know DOS had limits of 1024 or 2048 characters
or something like that.
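
If I remember right, on Linux and other POSIX systems you can ask the
system itself; getconf should report the limit:

# Start of sample command
getconf ARG_MAX    # maximum combined length, in bytes, of the arguments to exec()
# End of sample command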

On Tue, Dec 10, 2002 at 09:05:11PM -0600, Gregory Nowak wrote:
> Hi all.
> 
> I have a large number of pdfs that I need to process into text files.
> 
> I know that if I feed file1.pdf to pdftotext, it will write file1.txt. So I tried
> pdftotext *.pdf
> but instead of bash expanding the filenames and producing a bunch of .txt files, I got the program's help message.
> 
> Since my bash programming knowledge is next to nothing, is there a way to do what I want without a script?
> 
> Could I somehow get bash to expand the wildcard, and feed the individual files to pdftotext for it to spit out txt files?
> 
> Thanks for any help in advance.
> 
> Greg
> 

-- 
Please avoid sending me Word or PowerPoint attachments.
   See http://www.fsf.org/philosophy/no-word-attachments.html



