Switch from file contents to STDIN in piped command? (Linux Shell)

I have a program (that I did not write) which is not designed to read commands from a file. Entering commands on STDIN is pretty tedious, so I'd like to automate it by writing the commands in a file for re-use. Trouble is, if the program hits EOF, it loops infinitely trying to read the next command, dumping an endless torrent of menu options onto the screen.

What I'd like to be able to do is cat a file containing the commands into the program via a pipe, then use some sort of shell magic to have it switch from the file to STDIN when it hits the file's EOF.

Note: I've already considered using cat with the '-' for STDIN. Unfortunately (I didn't know this before), piped commands wait for the first program's output to terminate before starting the second program -- they do not run in parallel. If there's some way to get the programs to run in parallel with that kind of piping action, that would work!

Any thoughts? Thanks for any assistance!


I should note that my goal is not only to keep the program from hitting the end of the commands file. I would also like to be able to continue typing commands from the keyboard once the file hits EOF.

13.10.2009 15:38:37

I would do something like

(cat your_file_with_commands; cat) | sh your_script

That way, when the file with commands is done, the second cat will feed your script with whatever you type on stdin afterwards.
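A quick self-contained demonstration of the grouping (the file name is illustrative, and `echo third` stands in for what you would type interactively):

```shell
# The parentheses run both commands in a subshell; their combined
# output is concatenated onto a single pipe, so the consumer sees
# one continuous stream.
printf 'first\nsecond\n' > commands.txt

# `echo third` plays the role of keyboard input after the file ends.
(cat commands.txt; echo third) | cat -n

rm commands.txt
```

The consumer (`cat -n` here, your script in practice) cannot tell where the file ended and the "typed" input began.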

14.10.2009 07:22:37
You can do that? I had no idea you could group commands. That's my something new for the day. This keeps me from having to use a second window -- thanks greatly, Idelic!
zslayton 14.10.2009 15:16:26
For those who are interested, you can build onto the file containing commands as you go by using (cat your_file_with_commands; tee -a your_file_with_commands) | sh your_script, which will append each command to the file in addition to passing it to the script via the pipe.
zslayton 14.10.2009 19:17:19
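A sketch of that self-appending variant (file name illustrative; `echo second` stands in for interactive typing):

```shell
# tee -a copies every line it reads from stdin into the file *and*
# passes it down the pipe, so commands typed during a session are
# saved for the next run.
printf 'first\n' > commands.txt

(cat commands.txt; echo second | tee -a commands.txt) | cat -n

cat commands.txt   # now holds both commands for re-use
rm commands.txt
```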

Have you tried using something like tail -f commandfile | command? I think that should pipe the lines of the file to command without closing the file descriptor afterwards. Use -n to specify the number of lines to be piped if tail -f doesn't catch all of them.

13.10.2009 15:42:07
That's a good idea, but unfortunately it doesn't let me continue typing commands after the file has been completed. (I should have been clearer about that in my posting -- I'll edit it.) If you type 'tail -f commandfile -' to try to have it switch to STDIN after reading the file, it complains: "warning: following standard input indefinitely is ineffective". Thanks for the thought, though!
zslayton 13.10.2009 15:49:08
Have you tried appending lines to the file using echo something >> commandfile? tail -f should track those additions too.
Michiel Buddingh 13.10.2009 15:53:37
Ha! I was just trying that. I'll comment again if it works. (The program takes a while to get going.)
zslayton 13.10.2009 15:59:22
Success! I have one window running 'tail -n $NUM_LINES_IN_FILE -f $FILENAME | $PROGRAM_NAME' and another running 'cat >> $FILENAME' so I can continuously enter commands. Works like a champ! Good call, Michiel!
zslayton 13.10.2009 16:12:18
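That two-window setup can be simulated in a single script; tail -f keeps the pipe open, so lines appended later still arrive (all file names here are illustrative, and using tail -n +1 streams the whole file first without having to count its lines):

```shell
# A backgrounded tail -f plays the role of window 1; later appends
# play the role of typing `cat >> commands.txt` in window 2.
echo 'first' > commands.txt

tail -n +1 -f commands.txt > received.txt &   # consumer stand-in
TAIL_PID=$!

sleep 2                        # give tail time to start following
echo 'second' >> commands.txt  # "typed" after the file was consumed
sleep 2

kill "$TAIL_PID"
cat received.txt               # both lines made it through the pipe
rm commands.txt received.txt
```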

I would think expect would work for this.

13.10.2009 16:37:23

Same as Idelic's answer, with simpler syntax ;)

cat your_file_with_commands - | sh your_script
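To see the '-' behaviour in isolation (file name illustrative, `echo third` standing in for keyboard input): cat reads its operands left to right and switches to STDIN when it reaches '-'.

```shell
printf 'first\nsecond\n' > commands.txt

# cat emits the file's two lines, then whatever arrives on stdin.
echo third | cat commands.txt - | cat -n

rm commands.txt
```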
4.06.2015 00:29:53