João Eiras
2017-06-01 20:11:04 UTC
Hello!
Here's my problem. I have a script that calls an external program that
outputs its data in plain text. I'm piping that output into a process of
mine to compress it. Something like:
long_command -o >(gzip -c > file.gz)
I do not want to touch stdout or stderr, as those can be clobbered by
status messages; the correct way to receive the output is through the -o
parameter.
Immediately after the program finishes, I want to check that the produced
file is OK, regarding contents and whatnot. So, something like:
long_command -o >(gzip -c > file.gz)
if gzip -dc file.gz | grep -q bad_line; then
    echo some error
fi
That second line, with the call to "gzip -dc", often fails with an error:
gzip: abort: corrupted input -- invalid deflate data
This happens because the subprocess ">(gzip -c > file.gz)" has not had time
to finish and close the output file.
Question: how can I force bash to wait until all subprocesses have
finished? 'wait' does not work here, as these are not background tasks
managed by the job control.
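For comparison, an explicit FIFO feeding an ordinary background job is
visible to 'wait', so something along these lines avoids the race (a rough
sketch only, reusing the long_command and file.gz names from above;
gzip_pid is just an illustrative variable):

fifo=$(mktemp -u) && mkfifo "$fifo"
gzip -c < "$fifo" > file.gz &   # ordinary background job, so 'wait' can see it
gzip_pid=$!
long_command -o "$fifo"         # same -o interface as before
wait "$gzip_pid"                # file.gz is fully written and closed here
rm -f "$fifo"

But that gives up the compact process-substitution form, which is what I'd
like to keep.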
A simple testcase follows. "my_command_3 ended" should be printed before
"my_command_4 ran" is.
Thank you!
_________________________________________
#!/bin/bash

function my_command_1 {
    echo my_command_1 started >&2
    cat "$1" | sed 's/$/+2/'
    echo my_command_1 ended >&2
}

function my_command_2 {
    echo my_command_2 started >&2
    echo 1
    echo my_command_2 ended >&2
}

function my_command_3 {
    echo my_command_3 started >&2
    cat | sed 's/$/+3/'
    sleep 1
    echo my_command_3 ended >&2
}

function my_command_4 {
    echo my_command_4 ran >&2
}

my_command_1 <(my_command_2) > >(my_command_3)
wait # Doesn't work?
my_command_4
sleep 1.1