This question is a follow-up to this SO question.
Here is the MCVE version:
$ PS1='Parent-$ '
Parent-$ type seq
seq is /usr/bin/seq
Parent-$ bash
$ PS1='Child-$ '
Child-$ for i in $(seq 1000000000); do echo $i; done
bash: xrealloc: .././subst.c:5273: cannot allocate 18446744071562067968 bytes (4299235328 bytes allocated)
Parent-$ seq: write error: Broken pipe
(I have changed the PS1 of the parent and the child just to differentiate between them easily.)
Essentially, the child bash hits an out-of-memory error while processing the seq command with a very large count.
The issue is clearly that the command substitution $(seq 1000000000) is expanded first, and its entire output is then used as the word list for the for loop.
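As an aside, a pipeline-based rewrite streams seq's output line by line instead of expanding it into one giant word list, so bash never needs the whole sequence in memory (a sketch, shown with a small count for illustration):

```shell
# Reads seq's output one line at a time through a pipe, so bash never
# holds the full sequence in memory (unlike the $(...) expansion).
seq 5 | while read -r i; do
    echo "$i"
done
```

Note that in this form the loop body runs in a subshell, so variable assignments made inside it do not survive the loop.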
My question, however, is: why did it not hit the MAX_ARG_STRLEN limit? Or is it indeed hitting that limit? But if that were the case, the failure should not be an OOM inside bash... right?
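For context, MAX_ARG_STRLEN (on Linux, 32 pages, i.e. 128 KiB with 4 KiB pages) is a kernel limit enforced at execve() time, so it only applies when expanded words are handed to an external command; the for keyword and builtin echo in the MCVE never go through execve(). A quick sketch (sizes assume a stock Linux kernel):

```shell
# Build a single string just over 128 KiB.
big=$(printf 'x%.0s' $(seq 1 200000))   # 200,000 'x' characters

echo "$big" >/dev/null                  # builtin: no execve(), succeeds
/bin/echo "$big" >/dev/null             # external: execve() rejects the
                                        # oversized argument (E2BIG),
echo "external echo status: $?"         # so the status is non-zero
```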
One possible explanation: bash first evaluates $(...) and keeps the result in memory. Only after that evaluation completes does it build the word list for the for loop. But before the first step can complete, bash hits the OOM error.
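This can be reproduced at a smaller scale by capping the shell's address space with ulimit -v, so the allocation failure happens while the command substitution is still buffering seq's output, before the loop body ever runs (the numbers are illustrative; behavior assumes bash on Linux):

```shell
# With the address space capped at 64 MiB, bash fails while buffering
# seq's output for the $(...) expansion -- no loop iteration runs and
# no number is ever printed.
bash -c 'ulimit -v 65536; for i in $(seq 100000000); do echo "$i"; done'
# bash reports an xmalloc/xrealloc allocation failure, and seq dies
# with a broken-pipe write error, just as in the transcript above.
```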
Let me know if this understanding is right.