On 2022-06-02 at 20:00 +0000, Gergely wrote:
> Well, the issue is not the fact that this is a resource exhaustion,
> but rather the fact that it's entirely OS-dependent and the
> programmer has zero control over it.
The programmer could have avoided creating an infinite source file recursion in the first place. I don't think it would be hard to add a SOURCENEST limit, but I find it would mostly help to catch programming errors, not serve as a normal exceptional case that a valid program should be expecting. Can you provide an example where a sensible program would need to source a large chain of files sourcing one another, or even recursively sourcing themselves? I cannot think of a good use case for that. Someone might choose to do "recursive file programming", but that would come with the need to avoid infinite recursion somehow.

For a normal programmer who simply has two library files requiring one another, the equivalent of the C "#ifndef X / #define X" guard pattern is easy to produce by adding something like this at the top:

    if [ "$LIBRARY_FOO_INCLUDED" = 1 ]; then return; fi
    LIBRARY_FOO_INCLUDED=1
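In full, such a guarded library could look like this (a minimal sketch; the names lib_foo.sh, lib_bar.sh and bar_helper are made up for illustration):

    # lib_foo.sh - meant to be sourced, possibly by lib_bar.sh,
    # which in turn sources us back.
    if [ "$LIBRARY_FOO_INCLUDED" = 1 ]; then return; fi
    LIBRARY_FOO_INCLUDED=1

    . ./lib_bar.sh    # safe: lib_bar.sh carries the same kind of guard

    foo() {
        bar_helper "called from foo"
    }

With lib_bar.sh guarded by its own LIBRARY_BAR_INCLUDED variable, sourcing either file stops after a single round trip instead of recursing until the interpreter runs out of stack.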
Nor can I think of a case for a program that legitimately needs to create hundreds of files sourcing one another. And there would be little bash could do to recover other than abruptly terminating the program; a limit would just be able to provide a nicer error message.

I don't think it's even a vulnerability, since the number of sourced files would need to be controlled by the attacker (and only the *number* of sourced files: if he also controlled their contents, he would already be able to execute his own code).

Note that the fact that you may get the crash to happen in different functions doesn't really matter here, since it's really an out of memory condition. You may produce similar results by changing the value of ulimit -s.
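For instance (a sketch; loop.sh is a made-up name, and exactly where it dies, and with which signal, is OS-dependent, which is the point):

    # A script whose only action is to source itself again.
    printf '. "$0"\n' > loop.sh

    # Lowering the stack limit (here to 1024 KB) in a subshell makes
    # the recursion blow up after far fewer nested source calls than
    # with the default limit.
    ( ulimit -s 1024; bash loop.sh )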
Regards