On Aug 17, 2014, at 9:27 AM, Jeff King <[email protected]> wrote:
> On Sat, Aug 16, 2014 at 06:26:08PM +0200, Steffen Prohaska wrote:
>
>>> Is the 15MB limit supposed to be imposed somewhere or is it just a guide
>>> of how much memory we expect Git to use in this scenario?
>>
>> The test should confirm that the added file is not mmapped into
>> memory. The process size should stay relatively small, independent
>> of the size of the file being added. I wanted to keep the file size
>> small. The chosen sizes worked for me on Mac and Linux.
>
> Measuring memory usage seems inherently a bit flaky for the test suite.
> It's also a little out of place, as the test suite is generally about
> correctness and outcomes, and this is a performance issue.
For files larger than 2GB on a 32-bit system (e.g. msysgit), filtering always
failed with the previous code. Now it works. I created the patch to move git
from 'fundamentally doesn't handle this' to 'works as expected'.
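To make the memory-bound idea concrete, here is a minimal sketch (not the
actual test from the patch; the file size and the cap are illustrative) of
asserting bounded memory in a shell test by running the streaming command
inside a subshell with a `ulimit -v` address-space cap. A command that mmaps
the whole file would blow the cap; a streaming one fits comfortably:

```shell
#!/bin/sh
# Sketch only: cap the subshell's virtual memory, then read a file that
# is larger than the cap.  A streaming reader (cat) uses small fixed-size
# buffers and succeeds; an implementation that mapped the whole file
# would fail the allocation.  Sizes are illustrative, and `ulimit -v`
# support varies by platform (it works on Linux).

bigfile=big.bin
limit_kb=50000                                  # ~50 MB address-space cap

dd if=/dev/zero of="$bigfile" bs=1M count=60 2>/dev/null   # 60 MB file

if ( ulimit -v "$limit_kb" 2>/dev/null && cat "$bigfile" >/dev/null )
then
	result=streamed
else
	result=exceeded
fi
echo "$result"

rm -f "$bigfile"
```

The threshold question from the thread shows up here directly: the cap must
be generous enough that library mappings and allocator overhead never trip
it, yet small enough to catch a whole-file mmap.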
> Would it make more sense to construct a t/perf test that shows off the
> change? I suppose the run-time change may not be that impressive, but it
> would be cool if t/perf could measure max memory use, too. Then we can
> just compare results between versions, which is enough to detect
> regressions.
I wasn't aware of t/perf. Thanks for suggesting this.
I agree that testing memory usage might be a bit flaky, and t/perf might
indeed be a better place for it.
I'm not yet entirely convinced, though. I'm wondering whether the proposed
test, given a large enough threshold, would be robust enough to keep in the
main test suite.
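For reference, the kind of measurement being discussed can be scripted
directly with plain git commands; the sketch below is not a real t/perf test
(a real one would source perf-lib.sh and use test_perf), and the rot13 clean
filter and 20 MB file size are illustrative choices, but it shows the shape
of timing `git add` on a filtered file:

```shell
#!/bin/sh
# Rough sketch: time `git add` of a file that passes through a clean
# filter.  Not a real t/perf test; a proper one would build on
# perf-lib.sh.  Filter, file size, and repo setup are illustrative.
set -e

repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "t@example.com"
git config user.name "Test"

# rot13 clean filter, applied to *.large via .gitattributes
git config filter.rot13.clean "tr a-zA-Z n-za-mN-ZA-M"
printf '*.large filter=rot13\n' >.gitattributes

dd if=/dev/zero of=test.large bs=1M count=20 2>/dev/null   # 20 MB file

start=$(date +%s)
git add test.large
end=$(date +%s)
echo "git add took $((end - start))s"
```

Wall-clock time alone would not catch a memory regression, which is why the
thread's idea of teaching t/perf to record max memory use as well would be
the interesting extension.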
Steffen